1. ERC Grant Awardee Prof. Dr. Dr. Fabian Theis

    02:52

    from Helmholtz Zentrum München / Added

    3 Plays / 0 Comments

    Fabian Theis studied mathematics and physics at the Universität Regensburg and holds doctorates in mathematics and computer science. At the age of 32 he completed his habilitation at the Universität Regensburg. Today the young researcher heads the junior research group "Computational Modeling in Biology" at the Helmholtz Zentrum München. Together with his team he focuses in particular on biostatistics, signal processing, and biomedical data analysis. For his outstanding achievements, Theis received an ERC Starting Grant in 2010 -- funding from the EU's talent program that supports top young researchers with up to 1.5 million euros each to build an independent research team.

    • Accessing Complexity

      04:11

      from Scientific Computing Laboratory / Added

      2 Plays / 0 Comments

      Prof. Aleksandar Bogojevic focuses on the science of complexity as the joint component tying in all the research done at SCL. (Video by Romana Vujasinovic) www.scl.rs

      • Striving for Excellence

        05:11

        from Scientific Computing Laboratory / Added

        1 Play / 0 Comments

        Prof. Aleksandar Belic talks about cutting edge research and the role of SCL as an EU center of excellence in the field of computer modeling of complex systems and the application of Grid technologies. (Video by Romana Vujasinovic) www.scl.rs

        • Particle-based Sampling and Meshing of Surfaces in Multimaterial Volumes

          15:47

          from VGTCommunity / Added

          9 Plays / 0 Comments

          AUTHORS: Miriah Meyer, Ross Whitaker, Robert M. Kirby, Christian Ledergerber, Hanspeter Pfister ABSTRACT: Methods that faithfully and robustly capture the geometry of complex material interfaces in labeled volume data are important for generating realistic and accurate visualizations and simulations of real-world objects. The generation of such multimaterial models from measured data poses two unique challenges: first, the surfaces must be well-sampled with regular, efficient tessellations that are consistent across material boundaries; and second, the resulting meshes must respect the nonmanifold geometry of the multimaterial interfaces. This paper proposes a strategy for sampling and meshing multimaterial volumes using dynamic particle systems, including a novel, differentiable representation of the material junctions that allows the particle system to explicitly sample corners, edges, and surfaces of material intersections. The distributions of particles are controlled by fundamental sampling constraints, allowing Delaunay-based meshing algorithms to reliably extract watertight meshes of consistently high quality.
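The particle-then-Delaunay pipeline the abstract describes can be illustrated in miniature. A minimal sketch, assuming a jittered uniform grid stands in for the paper's optimized particle distribution (the real system relaxes particles under sampling constraints and handles multimaterial junctions, which this toy example does not):

```python
import numpy as np
from scipy.spatial import Delaunay

# Stand-in for a relaxed particle distribution: jittered grid samples on the
# unit square. The paper's particle systems optimize these positions so that
# spacing respects the sampling constraints.
rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
points = np.column_stack([gx.ravel(), gy.ravel()])
points += rng.uniform(-0.02, 0.02, points.shape)

# A Delaunay triangulation extracts a mesh from the point samples; with a
# well-spaced sampling, the resulting triangles are close to equilateral.
tri = Delaunay(points)
print(tri.simplices.shape)  # (n_triangles, 3)
```

In the paper the same idea runs in 3-D on particles constrained to material surfaces and junction curves, so the Delaunay step yields a watertight multimaterial mesh rather than a planar triangulation.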

          • PIMS / CSC Distinguished Speaker Series: "Beyond the digital divide: Ten good reasons for using splines"

            01:21:37

            from The IRMACS Centre / Added

            April 9, 2010. Dr. Michael Unser. "Think analog, act digital" is a motto that is relevant to scientific computing and algorithm design in a variety of disciplines, including numerical analysis, image/signal processing, and computer graphics. Here, we will argue that cardinal splines constitute a theoretical and computational framework that is ideally matched to this philosophy, especially when the data is available on a uniform grid. We show that multidimensional spline interpolation or approximation can be performed most efficiently using recursive digital filtering techniques. We highlight a number of "optimal" aspects of splines (in particular, polynomial ones) and discuss fundamental relations with: (1) Shannon's sampling theory, (2) linear system theory, (3) wavelet theory, (4) regularization theory, (5) estimation theory, and (6) stochastic processes (in particular, fractals). The practicality of the spline framework is illustrated with concrete image processing examples; these include derivative-based feature extraction, high-quality rotation and scaling, and (rigid body or elastic) image registration.
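The "recursive digital filtering" route to spline interpolation that the abstract mentions can be sketched with SciPy, whose `ndimage` module implements the B-spline prefilter-then-evaluate scheme. A minimal 1-D illustration (not Unser's own code; the sample function and grid are chosen for the demo):

```python
import numpy as np
from scipy import ndimage

# Samples of a smooth function on a uniform grid.
x = np.linspace(0, 2 * np.pi, 32)
samples = np.sin(x)

# "Think analog, act digital": fit an interpolating cubic B-spline by running
# a recursive digital prefilter over the samples.
coeffs = ndimage.spline_filter1d(samples, order=3)

# Evaluate the continuous spline model at off-grid positions, given in
# grid-index coordinates; map_coordinates expects shape (ndim, n_points).
pos = np.array([[10.5, 20.25]])
values = ndimage.map_coordinates(coeffs, pos, order=3, prefilter=False)

# The interpolated values closely match the underlying sine.
true_vals = np.sin(np.array([10.5, 20.25]) * (2 * np.pi / 31))
print(np.max(np.abs(values - true_vals)))
```

The same prefilter-and-evaluate pattern extends separably to images, which is what makes spline-based rotation, scaling, and registration efficient on uniform grids.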

            • Longitudinal Analysis

              03:47

              from solarpower / Added

               16 Plays / 0 Comments

               The ongoing research at the Scientific Computing and Imaging Institute's Utah Center for Neuroimaging Analysis reflects the importance of longitudinal research in revealing normative and atypical brain development. Researchers at UCNIA are building a knowledge base of how the brain typically develops at the molecular, neuroanatomical, and functional levels. This neuroscience research continues to define, in increasing detail, the circuitry connecting brain areas and its potential roles in emotional regulation, social function, and cognition. Ultimately, the goal of research at UCNIA is to prevent mental illness by developing new intervention strategies that target developmental trajectories, and to provide early, effective interventions that can be adapted to each patient. Video written, directed, produced, and engineered by Chems Touati.

              • HDF5 is for Lovers

                47:05

                from PyData / Added

                 561 Plays / 0 Comments

                Slides can be found here: http://www.slideshare.net/PyData/hdf5-isforlovers HDF5 is a hierarchical, binary database format that has become a de facto standard for scientific computing. While the specification may be used in a relatively simple way (persistence of static arrays) it also supports several high-level features that prove invaluable. These include chunking, ragged data, extensible data, parallel I/O, compression, complex selection, and in-core calculations. Moreover, HDF5 bindings exist for almost every language - including two Python libraries (PyTables and h5py). This tutorial will discuss tools, strategies, and hacks for really squeezing every ounce of performance out of HDF5 in new or existing projects. It will also go over fundamental limitations in the specification and provide creative and subtle strategies for getting around them. Overall, this tutorial will show how HDF5 plays nicely with all parts of an application making the code and data both faster and smaller. With such powerful features at the developer's disposal, what is not to love?! This tutorial is targeted at a more advanced audience which has a prior knowledge of Python and NumPy. Knowledge of C or C++ and basic HDF5 is recommended but not required. This tutorial will require Python 2.7, IPython 0.12+, NumPy 1.5+, and PyTables 2.3+. ViTables and MatPlotLib are also recommended. These may all be found in Linux package managers. They are also available through EPD or easy_install. ViTables may need to be installed independently.
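A minimal sketch of several features the tutorial covers (chunking, compression, extensible datasets, and selections), using the h5py bindings mentioned in the description. The file name, dataset name, and shapes here are illustrative, and the file is kept in memory so the sketch needs no disk path:

```python
import numpy as np
import h5py

data = np.arange(10000, dtype="float64").reshape(100, 100)

# driver="core" with backing_store=False keeps the HDF5 file in memory;
# a real project would pass an ordinary filename instead.
with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
    # Chunking + gzip compression: two of the high-level features the talk
    # tunes for performance. maxshape=(None, 100) makes axis 0 extensible.
    dset = f.create_dataset("grid", data=data, chunks=(10, 10),
                            compression="gzip", compression_opts=4,
                            maxshape=(None, 100))
    dset.resize((150, 100))            # grow the extensible dataset
    dset[100:] = 0.0
    roi = dset[25:30, 40:45]           # a selection reads only the needed chunks
print(roi.shape)
```

Because each 10x10 chunk is compressed independently, the small region read at the end touches only the chunk containing it rather than decompressing the whole array, which is the core of the performance story the tutorial tells.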

                • IPython: A Modern Vision of Interactive Computing

                  01:00:20

                  from PyData / Added

                   1,052 Plays / 0 Comments

                  IPython has evolved from an enhanced interactive shell into a large and fairly complex set of components that include a graphical Qt-based console, a parallel computing framework and a web-based notebook interface. All of these seemingly disparate tools actually serve a unified vision of interactive computing that covers everything from one-off exploratory codes to the production of entire books made from live computational documents. In this talk I will attempt to show how these ideas form a coherent whole and how they are represented in IPython's codebase. I will also discuss the evolution of the project, attempting to draw some lessons from the last decade as we plan for the future of scientific computing and data analysis. Slides can be found here: http://www.slideshare.net/PyData/ipython-a-modern-vision-of-interactive-computing-pydata-sv-2013

                  • Introduction to Microsoft C++ Accelerated Massive Parallelism—Don McCrady

                    01:07:08

                    from NWCPP: Northwest C++ Users Group / Added

                     119 Plays / 0 Comments

                     Slides: http://nwcpp.org/may-2012.html Abstract: Microsoft's C++ AMP (Accelerated Massive Parallelism) is a C++ programming model and language extension that allows any C++ developer to take advantage of the massive parallelism available in today's GPUs. It offers a developer-friendly, lightweight, and portable interface that can enable programmers to achieve impressive performance boosts on a variety of hardware platforms. As an integrated part of Visual C++, it is also supported by a full suite of familiar developer tools including the editor, debugger, and profiler. This presentation will introduce C++ AMP and talk about its projected future as the hardware ecosystem evolves. Bio: Don McCrady is the development lead for the C++ AMP project. He has worked for Microsoft for nearly 15 years in many diverse areas including COM services, workflow services, and concurrency programming models. In his spare time, he is an avid (if highly frustrated) astrophotographer and amateur astronomer.
