An Overview of Uncertainty Quantification for Extreme Scale Science and Engineering
Clayton G. Webster, Oak Ridge National Laboratory1
Our modern treatment of predicting the behavior of physical and engineering systems relies on mathematical modeling followed by computer simulation. However, uncertainties are ubiquitous in all modeling efforts and affect both our predictions and our understanding of complex phenomena. These include epistemic (lack of knowledge) and aleatoric (intrinsic variability) uncertainties, and encompass uncertainty arising from inaccurate physical measurements, bias in mathematical descriptions, and errors introduced by the numerical approximations used in computational simulations. Because quantifying these uncertainties is essential for dealing with realistic experimental data and for assessing the reliability of predictions based on numerical simulations, advanced mathematical and statistical research in uncertainty quantification (UQ) ultimately aims to address these challenges. More importantly, such foundational work is critical to realizing the potential of future computing platforms, including exascale, and will ultimately enable next-generation, complex, predictive simulations of applications of national interest. Applications central to the Department of Energy's (DOE) mission include enhancing the reliability of smart energy grids, developing renewable energy technologies, analyzing the vulnerability of water and power supplies, understanding complex biological networks, assessing climate change impacts, and designing safe, cost-effective current and future energy storage devices.
As the complexity of these systems increases, scientists and engineers are relied upon to provide expert analysis, to inform decision-makers about the behavior of predictive simulations and, more importantly, to assess the associated risk. However, numerous changes in scientific computing at extreme scales are expected to challenge the current UQ paradigm, in which the stochastic loop is typically wrapped around a black-box simulation (a minimal sketch of this black-box sampling loop follows the list below). Expected decreases in single-core performance and memory per core, massive increases in the number of cores, and the emergence of novel accelerator-based architectures make it crucial to develop new methodologies for integrating uncertainty analysis into computational simulations. Accomplishing this goal requires mathematical and statistical analysis of innovative, massively scalable stochastic algorithms. This talk will survey several novel paradigms in applied mathematics, statistics, and computational science, developed by the DOE Institute entitled Environment for Quantifying Uncertainty: Integrated aNd Optimized at the eXtreme scale (EQUINOX)2, aimed at addressing several challenges that arise when applying UQ methodologies to the DOE mission science areas listed above, including:
• Detection and quantification of high-dimensional stochastic quantities of interest (QoIs) with a specified certainty;
• Reducing the computational burden required to perform rigorous UQ;
• Efficient strategies for UQ that exploit greater levels of parallelism provided by emerging many-core architectures; and
• Systematic assimilation of the uncertainty in measured data for correcting model bias, calibrating parameter interrelations and improving confidence in predicted responses.
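As a point of reference for the paradigm being challenged, the following is a minimal sketch, in Python, of a conventional black-box UQ workflow: a Monte Carlo sampling loop wrapped around an opaque forward solver. The solver, its input distribution, and the sample sizes are hypothetical placeholders for illustration only, not EQUINOX code.

```python
# Minimal sketch (hypothetical, not EQUINOX code) of the conventional UQ paradigm:
# a stochastic (Monte Carlo) loop wrapped around a black-box simulation.
import numpy as np

def black_box_solver(theta):
    """Hypothetical forward simulation mapping uncertain inputs to a scalar QoI."""
    # Stand-in for an expensive PDE solve; here just a cheap closed-form expression.
    return np.exp(-np.mean(theta**2))

rng = np.random.default_rng(seed=0)
n_samples, dim = 10_000, 20                      # high-dimensional uncertain input
samples = rng.standard_normal((n_samples, dim))  # hypothetical input distribution

# Each forward solve is independent: the loop parallelizes trivially, but it
# cannot exploit any structure inside the simulation itself.
qoi = np.array([black_box_solver(theta) for theta in samples])

mean = qoi.mean()
stderr = qoi.std(ddof=1) / np.sqrt(n_samples)    # O(n^{-1/2}) Monte Carlo error
print(f"E[QoI] ~ {mean:.4f} +/- {stderr:.4f}")
```

Every sample in this pattern is an independent forward solve, which makes the loop embarrassingly parallel but leaves both the slow O(n^{-1/2}) convergence rate and the per-solve cost untouched, motivating the new methodologies described above.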
1The Oak Ridge National Laboratory is operated by UT-Battelle, LLC, for the US Department of Energy under Contract DE-AC05-00OR22725.