This is a lecture by Peter Nugent (LBNL - leader, Computational Cosmology Center and Palomar Transient Factory). Computing has played a pivotal role in theoretical astrophysics since the 1950s. In the past few years, however, computing resources have been stressed by both observational surveys and computational simulations. Several of the next-generation surveys proposed for the coming decade (Palomar Transient Factory II, Dark Energy Survey, BigBOSS, LSST, etc.) will exacerbate this problem, both through the sheer data volume they will produce and through the FLOPS required to analyze and simulate these data. Here I will review the current status of some of the present programs (their successes and failures), the demands of future surveys, and the capabilities and limitations that next-generation supercomputing architectures will impose on these efforts. In particular, I will highlight the roles that hybrid architectures, massive data sets, and energy requirements/limitations will play in this field over the coming years.




The University of California High-Performance Astrocomputing Center (UC-HiPACC) is a consortium of nine UC campuses and three DOE laboratories.
