This is a lecture by Peter Nugent (LBNL; leader of the Computational Cosmology Center and the Palomar Transient Factory). Computing has played a pivotal role in theoretical astrophysics since the 1950s. In the past few years, however, computing resources have been stressed by both observational surveys and computational simulations. Several of the next-generation surveys proposed for the coming decade (Palomar Transient Factory II, Dark Energy Survey, BigBOSS, LSST, etc.) will exacerbate this problem, both through the sheer data volume they will produce and the flops required to analyze and simulate these data. Here I will review the current status of some of the present programs (their successes and failures), the demands of future surveys, and the capabilities and limitations that next-generation supercomputing architectures will impose on these efforts. In particular, I will highlight the roles that hybrid architectures, massive data sets, and energy requirements/limitations will play in this field over the coming years.

vimeo.com/38325074

UC-HiPACC

The University of California High-Performance AstroComputing Center (UC-HiPACC) is a consortium of nine UC campuses and three DOE laboratories.
