Title: Digital Theater Productions 'Zigmagora' and 'Cosmos'
Chris Ziegler will show excerpts of two digital theater productions in which he took part between March and August 2019.
"Zigmagora — Stage Your City" by the European Theater Lab is a participatory theater play that uses mobile apps, augmented reality and virtual reality to turn the city into a stage. Performed in Germany, France, Norway and Austria at the Ars Electronica Festival, the piece is still touring and will be performed at the end of September in Tiblisi, Georgia.
"Cosmos" is a dance performance that was produced at the German Center for Art and Media ZKM Karlsruhe, based on research at School of Arts, Media and Engineering's Synthesis Center at iStage.
To learn more about both productions, visit movingimages.de.
Chris Ziegler is an assistant professor at Arizona State University in the School of Arts, Media and Engineering.
In 2018, he produced a new dance performance at the Center for Art and Media with Sayaka Kaiwa, Ted Stoffer (dance) and Hugo Paquete (music). Since 2017, he has been a member of the European Theater Lab of the European Theater Convention, where he was head of digital production design for the research production "Stage Your City," which is currently touring across Europe.
Since 1994, he has been working as a designer, programmer, director and performer at the Center for Art and Media, conducting research on new media tools for theater with world-renowned choreographer William Forsythe of Ballett Frankfurt, dance company Emio Greco PC, the International Choreographic Arts Center in Amsterdam and others. He has worked at the Bavarian State Opera Munich, Zurich Opera House, Wuppertal Opera, Karlsruhe State Theatre, Konzert Theater Bern and New Opera Vienna. Since 2000, he has directed his own music theater and dance productions.
Title: Computational Imaging: Beyond the Limits Imposed by Lenses
The lens has long been a central element of cameras, dating to its early use in the early-to-mid 19th century by Niépce, Talbot and Daguerre. The role of the lens, from the daguerreotype to modern digital cameras, is to refract light to achieve a one-to-one mapping between a point in the scene and a point on the sensor. This effect enables the sensor to compute a particular two-dimensional (2D) integral of the incident 4D light field. We propose a radical departure from this practice and the many limitations it imposes. In this talk, Ashok Veeraraghavan will focus on two interrelated research projects that attempt to go beyond lens-based imaging.
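The 2D integral mentioned above can be written compactly in a common two-plane parameterization (the symbols below are illustrative, not taken from the talk): a pixel integrates the 4D light field over the aperture.

```latex
% Pixel value as an aperture integral of the 4D light field.
% L(x, y, u, v): radiance reaching sensor position (x, y) from
% aperture position (u, v); \mathcal{A} is the aperture.
i(x, y) = \int\!\!\int_{\mathcal{A}} L(x, y, u, v)\, \mathrm{d}u\, \mathrm{d}v
```

Removing the lens breaks the one-to-one mapping, so each sensor pixel instead records a coded mixture of many scene points that must be untangled computationally.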
First, Veeraraghavan will discuss his lab’s recent efforts to build flat, extremely thin imaging devices by replacing the lens in a conventional camera with an amplitude mask and computational reconstruction algorithms. These lensless cameras, called FlatCams, can be less than a millimeter thick and enable applications where size, weight, thickness or cost are the driving factors. He will take a brief detour to examine an important problem in neuroscience that his team is currently attempting to solve by exploiting this lensless imaging technology.
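The core idea behind mask-based lensless imaging can be sketched in a few lines: with a separable amplitude mask, the sensor measurement is a linear mixing of the scene from the left and right, and the scene is recovered by regularized least squares. This toy sketch (mask matrices, sizes and the regularization weight are all illustrative assumptions, not the actual FlatCam pipeline) shows the principle:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32   # scene resolution (n x n)
m = 48   # sensor resolution (m x m)

# Separable mask model: each sensor pixel mixes many scene points,
# so the measurement is Y = Phi_L @ X @ Phi_R.T (no one-to-one mapping).
phi_l = rng.integers(0, 2, size=(m, n)).astype(float)
phi_r = rng.integers(0, 2, size=(m, n)).astype(float)

x_true = np.zeros((n, n))
x_true[8:24, 8:24] = 1.0                   # simple square "scene"
y = phi_l @ x_true @ phi_r.T               # lensless measurement
y += 0.01 * rng.standard_normal(y.shape)   # sensor noise

def ridge_pinv(phi, lam=1e-2):
    """Tikhonov-regularized pseudoinverse: (Phi^T Phi + lam I)^-1 Phi^T."""
    return np.linalg.solve(phi.T @ phi + lam * np.eye(phi.shape[1]), phi.T)

# Reconstruction exploits separability: invert the two 1D mixings.
x_hat = ridge_pinv(phi_l) @ y @ ridge_pinv(phi_r).T

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

The separable structure is what makes this tractable: two small matrix inversions replace one enormous dense system over all pixel pairs.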
Second, he will discuss high-resolution, long-distance imaging using Fourier ptychography, in which the need for a large-aperture, aberration-corrected lens is replaced by a camera array and associated phase-retrieval algorithms, again resulting in order-of-magnitude reductions in size, weight and cost.
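The essence of Fourier ptychography is that each small camera captures an intensity-only, low-pass view of the scene through a shifted pupil in the Fourier domain, and an iterative phase-retrieval loop stitches the overlapping spectra into one large synthetic aperture. Below is a toy alternating-projection sketch under simplifying assumptions (amplitude-only object, binary circular pupils, hand-picked sizes) rather than the actual algorithm from the talk:

```python
import numpy as np

n = 64

# Ground-truth object (amplitude-only for simplicity) and its spectrum.
obj = np.zeros((n, n))
obj[16:48, 16:48] = 1.0
obj[24:40, 24:40] = 0.5
spectrum = np.fft.fftshift(np.fft.fft2(obj))

# Each "camera" sees the scene through a small shifted pupil: a circular
# low-pass window in the Fourier domain. Overlap between pupils is essential.
yy, xx = np.mgrid[:n, :n]
def pupil(cy, cx, r=12):
    return ((yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2).astype(float)

centers = [(n // 2 + dy, n // 2 + dx)
           for dy in (-10, 0, 10) for dx in (-10, 0, 10)]
pupils = [pupil(cy, cx) for cy, cx in centers]

# Cameras record intensity only; all phase information is lost.
meas = [np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * p))) ** 2
        for p in pupils]

def err(e):
    mask = sum(pupils) > 0
    return np.linalg.norm((e - spectrum)[mask]) / np.linalg.norm(spectrum[mask])

# Initialize from the central camera's amplitude image, then alternate:
# enforce measured magnitudes in the image domain, keep the estimated
# phase, and stitch the corrected patch back into the synthetic spectrum.
est = np.fft.fftshift(np.fft.fft2(np.sqrt(meas[4])))
e0 = err(est)
for _ in range(30):
    for p, i_k in zip(pupils, meas):
        field = np.fft.ifft2(np.fft.ifftshift(est * p))
        field = np.sqrt(i_k) * np.exp(1j * np.angle(field))
        patch = np.fft.fftshift(np.fft.fft2(field))
        est = est * (1 - p) + patch * p
e1 = err(est)
print(f"spectrum error: {e0:.3f} -> {e1:.3f}")
```

The pupil overlap is what makes phase retrieval possible here: redundant measurements in the overlapping regions pin down the phases that no single intensity image records.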
Finally, Veeraraghavan will spend a few minutes highlighting some of the other challenging imaging problems his team is attempting to solve through this holistic approach of jointly designing sensors, optics and computational algorithms, including looking around corners, imaging at the speed of light, imaging through scattering media and direct brain-to-brain communication.
Ashok Veeraraghavan is currently an associate professor of electrical and computer engineering at Rice University. Before joining Rice University, he spent three wonderful and fun-filled years as a research scientist at Mitsubishi Electric Research Labs in Cambridge, Massachusetts. He received his bachelor's in electrical engineering from the Indian Institute of Technology, Madras, in 2002 and MS and PhD degrees from the Department of Electrical and Computer Engineering at the University of Maryland, College Park, in 2004 and 2008, respectively. His thesis received the doctoral dissertation award from the Department of Electrical and Computer Engineering at the University of Maryland. His work has won numerous awards, including the Hershel M. Rich Invention Award in 2016 and 2017 and an NSF CAREER award in 2017. He loves playing, talking about and pretty much anything to do with the slow and boring but enthralling game of cricket.
Title: Designing New Computational Cameras and Projectors for Physics-Based Imaging and Vision
New computational cameras and projectors sit at the convergence of optics, electronics and signal processing, extracting more information about the visual world around us. In this talk, I will introduce my group's research on new types of imaging devices, systems and algorithms. This includes pixels that can capture 3D information and refocus an image after it has been captured, as well as a new projector-camera system that selectively parses the light transport in a scene, enabling it to visualize multi-bounce light and see through human skin. All this research points to exciting new opportunities for physics-based imaging and vision in the visual computing systems of the future.
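Post-capture refocusing, as mentioned above, follows from having the full 4D light field rather than a single 2D image: shifting each sub-aperture view by an amount proportional to its aperture position and averaging brings a chosen depth into focus. The toy sketch below (a synthetic textured plane, hypothetical view counts and disparities; not the speaker's pixel design) illustrates the shift-and-add principle:

```python
import numpy as np

rng = np.random.default_rng(2)
n, views = 64, 5                 # image size; 5x5 grid of sub-aperture views
scene = rng.random((n, n))       # textured plane at a single depth
depth_slope = 2                  # pixels of disparity per aperture step

# Synthesize a 4D light field: each (u, v) view sees the plane shifted
# in proportion to its position in the aperture.
offsets = range(-(views // 2), views // 2 + 1)
lf = {(u, v): np.roll(np.roll(scene, depth_slope * u, 0), depth_slope * v, 1)
      for u in offsets for v in offsets}

def refocus(lf, slope):
    """Shift-and-add refocusing: undo an assumed disparity, then average."""
    acc = np.zeros((n, n))
    for (u, v), img in lf.items():
        acc += np.roll(np.roll(img, -slope * u, 0), -slope * v, 1)
    return acc / len(lf)

sharp = refocus(lf, depth_slope)   # refocused at the plane's true depth
blurry = refocus(lf, 0)            # focused at the wrong depth

print(np.var(sharp), np.var(blurry))
```

Choosing a different slope after capture refocuses at a different depth, which is exactly the capability a conventional lens-and-sensor pixel cannot offer.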
Suren Jayasuriya is an assistant professor at Arizona State University, jointly in the School of Arts, Media and Engineering and the School of Electrical, Computer and Energy Engineering. Before this, he was a postdoctoral fellow at the Robotics Institute at Carnegie Mellon University. Suren received his PhD in electrical and computer engineering at Cornell University in January 2017 and graduated from the University of Pittsburgh in 2012 with a BS in mathematics (with departmental honors) and a BA in philosophy. His research interests are in computational imaging/photography, computer vision, sensors and education research. He received the 2013 NSF Graduate Research Fellowship, the 2015 Qualcomm Innovation Fellowship, the 2015 Cornell ECE Outstanding TA award and the Best Paper Award at ICCP 2014. Visit his website at web.asu.edu/imaging-lyceum.
This talk will present an overview of current projects at ASU's Social and Digital Systems Group (SANDS). The SANDS Group is a transdisciplinary research collective within the School of Arts, Media, and Engineering. Our work supports democratic participation in science and public engagement with scientific issues. Our research enables community knowledge sharing, artistic expressions, and civic activism that emerge from amateur science work. We develop, deploy, and study low-cost systems for creative science work in contexts such as hackspaces, art studios, citizen science communities, homes, schools, or across social media platforms. This talk will present our projects in DIY biology, interactive material, and thermal aerial sensing.
Dr. Stacey Kuznetsov is an assistant professor at the School of Arts, Media, and Engineering, with a joint appointment at the School of Computing, Informatics, and Decision Systems Engineering (CIDSE). She holds a PhD from the Human-Computer Interaction Institute at Carnegie Mellon University. Earlier, she worked for a small startup company called Google. She received a BA from New York University with a double major in philosophy and computer science. Her research examines low-cost tools and hands-on making for citizen science, community activism, and DIY biology.