Our testimony video for the "Prix Ars Electronica Collide@CERN", an international competition for digital artists that offers a residency programme initiated by the laboratory.

We didn't win the residency award, but we brainstormed hard for this entry. We would like to share our work with the community in the hope that it inspires someone. Feel free to contact us to discuss anything related, or to meet.

PROJECT DEFINITION: Experiencing a Particle in a Box via Motion-Controlled Virtual Reality

AIM: To create an interactive experience that is inspired by core ideas of modern physics.

PROJECT OUTLINE:
I. The main concept should be both comprehensible to the audience and true to underlying physics.
I.A. "Particle in a 2D box" is designated as a suitable candidate:
I.A.1. It demonstrates subatomic phenomena and the inherent logic of quantum physics, such as the uncertainty principle and wave-particle duality.
I.A.2. It is explicitly calculable, with no approximation required.
I.A.3. It is open to artistic enrichment.
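The "explicitly calculable" point above can be made concrete: the stationary states of a particle in a 2D box are known in closed form, so the probability distribution can be evaluated directly. Below is a minimal sketch in Python (the actual prototype runs in Unity; all names, units, and the grid size here are our own illustration):

```python
import numpy as np

# Illustrative sketch, not the project's code: closed-form stationary states
# of a particle in a 2D infinite square well of side L (natural units).
L = 1.0          # box side length (arbitrary units)
HBAR = 1.0
MASS = 1.0

def eigenstate(nx, ny, x, y):
    """Normalized stationary state psi_{nx,ny}(x, y) of the 2D box."""
    return (2.0 / L) * np.sin(nx * np.pi * x / L) * np.sin(ny * np.pi * y / L)

def energy(nx, ny):
    """Energy eigenvalue E = hbar^2 pi^2 (nx^2 + ny^2) / (2 m L^2)."""
    return (HBAR * np.pi) ** 2 * (nx ** 2 + ny ** 2) / (2.0 * MASS * L ** 2)

# Probability density of an equal superposition of the two lowest states,
# sampled on a 64 x 64 grid (the kind of grid a wall projection would use).
x, y = np.meshgrid(np.linspace(0.0, L, 64), np.linspace(0.0, L, 64))
psi = (eigenstate(1, 1, x, y) + eigenstate(2, 1, x, y)) / np.sqrt(2.0)
density = psi ** 2   # real-valued here; in general |psi|^2
```

Because the eigenstates are orthonormal, the sampled density sums to 1 over the box, which is exactly what the wall graphic visualizes.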

II. The experience is designed as either a room or an isolated wall.
II.A. Attendees see a real-time evolving graphic on the wall representing the probability distribution of a particle.
II.B. They interact with the system by reaching out towards it, taking a measurement at a specific location, which either:
II.B.1. Collapses the state to that single location, letting them catch and hold the particle; probability wave propagation starts again as soon as they release it elsewhere in the box.
II.B.2. Or doesn't find the particle, in which case the knowledge that the particle is not at that location still changes the probability distribution.
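The two measurement outcomes above can be sketched as a toy update rule on a discretized probability grid. This is a hedged illustration in Python, not the installation's code; the function name, the probe radius, and the cell-level collapse are our own simplifications:

```python
import numpy as np

def measure(density, ix, iy, radius=3, rng=None):
    """Toy measurement at grid cell (ix, iy) on a normalized probability grid.

    Returns (hit, new_density). On a hit the distribution collapses to the
    probed cell; on a miss the probability near the probe is removed and the
    remainder renormalized (the 'particle is not here' update).
    """
    rng = np.random.default_rng() if rng is None else rng
    ny, nx = density.shape
    yy, xx = np.ogrid[:ny, :nx]
    region = (xx - ix) ** 2 + (yy - iy) ** 2 <= radius ** 2
    p_region = density[region].sum()   # chance the probe finds the particle
    if rng.random() < p_region:
        new = np.zeros_like(density)   # collapse: all probability at the probe
        new[iy, ix] = 1.0
        return True, new
    new = density.copy()
    new[region] = 0.0                  # null result: exclude the probed region
    return False, new / new.sum()      # renormalize what remains
```

Either branch leaves a valid (normalized) distribution, so the wall graphic can keep evolving from the post-measurement state.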

III. To move the implementation forward, we laid the groundwork with a prototype that runs in the game engine Unity 3D, which is capable of:
III.A. Simulating the probability distribution of a particle (or system of particles) in a 2D box in real time.
III.B. Adding customizable visual layers with artistic vision, such as textures and light FX.
III.C. Generating sound in real time, with a waveform directly derived from the probability distribution function.
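One way the sound generation in III.C can work is to read a slice of the probability grid as a looping wavetable, so the evolving distribution directly shapes the timbre. A sketch in Python under that assumption (the prototype's actual audio path is in Unity; function names and parameters here are ours):

```python
import numpy as np

def density_to_wavetable(density, row=None):
    """Map one horizontal slice of the probability grid to an audio wavetable.

    The slice is centered (zero mean) and peak-normalized to [-1, 1].
    """
    row = density.shape[0] // 2 if row is None else row
    slice_ = density[row].astype(float)
    slice_ = slice_ - slice_.mean()        # remove DC offset
    peak = np.abs(slice_).max()
    return slice_ / peak if peak > 0 else slice_

def render(wavetable, freq=220.0, sr=44100, seconds=0.5):
    """Loop the wavetable at the given pitch to produce raw audio samples."""
    n = int(sr * seconds)
    phase = (np.arange(n) * freq * len(wavetable) / sr) % len(wavetable)
    return wavetable[phase.astype(int)]
```

Re-sampling the wavetable every frame ties the sound to the same real-time state the wall graphic shows.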

IV. The project holds distinct possibilities for future enhancement.
IV.A. User interactivity will be established by tracking the user's arm movements with a Kinect sensor.
IV.B. The simulation can be extended to a pool of different particles and atoms, which could also respond to different types of measurement on the user's end.
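As a sketch of how Kinect-tracked motion could drive the measurement gesture from II.B: a tracked hand position is mapped to a cell of the probability grid, and reaching past a depth threshold triggers the probe. Everything here (names, normalized coordinates, the threshold value) is a hypothetical illustration; a real build would read skeleton joints from the Kinect SDK:

```python
def hand_to_probe(hand_x, hand_y, hand_z, grid_shape, reach_z=1.2):
    """Map a tracked hand position to a measurement cell on the wall grid.

    hand_x, hand_y are assumed normalized to [0, 1] in sensor view space;
    hand_z is the distance to the sensor in meters. Reaching closer than
    reach_z counts as the 'reach out and measure' gesture.
    """
    ny, nx = grid_shape
    ix = min(int(hand_x * nx), nx - 1)   # clamp to the last column/row so
    iy = min(int(hand_y * ny), ny - 1)   # hand_x == 1.0 stays on the grid
    reaching = hand_z < reach_z
    return reaching, ix, iy
```

The returned cell indices would then feed the same measurement logic the installation already applies to wall touches.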

TEAM:
Ugur Engin Deniz
(concept idea, science consultation, implementation, documentation)
vimeo.com/engindeniz

Ahmet Said Kaplan
(art direction, installation, visual fx, 3d modelling)
ahmetsaid.info

Guney Ozsan
(music & generative audio, simulation, programming, interface design)
guneyozsan.com

Special thanks to Yetkin Yilmaz.
