A short sample of dance improvisation inside a reactive environment. A Kinect sensor captures the movement of a dancer and sends the information to software that generates graphics. The projected image thus reacts to the dancer's movement in real time.
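The capture-to-graphics loop described above can be sketched roughly as follows. This is a minimal illustration in Python, not the actual openFrameworks/Processing code; the function names and the specific mappings (position to colour and size, depth to opacity) are assumptions.

```python
# Illustrative sketch: map a tracked joint from a depth sensor to
# drawing parameters, the way a reactive-visuals patch might.
# All names and ranges here are hypothetical.

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from one range to another
    (like Processing's map())."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def dancer_to_params(hand_x, hand_y, depth_mm):
    """Turn a dancer's hand position and distance into visual parameters."""
    return {
        "hue": map_range(hand_x, 0, 640, 0, 360),        # left-right -> colour
        "size": map_range(hand_y, 0, 480, 5, 80),        # up-down -> brush size
        "alpha": map_range(depth_mm, 500, 4000, 255, 30) # closer -> more opaque
    }
```

Each camera frame, the tracked position would be fed through a mapping like this and the resulting parameters used to redraw the projection, which is what makes the image feel responsive to the dancer.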
interaction design and development :: peqpez [caterina antonopoulou]
dancing :: eirini tsimpragou
music :: 'temazcal' by javier alvarez from a compilation of kostis konstantopoulos
software :: openframeworks, processing [using the point2line library of Carl Emil Carlsen]
hardware :: kinect sensor
Babel is an interactive audiovisual installation of collaborative narration. It investigates how the narrator's viewpoint, experiences, cultural references, perception of reality and intentions can influence the outcome of a narration.
Various video-authors filmed their own interpretation of an initial script and the different versions of the story were saved in a database.
The installation accesses this database and allows the viewer to interact with the projected videos. He/she can mix the different versions, creating a real-time narration. The current version of the story is projected on the screens, and the remaining versions are placed virtually behind the current video.
By pointing a torch at the screens, the viewer can gradually uncover the hidden versions and thus influence the narration. The version can be changed independently on every screen of the installation, giving the viewer the opportunity to combine different versions and to establish a dialogue between characters coming from different videos.
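The torch interaction can be sketched as a per-pixel blend: where the camera sees a bright spot (the torch beam), the hidden version is mixed over the current one. This is a hypothetical Python illustration of the idea, not the installation's actual openFrameworks code; the threshold and blend formula are assumptions.

```python
# Illustrative sketch: reveal the hidden video layer wherever the
# torch lights the screen. Names and values are hypothetical.

BRIGHT_THRESHOLD = 200  # 8-bit brightness above which a pixel counts as lit

def reveal(current_px, hidden_px, brightness):
    """Blend one pixel of the hidden version over the current one,
    proportionally to how strongly the torch lights it."""
    if brightness < BRIGHT_THRESHOLD:
        return current_px  # unlit: show the current version unchanged
    a = (brightness - BRIGHT_THRESHOLD) / (255 - BRIGHT_THRESHOLD)  # 0..1
    return tuple(round((1 - a) * c + a * h)
                 for c, h in zip(current_px, hidden_px))
```

Run over every pixel of the frame, a mapping like this produces the effect of the torch "burning through" the current version to expose the one behind it.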
The narration and the audiovisual output are created in real time and take the form of a dialogue between the various authors and the audience: a kind of collage of viewpoints, cultural references, videographic styles and languages.
more info:: http://creaciodigital.upf.edu/~i58236/seriallykilled/
created by:: http://peqpez.blogspot.com/
script:: adapted text, written by http://old-boy.blogspot.com/
developed with openFrameworks:: http://www.openframeworks.cc
Prefalll 135 is an interactive audio-visual installation.
It uses the energy of falling water to make watermills rotate and produce sound and graphics.
By opening and closing the taps, the user controls the water circuit and defines the parameters of the audiovisual system.
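The sensing chain can be sketched as follows: an IR photoreflector on each watermill produces a pulse as each blade passes, and the pulse rate gives a rotation speed that drives the sound and graphics parameters. This is a hedged Python illustration; the blade count and the speed-to-pitch mapping are assumptions, not the installation's actual Arduino/Pure Data code.

```python
# Illustrative sketch: turn IR-encoder pulses from a watermill into a
# rotation speed, then into a synthesis parameter. Values are hypothetical.

PULSES_PER_REV = 8  # assumed: one pulse per blade of the wheel

def rotation_hz(pulse_count, window_s):
    """Revolutions per second from pulses counted in a time window."""
    return pulse_count / PULSES_PER_REV / window_s

def speed_to_pitch(hz, base_hz=110.0):
    """Map wheel speed to an oscillator pitch
    (assumed: one octave per rev/s)."""
    return base_hz * (2 ** hz)
```

On the real hardware this counting would happen in the Arduino firmware, with the resulting speed value sent on to the visual (openFrameworks) and audio (Pure Data) programs.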
Visuals :: Openframeworks+MSAFluids
Sound :: Pure Data
Physical interaction :: photoreflector IR encoder+arduino
by :: Rodrigo Carvalho, Caterina Antonopoulou, Javier Chavarri