Experimentation is a project built with Kinect, Processing, and Second Life. It centers on a gestural interface through which three components (avatar, sound, visuals) are brought together in real time in a mixed-reality performance. The gestural interface, built in Processing on top of the Kinect, eliminates pre-programmed (pre-animated) avatar movements and pre-recorded sound and visual sequences.
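The text does not spell out how the gesture data reaches the avatar, but a minimal Processing sketch along the following lines could realize such a pipeline. This is a sketch under assumptions: it uses the SimpleOpenNI library to read Kinect skeleton joints, and it bridges to Second Life by synthesizing keystrokes for the viewer (the approach popularized by tools such as FAAST); the hand-above-head gesture and its mapping are purely illustrative, not the project's actual code.

import SimpleOpenNI.*;
import java.awt.Robot;
import java.awt.event.KeyEvent;

SimpleOpenNI kinect;
Robot robot;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();                // skeleton tracking (SimpleOpenNI 1.96 API)
  try {
    robot = new Robot();              // injects key events into the SL viewer
  } catch (Exception e) {
    e.printStackTrace();
  }
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);

  for (int userId : kinect.getUsers()) {
    if (kinect.isTrackingSkeleton(userId)) {
      PVector hand = new PVector();
      PVector head = new PVector();
      kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
      kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD, head);
      // Illustrative gesture: holding the right hand above the head
      // walks the avatar forward by holding the viewer's Up-arrow key.
      if (hand.y > head.y) {
        robot.keyPress(KeyEvent.VK_UP);
      } else {
        robot.keyRelease(KeyEvent.VK_UP);
      }
    }
  }
}

// SimpleOpenNI callback: begin skeleton tracking when a user appears
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}

Because the viewer only ever sees ordinary key events, no pre-animated sequence is involved: the avatar moves exactly when, and for as long as, the performer gestures.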

The Kinect captures real-life body movement to control the avatar's movement in Second Life. The Kinect also serves the real-life audio-visual part of the performance: it captures live imagery of the performer and her surroundings, which is manipulated in real time through Processing. The sound is generated in real time from computer glitch and feedback, likewise driven through the gestural interface. The audio-visual performance is then streamed into Second Life and combined with the avatar's movement in real time. Because the avatar is driven by the gestural interface and the real-life surroundings are folded into the piece, every performance is genuinely unique.
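To give a sense of what this kind of real-time manipulation in Processing can look like, here is a minimal sketch, again assuming the SimpleOpenNI library; the depth-keyed pixel-displacement glitch is an illustrative assumption, not the effect used in the performance. The resulting canvas would then be captured and streamed into Second Life as live media (for example, via a screen-capture encoder feeding a parcel media stream).

import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableRGB();
  kinect.enableDepth();
  kinect.alternativeViewPointDepthToImage();  // align depth map with RGB image
}

void draw() {
  kinect.update();
  PImage rgb = kinect.rgbImage();
  int[] depth = kinect.depthMap();            // depth per pixel, in millimetres

  rgb.loadPixels();
  for (int i = 0; i < rgb.pixels.length; i++) {
    // Illustrative glitch: displace only pixels whose depth falls within
    // the performer's range (0.5 to 2 m), leaving the surroundings intact.
    if (depth[i] > 500 && depth[i] < 2000) {
      rgb.pixels[i] = rgb.pixels[(i + depth[i]) % rgb.pixels.length];
    }
  }
  rgb.updatePixels();
  image(rgb, 0, 0);
}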

Experimentation is my first real-time mixed-reality virtual performance, made possible with support from HUMlab, Umeå University, Sweden. This machinima was made from documentation material recorded at HUMlab's H3 location; the Second Life location is the HUMlab sim.

concept, image, sound, programming and performance: Sachiko Hayashi
