1. Ball Jam is a volumetric audio composition created by Zack Settel, combining motion tracking, realtime audio processing, and physical modelling in a virtual environment.

    Commissioned and performed by the Quasar Saxophone Quartet (http://www.quasar4.com), the piece allows players to capture samples from their instruments and apply effects based on their location on stage.

    This video shows one of the premiere performances, at the Montréal New Music Festival, on February 20, 2011.

    vimeo.com/20437925
  2. Impressionable Environments explores the use of attractive and repulsive forces for expressive musical control. In this initial study, we have added physically modelled behaviours to the SPIN Framework (http://www.spinframework.org) in order to naturally move sound sources around a 3D scene. The user simply "blows" to create a directional force in the environment. A Ubisense tracking system allows users to move around and direct their forces in any direction.
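    The "blow" interaction described above can be sketched as a simple force applied to a sound source's position. This is an illustrative sketch only, with assumed names and a basic Euler integration step; it is not the SPIN Framework's actual API.

    ```python
    # Hypothetical sketch (not the actual SPIN Framework code): a directional
    # "blow" force pushes a sound source through the 3D scene.

    class SoundSource:
        def __init__(self, x, y, z):
            self.pos = [x, y, z]
            self.vel = [0.0, 0.0, 0.0]

        def apply_force(self, force, dt, damping=0.9):
            # Accelerate along the force vector, then damp so motion settles.
            for i in range(3):
                self.vel[i] = (self.vel[i] + force[i] * dt) * damping
                self.pos[i] += self.vel[i] * dt

    src = SoundSource(0.0, 0.0, 0.0)
    blow = (1.0, 0.0, 0.0)          # a user's "blow": unit force along +x
    for _ in range(10):
        src.apply_force(blow, dt=0.1)
    print(src.pos)                   # the source has drifted in the +x direction
    ```

    With damping, the source glides and gradually comes to rest after the force stops, which is the "natural" movement the description refers to.
    
    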

    This work was realized at the Volumetric Audio Lab:

    Conception and sonic arts: Zack Settel
    Science: Mike Wozniewski
    Collaboration: Luc Courchesne

    vimeo.com/20435777
  3. Audio Graffiti is a multi-user sound/music installation that explores new modes of sonic interaction, afforded by the latest in locative technologies. Several mobile users may create and explore a gradually evolving collage of audio graffiti.

    This video shows a preliminary trial of the installation, filmed at the International Computer Music Conference (Montreal, Canada, August 21, 2009).

    For a more recent version, see: vimeo.com/10986957


    Zack Settel (http://sheefa.net/zack/portfolio)
    Mike Wozniewski (http://www.mikewoz.com)

    vimeo.com/10988116
  4. Audio Graffiti is a multi-user sound/music installation that explores new modes of sonic interaction, afforded by the latest in locative technologies. Several mobile users may create and explore a gradually evolving collage of audio graffiti.

    The piece can be deployed in an outdoor environment (using GPS tracking), or in an indoor space as seen in this video. Equipped with a wireless headset and tracking device, participants can "tag" or "spray" sound onto a real physical wall. We provide several small musical instruments, which can be used along with one's voice, to add sounds to the collaborative musical mix. The installation is seeded with some pre-existing sonic material, which allows participants to synchronize rhythmically and maintains cohesion over time. All user-contributed sounds slowly fade away, resulting in an ever-evolving musical piece.

    As users move about, they also experience a changing sonic perspective of the localized sounds, based on their particular location. Thus, users not only create the audio content, but also actively participate in the remixing of sonic material as they encounter it. Participants who are waiting their turn in the staging area may watch a real-time 3D visualization of the installation, which shows avatars of each player walking amongst virtual sound sources.
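    The changing sonic perspective amounts to distance-based attenuation of each localized sound. A minimal sketch, assuming a simple inverse-distance gain model (the installation's actual spatialization is not specified here):

    ```python
    # Illustrative only: gain of each wall-tagged sound falls off with the
    # listener's distance from it, so the mix changes as the listener walks.
    import math

    def gain(listener, source, ref=1.0, rolloff=1.0):
        """Inverse-distance attenuation, clamped at the reference distance."""
        d = math.dist(listener, source)
        return ref / max(d, ref) ** rolloff

    # Two hypothetical tagged sounds on a wall (positions in metres).
    sounds = {"tag_a": (0.0, 0.0), "tag_b": (5.0, 0.0)}
    listener = (1.0, 0.0)
    mix = {name: gain(listener, pos) for name, pos in sounds.items()}
    print(mix)   # tag_a, being closer, is louder than tag_b
    ```
    
    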

    This installation was filmed at the 12th Biennial Arts and Technology Symposium in the lobby of the Ammerman Center for Arts and Technology (Connecticut College, March 4-6, 2010).


    Zack Settel (http://sheefa.net/zack/portfolio)
    Mike Wozniewski (http://www.mikewoz.com)

    vimeo.com/10986957
  5. This is a demonstration of the Clickable Space authoring suite, developed by Mike Wozniewski and Zack Settel at the Society for Arts and Technology in Montreal, Canada.

    The software allows for the creation of spatial interfaces, where interaction is based on the relative movement of 3D content and the resulting geometric events (e.g. intersection, relative distance, and incidence) that occur. A user is represented in the interaction scene by an avatar, which is controlled via standard input devices (e.g. keyboard, mouse, joystick, Wii controllers) or via realtime tracking systems. Thus, a user's (avatar) movement in the scene relative to other content generates geometric events. These events can be mapped onto actions, which are in turn sent to systems that control sound, lighting, graphics, or other media.
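    The event-to-action mapping described above can be sketched as follows. All names here are hypothetical, not the Clickable Space API: a geometric event (here, the avatar intersecting a region) is detected and dispatched to whatever action was registered for it.

    ```python
    # Hypothetical sketch of event-to-action mapping (illustrative names only).
    import math

    def intersects(avatar_pos, region_center, region_radius):
        """Geometric event test: is the avatar inside a spherical region?"""
        return math.dist(avatar_pos, region_center) <= region_radius

    actions = {}  # event name -> callback

    def on(event, callback):
        actions[event] = callback

    triggered = []
    # Map the geometric event to an action (e.g. cue a sound or light).
    on("enter:hotspot", lambda: triggered.append("play_sample"))

    hotspot = ((2.0, 0.0, 0.0), 1.0)   # region center and radius
    avatar = (1.5, 0.0, 0.0)           # tracked performer's position

    if intersects(avatar, *hotspot):
        actions["enter:hotspot"]()      # dispatch: event -> action

    print(triggered)
    ```

    In practice, the detection step would run continuously against tracking data, and the dispatched actions would be network messages to sound, lighting, or graphics systems rather than list appends.
    
    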

    In a typical application for live performance, a motion-tracked performer interacts with an overlaid virtual 3D interaction scene that corresponds to the exact dimensions of the performance space. As the performer moves about the stage, his/her avatar encounters the various geometric content defined in the scene, triggering various actions; his/her use of a wireless controller adds the modality of "clicking", offering a wider range of possible event-to-action mappings.

    The system operates in a distributed fashion over IP networks, supporting applications that can be shared between multiple users in different locations.

    vimeo.com/4752135


Mike Wozniewski

Videos of art projects, installations, and tutorials related to the SPIN Framework: http://www.spinframework.org
