Update: I've released the code and documentation for the instrument. Please check here: http://bit.ly/cfbEPG
TouchNet is a gesture-based instrument operated from a multi-touch-enabled surface. In essence, it is a sampler that converts sounds into square images; the performer plays rectangular portions of these images with touch. The images derived from the sounds give the performer visual feedback about where to touch the surface and which portion of the sound to use as a source. Unlike a standard waveform display, the image lets the performer visually estimate a sound's spectral characteristics and how they change over time. TouchNet allows several layers of sounds to be controlled and played back simultaneously, and it borrows the idea of “gesture recording” from my deQuencher instrument. Recorded gestures can be post-processed to slow down or speed up dynamically.
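To illustrate the sound-to-image idea, here is a minimal sketch of how a mono sample buffer could be turned into a square grayscale image whose columns are time frames and whose rows are frequency bins, so spectral content (rather than just amplitude) is visible at a glance. This is not TouchNet's actual code; the function name, sizes, and the naive DFT are assumptions for illustration.

```python
import math

def sound_to_image(samples, frame_size=64, image_size=64):
    """Illustrative sketch: map a mono sample buffer to a square
    grayscale image (list of rows, values 0..255). Columns are
    time frames, rows are frequency bins of a naive DFT."""
    # Split the buffer into image_size evenly spaced frames.
    hop = max(1, (len(samples) - frame_size) // max(1, image_size - 1))
    columns = []
    for c in range(image_size):
        start = min(c * hop, max(0, len(samples) - frame_size))
        frame = samples[start:start + frame_size]
        # Naive DFT magnitudes for the first image_size bins.
        mags = []
        for k in range(image_size):
            re = sum(s * math.cos(2 * math.pi * k * n / frame_size)
                     for n, s in enumerate(frame))
            im = sum(-s * math.sin(2 * math.pi * k * n / frame_size)
                     for n, s in enumerate(frame))
            mags.append(math.hypot(re, im))
        columns.append(mags)
    # Normalise magnitudes to 0..255 grayscale.
    peak = max(max(col) for col in columns) or 1.0
    return [[int(255 * columns[c][r] / peak) for c in range(image_size)]
            for r in range(image_size)]
```

With an image like this, a touched rectangle naturally maps back to a time range (its horizontal extent) within the source sound, which is the playback model the post describes.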
In the underlying system I’ve developed, the performer can supply their own samples for a performance. TouchNet, however, wraps this system to use samples downloaded from the FreeSound database prior to a performance. The performer supplies a set of keywords; the instrument connects to FreeSound over the Internet, runs a search for each keyword, interprets the results, and selects and downloads samples that meet the given criteria. The samples are then converted to images and loaded into the instrument in layers, and the system is ready for a performance…
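The search-then-select step could be sketched as follows. TouchNet itself uses the FreeSound Quark in SuperCollider, so treat this Python version as an assumption: the endpoint and parameter names follow the current public Freesound APIv2 (`query`, `filter`, `fields`, `token`), and the duration-based selection criterion is a hypothetical example of "given criteria".

```python
import urllib.parse

# Current Freesound APIv2 text-search endpoint (an assumption here;
# the instrument accessed FreeSound through a SuperCollider Quark).
FREESOUND_SEARCH = "https://freesound.org/apiv2/search/text/"

def build_search_url(keyword, api_token, max_duration=10.0):
    """Compose a text-search URL for one keyword, restricting
    results to short samples via a duration filter."""
    params = {
        "query": keyword,
        "filter": f"duration:[0 TO {max_duration}]",
        "fields": "id,name,duration,previews",
        "token": api_token,
    }
    return FREESOUND_SEARCH + "?" + urllib.parse.urlencode(params)

def pick_samples(results, per_keyword=1, max_duration=10.0):
    """Interpret a decoded JSON result list (items with 'id',
    'name', 'duration') and select the shortest usable sounds."""
    usable = [r for r in results if r.get("duration", float("inf")) <= max_duration]
    usable.sort(key=lambda r: r["duration"])
    return usable[:per_keyword]
```

One URL is built per keyword, the JSON response is decoded, and `pick_samples` decides which hits to actually download before the conversion-to-images step.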
TouchNet uses my MultiTouchPad and FreeSound Quarks for SuperCollider. I'll be releasing the instrument in the coming days; it still needs some more documentation.