so this is a video clip of a project i have been working on for the ipad. the idea is to have reusable 3d gui elements for building native iOS OSC/MIDI controllers. no need to download a 2D GUI builder for your computer. just add faders, keyboards, sequencers, buttons, sliders, toggles, etc. to the screen, then move them around to where you want them in your virtual iOS studio control space.
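under the hood, each control just has to serialize its value into an OSC packet. here is a minimal sketch (in Python, for illustration) of encoding a single-float OSC message the way a fader might report its position; the address "/fader/1" is a made-up example, not part of the actual project:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, value: float) -> bytes:
    packet = osc_pad(address.encode("ascii"))   # address pattern
    packet += osc_pad(b",f")                    # type tag: one float argument
    packet += struct.pack(">f", value)          # big-endian 32-bit float
    return packet

packet = osc_message("/fader/1", 0.5)
# 20 bytes total: 12 for the padded address, 4 for ",f", 4 for the float
```

the resulting bytes would then go out over UDP to whatever host is listening (a DAW, a Max patch, etc.).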
zoom, pan, and tilt the camera to get the best possible interactive perspective. for instance, i found the keyboard easier to play slightly tilted. this 3d gui system actually gives you a more natural, lifelike feel than you would normally get from a flat on-screen controller. even though the iOS device gives no physical or haptic response, these 3d interface elements definitely feel as real as they are going to get in the virtual world.
i have no idea why, but possibly it has something to do with your cross-modal perceptions being inherently linked together. let me explain what i mean by this using the McGurk effect as an example. in the McGurk effect you are shown lips speaking one phoneme, say "gah", while the audio is actually playing the phoneme "bah". what you perceive from the audio-visual combination is a third phoneme, "dah". what is so interesting about this example is that it demonstrates a cross-modal connection between your hearing and your vision.
so i'm postulating that, for me, this experience is very similar to that phenomenon, but it combines your vision, touch, and hearing into a single cross-modal experience. i suppose this only works if you know what it is like to play a real piano, since otherwise you would have no reference point for the comparison.
i will be making these 3d guis available to unity developers, possibly for a charge, but there will be community versions of them in quartz composer.