Poizner Lab, Institute for Neural Computation, University of California San Diego, CA
By combining motion capture with high-density electroencephalography (EEG), we aim to identify the neural and cognitive processes underlying natural exploration and movement, as well as spatial and object memory.
The subject actively explores a virtual aircraft-carrier deck presented through a head-mounted display (HMD) comprising a total of 12 displays, providing a highly immersive experience.
WorldViz, Inc.'s Vizard software platform was used to create the virtual environment and control the subject's interactions with it. The subject's skeleton was mapped to that of a computer avatar using MotionBuilder, which computed the inverse kinematics of the subject's movements and passed them to Vizard, which then rendered the animated character in the virtual environment.
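The per-frame data flow described above (marker capture, inverse-kinematics solve, avatar rendering) can be sketched as follows. All class and function names here are hypothetical stand-ins for illustration; they are not the actual Vizard or MotionBuilder APIs.

```python
# Minimal sketch of the mocap -> IK -> rendering pipeline.
# All names are hypothetical stand-ins, not real Vizard/MotionBuilder calls.

class MotionCapture:
    """Stand-in for the PhaseSpace marker stream."""
    def get_markers(self):
        # one frame of (x, y, z) marker positions, in metres
        return [(0.0, 1.7, 0.0), (0.2, 1.4, 0.1), (-0.2, 1.4, 0.1)]

class IKSolver:
    """Stand-in for MotionBuilder's inverse-kinematics solve."""
    def solve(self, markers):
        # a real solver would fit a full skeleton; here we simply pair each
        # marker with a joint name to show the shape of the output
        joints = ("head", "r_hand", "l_hand")
        return dict(zip(joints, markers))

class Renderer:
    """Stand-in for Vizard's avatar animation."""
    def __init__(self):
        self.pose = None
    def pose_avatar(self, joint_positions):
        self.pose = joint_positions

def update_frame(mocap, solver, renderer):
    """One frame of the loop: capture markers, solve IK, pose the avatar."""
    markers = mocap.get_markers()
    pose = solver.solve(markers)
    renderer.pose_avatar(pose)
    return pose

renderer = Renderer()
pose = update_frame(MotionCapture(), IKSolver(), renderer)
```

In the actual system this loop runs once per display frame, with MotionBuilder performing the solve step and Vizard performing the rendering step.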
A Sensics, Inc. xSight panoramic head-mounted display was used for immersive presentation of the virtual environment.
A 24-camera PhaseSpace, Inc. Impulse system was used to track the subject's movements in 3D. Movements of the limbs, head, and body were recorded at 240 Hz.
A WorldViz InterSense inertial sensor was mounted on the HMD to record head orientation at 180 Hz.
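Because the body-tracking stream (240 Hz) and the head-orientation stream (180 Hz) run at different rates, analysis requires bringing them onto a shared time base. A generic way to do this is linear interpolation onto the faster clock; the sketch below is illustrative and does not describe the lab's actual synchronization procedure. The signals are toy stand-ins.

```python
import numpy as np

# Align a 180 Hz head-orientation stream with the 240 Hz mocap clock
# by linear interpolation (illustrative; toy signals stand in for data).

fs_mocap, fs_head = 240.0, 180.0
duration = 1.0  # seconds

t_mocap = np.arange(0, duration, 1 / fs_mocap)  # 240 sample times
t_head = np.arange(0, duration, 1 / fs_head)    # 180 sample times

# toy signals standing in for a marker coordinate and head yaw
mocap_x = np.sin(2 * np.pi * 1.0 * t_mocap)
head_yaw = np.cos(2 * np.pi * 0.5 * t_head)

# resample the head stream onto the 240 Hz mocap time base
head_yaw_240 = np.interp(t_mocap, t_head, head_yaw)
```

After this step, every mocap frame has a corresponding interpolated head-orientation value, so the two streams can be analyzed sample-by-sample.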
EEG was recorded from a 64-channel active-electrode array and analyzed with CARTOOL (http://sites.google.com/site/fbmlab/home).
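A common preprocessing step for multichannel EEG of this kind is re-referencing to the common average. The sketch below shows that step generically on a 64-channel array with random stand-in data; it is not CARTOOL's internal implementation.

```python
import numpy as np

# Illustrative common-average re-referencing of a 64-channel EEG array
# (generic preprocessing step; random data stands in for recordings).

rng = np.random.default_rng(0)
n_channels, n_samples = 64, 1000
eeg = rng.standard_normal((n_channels, n_samples))  # stand-in data, microvolts

# subtract the instantaneous mean across channels from every channel
eeg_car = eeg - eeg.mean(axis=0, keepdims=True)
```

After re-referencing, the mean across the 64 channels is zero at every time sample, which removes reference-electrode bias common to all channels.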
For further information please contact Dr. Howard Poizner (firstname.lastname@example.org).