Authors: David López, Lora Oehlberg, Candemir Doger, Tobias Isenberg
Abstract: Effective visualization of and interaction with 3D datasets that represent scans or simulations of the real world is at the heart of visualization research. Exploration and analysis are best supported when both the best possible visual representations and the best possible interaction techniques are chosen. To present visual representations to users, stereoscopy facilitates depth perception and thus high visual immersion. To facilitate interactive data exploration, touch-based input provides high immersion through interaction due to its directness: the input and the affected data are at the same visual location (i.e., sticky interaction), resulting in users feeling "in control of the data."

Unfortunately, these two ways of achieving immersion are mutually exclusive. On the one hand, virtual objects in stereoscopic settings cannot be touched since they appear to float in empty space. Touch interaction, on the other hand, conflicts with stereoscopic display due to parallax issues as well as touch-through and invisible-wall problems; it is far better suited for monoscopic displays.

Our overall vision is to enable researchers to explore 3D datasets with as much immersion as possible, arising both from the visuals and from the interaction. We therefore explore ways to combine an immersive large view of the 3D data with means to intuitively control this view through touch input on a separate mobile monoscopic tablet. This combination has the potential to increase people's acceptance of stereoscopic environments for 3D data visualization since, through touch-based interaction, it puts them in control of their data. Moreover, the indirect manipulation of (stereoscopically displayed) 3D data on a personal touch device has been shown to be potentially more efficient and precise than interaction directly on a large display.