Low-latency teleoperation refers to highly interactive teleoperation in which human operators control a remote robot in a temporally continuous way, much as they would control their own arm or overall body position. Depending on the application, this regime can also include control in which a move-and-wait strategy is used, provided that the operators do not engage in substantial strategic movement planning while waiting for the robot to catch up with their commands.

There are three factors that influence user difficulty during such low-latency operation: temporal, geometric, and procedural. The research discussed in this presentation focuses primarily on the consequences of geometric misalignment of the display and control axes, as occurs when the remote camera used for teleoperation is misaligned with respect to the user's torso-referenced view. However, some studies of the interaction between latency and the control difficulty caused by this misalignment will also be included, since they demonstrate an interesting multiplicative interaction between the two factors.

We find that a constant-angular-targeting-error model can be used to model the trajectories of some of the targeting hand movements we have studied. This model is similar to an equiangular-spiral model discussed by Rushton and colleagues for human walking and by Tucker with respect to the diving trajectories of raptors (birds of prey). These models point to a natural measure of the misalignment phenomenon, yielding an explanation for the effects of rotations less than about 65° about cardinal axes. However, deeper analysis will be needed to explain the effects of larger rotations. Our recent experiments studying the most general type of 3D motion also reveal some interesting spatial anisotropies that probably reflect the way three-dimensional movements are internally represented. In general, the effect of display-control misalignment may be described by what we call the Misalignment Disturbance Function (MDF). Future research will be directed toward explaining the specific features of this function.
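The geometry behind a constant-angular-targeting-error model can be illustrated with a simple simulation (a sketch for intuition, not the authors' actual model or parameters): if each small hand movement is aimed at the target but rotated by a constant angular error, the resulting path is an equiangular (logarithmic) spiral that still converges on the target, at the cost of a longer trajectory.

```python
import math

def simulate_constant_error_path(start, target, error_deg, step=0.01, tol=0.02):
    """Move a cursor toward a target in small steps, rotating each step's
    heading by a constant angular error relative to the current
    cursor-to-target direction. Illustrative only: function name, step size,
    and stopping tolerance are assumptions, not values from the study.

    Returns the list of (x, y) points visited."""
    x, y = start
    tx, ty = target
    err = math.radians(error_deg)
    path = [(x, y)]
    for _ in range(100_000):
        dx, dy = tx - x, ty - y
        if math.hypot(dx, dy) < tol:   # close enough: target acquired
            break
        bearing = math.atan2(dy, dx) + err   # constant misaimed heading
        x += step * math.cos(bearing)
        y += step * math.sin(bearing)
        path.append((x, y))
    return path
```

For a constant error ε below 90°, each step still reduces the distance to the target by roughly `step * cos(ε)`, so the spiral converges, and the total path length exceeds the straight-line distance by a factor of about `1 / cos(ε)`; this kind of path-length inflation is one natural way to quantify the cost of a given misalignment.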

Hand movement of a virtual blue spherical cursor to touch a virtual green spherical target under conditions of misalignment between display and control coordinates. This problem arises during teleoperation when the remote camera is not properly aligned.


Stephen R. Ellis has headed the Advanced Displays and Spatial Perception Laboratory at the NASA Ames Research Center. He received an A.B. in Behavioral Science from U.C. Berkeley (1969) and a Ph.D. in Psychology from McGill University (1974), and has held postdoctoral fellowships in physiological optics and bioengineering at Brown University and U.C. Berkeley, respectively. He has published over 200 journal articles and reports on user interaction with spatial information and has been at the forefront of the introduction of perspective and 3D formats into user interfaces for aerospace applications. More recently he has pursued the study of virtual environments as user interfaces and as scientific instruments. He has served on the editorial boards of Presence, Virtual Reality, and Human Factors, and has edited a book, Pictorial Communication in Virtual and Real Environments (Taylor & Francis, London, 2nd ed., 1993).
