Augmented reality (AR) constitutes a powerful three-dimensional user interface for many "hands-on" application scenarios in which users cannot sit at a conventional desktop computer. To fully exploit the AR paradigm, the computer must not only augment the real world but also accept feedback from it. Such feedback is typically collected via gesture languages, 3D pointers, or speech input, all tools that require users to communicate with the computer about their work at a meta-level rather than simply letting them pursue their task. When the computer can deduce progress directly from changes in the real world, the need for such abstract communication interfaces can be reduced or even eliminated. In this paper, we present an optical approach to analyzing and tracking users and the objects they work with. In contrast to emerging workbench and metaDESK approaches, our system can be set up in any room after a few known optical targets have been quickly placed in the scene. We present three demonstration scenarios to illustrate the overall concept and potential of our approach, and then discuss the research issues involved.