Occlusion Handling in Augmented Reality User Interfaces for Robotic Systems

Conference: ISR/ROBOTIK 2010 - ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics)
06/07/2010 - 06/09/2010 at Munich, Germany

Proceedings: ISR/ROBOTIK 2010

Pages: 7
Language: English
Type: PDF


Authors:
Sauer, Markus (Zentrum für Telematik e.V., Gerbrunn, Germany)
Leutert, Florian; Schilling, Klaus (Department of Robotics and Telematics, University of Würzburg, Germany)

Abstract:
When working with Augmented Reality (AR) interfaces, achieving realistic occlusion between real and virtual objects can be critical for certain applications: occlusion is one of the most decisive cues in human depth perception. This paper presents several techniques for achieving accurate and stable occlusion at moderate computational cost. First, a model-based approach is shown, which is then extended with the measurements of a spatial sensor so that dynamic occlusions caused by objects not contained in the existing 3D model can also be determined. Finally, a technique that requires no predefined 3D models at all is introduced, using distance information from a 3D time-of-flight camera. This approach also allows switching from the egocentric view of the camera providing the images for the AR application to an arbitrary exocentric view, and it can be employed directly on a stereo display without deploying a real stereo camera.
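
To make the depth-sensor idea from the abstract concrete, the sketch below shows per-pixel occlusion compositing: a rendered virtual layer is blended over the camera image only where the virtual surface lies in front of the measured real-world depth. This is a minimal illustration, not code from the paper; all names (composite_with_occlusion, measured_depth, etc.) are hypothetical, and it assumes the time-of-flight depth image has already been registered to the video camera's viewpoint.

```python
# Minimal sketch of depth-based occlusion compositing for an AR overlay.
# Assumes the measured depth map is already aligned pixel-for-pixel with the
# camera image; all shapes and names are illustrative.
import numpy as np

def composite_with_occlusion(camera_rgb, virtual_rgb, virtual_alpha,
                             virtual_depth, measured_depth):
    """Overlay virtual content onto the camera image, hiding virtual pixels
    that lie behind real objects according to the measured depth map.

    camera_rgb     -- (H, W, 3) real camera image
    virtual_rgb    -- (H, W, 3) rendered virtual objects
    virtual_alpha  -- (H, W)    coverage of the virtual rendering (0..1)
    virtual_depth  -- (H, W)    depth of the virtual objects (metres, inf where empty)
    measured_depth -- (H, W)    depth from the time-of-flight camera (metres)
    """
    # A virtual pixel is visible only where something virtual was rendered
    # and it is closer to the camera than the measured real surface.
    visible = (virtual_alpha > 0) & (virtual_depth < measured_depth)

    alpha = np.where(visible, virtual_alpha, 0.0)[..., None]
    return (alpha * virtual_rgb + (1.0 - alpha) * camera_rgb).astype(camera_rgb.dtype)


if __name__ == "__main__":
    H, W = 480, 640
    cam = np.full((H, W, 3), 128, dtype=np.uint8)              # dummy camera frame
    vr = np.zeros((H, W, 3), dtype=np.uint8); vr[..., 1] = 255  # green virtual object
    va = np.zeros((H, W)); va[200:300, 250:400] = 1.0           # virtual box coverage
    vd = np.full((H, W), np.inf); vd[200:300, 250:400] = 2.0    # virtual box at 2 m
    md = np.full((H, W), 3.0); md[:, 300:] = 1.5                # real obstacle at 1.5 m on the right
    out = composite_with_occlusion(cam, vr, va, vd, md)         # right part of the box is occluded
```

The same per-pixel depth test is what a model-based variant performs implicitly when phantom geometry of known real objects is rendered into the depth buffer; the time-of-flight camera simply supplies that depth for objects with no model.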