Scene Representation for Anthropomorphic Robots: A Dynamic Neural Field Approach

Conference: ISR/ROBOTIK 2010 - ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics)
06/07/2010 - 06/09/2010 at Munich, Germany

Proceedings: ISR/ROBOTIK 2010

Pages: 7
Language: English
Type: PDF


Authors:
Zibner, Stephan K. U.; Faubel, Christian; Iossifidis, Ioannis; Schöner, Gregor (Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany)

Abstract:
For autonomous robotic systems, the ability to represent a scene and to memorize and track objects together with their associated features is a prerequisite for reasonable interactive behavior. In this paper, we present a biologically inspired architecture for scene representation based on Dynamic Field Theory. At the core of the architecture, we use three-dimensional Dynamic Neural Fields to represent space-feature associations. These associations are built up autonomously and sequentially, and they are maintained and continuously updated. We demonstrate these capabilities in two experiments on an anthropomorphic robotic platform. The first experiment shows the sequential scanning of a scene. The second demonstrates the maintenance of associations for objects that move out of view and the correct updating of the scene representation when such objects are removed.
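
The core mechanism behind such a representation is the Dynamic Neural Field. The snippet below is a minimal one-dimensional sketch of Amari-type field dynamics, not the authors' implementation: the paper's fields are three-dimensional (space plus feature dimensions), and all parameter values, kernel choices, and names here are illustrative assumptions. It illustrates the working-memory property the abstract relies on: a localized input creates an activation peak that remains self-sustained after the input is removed.

```python
import numpy as np

# Minimal sketch of a one-dimensional Dynamic Neural Field (Amari dynamics):
#   tau * du/dt = -u + h + stimulus + local excitation - global inhibition
# The paper's architecture uses three-dimensional fields; all parameter
# values and names below are illustrative assumptions, not from the paper.

def gaussian(d, sigma):
    return np.exp(-0.5 * (d / sigma) ** 2)

def step(u, stimulus, exc_kernel, h=-2.0, tau=10.0, dt=1.0, g_inh=2.0, beta=4.0):
    """One Euler step of the field dynamics."""
    output = 1.0 / (1.0 + np.exp(-beta * np.clip(u, -20.0, 20.0)))  # sigmoid output
    excitation = np.convolve(output, exc_kernel, mode="same")       # local excitation
    inhibition = g_inh * output.sum()                                # global inhibition
    return u + dt / tau * (-u + h + stimulus + excitation - inhibition)

n = 100
x = np.arange(n)
exc_kernel = 10.0 * gaussian(np.arange(-20, 21), 3.0)

u = np.full(n, -2.0)                    # field starts at its resting level h
stimulus = 6.0 * gaussian(x - 30, 4.0)  # localized input, e.g. a salient location

for _ in range(200):                    # stimulus present: an activation peak builds up
    u = step(u, stimulus, exc_kernel)
for _ in range(200):                    # stimulus removed: the peak is self-sustained
    u = step(u, np.zeros(n), exc_kernel)

print("peak position:", int(np.argmax(u)), "peak activation:", round(float(u.max()), 1))
```

With these assumed parameters, the printed peak remains at the stimulated location even after the input is switched off, which is the kind of sustained activation that allows associations to be maintained for objects that have moved out of view.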