k-TR Theory for Balance of Nature and Nurture in Robotic Perception
Conference: ROBOTIK 2012 - 7th German Conference on Robotics
05/21/2012 - 05/22/2012 at Munich, Germany
Proceedings: ROBOTIK 2012
Pages: 5 | Language: English | Type: PDF
Varadarajan, Karthik Mahesh; Vincze, Markus (Technical University of Vienna, Austria)
There has been a long-standing debate on the roles of nature and nurture in the design of algorithms for domestic and humanoid robots. While the quintessential cognitive robot requires a balance of both attributes for successful learning and operation in domestic environments, the boundary between these modalities has remained unclear. Robots that start with a predefined set of algorithms are inflexible in adapting to the environment of interest, while those that depend on external resources such as the Internet require real-time, context-sensitive processing of extensive information. Furthermore, robots using extrinsic information still require a basic set of algorithmic primitives to initiate the external query process, and they are restricted in the modalities of data that can be accessed and in the structure and content of the selected data. In the case of visual data definitions for perception, most frameworks for domestic robots employ exhaustive 2D/3D object models coupled with semantic-web-based reasoners to recognize the object of interest for further processing. Alternatively, robots are subjected to a training phase in which human intervention is necessary to demonstrate or tutor the robot on objects of interest in the given environment. These approaches are neither efficient nor scalable, given the wide range of objects that a robot might encounter. In this paper, we present an alternative approach to knowledge representation in robots through a combination of both ‘nature’ and ‘nurture’ algorithmic modules. This approach is based on the k-TR theory of visual perception, which attempts to explain visual perception and object recognition through affordances. This evolutionary-psychophysics theory has been supported by linguistic, neurobiological and psychophysical priming studies. We also demonstrate a schema for symbolic cognitive architectures to represent visual information in robots.
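The schema itself is defined in the full paper; as a loose illustration of the nature/nurture split the abstract describes, the following hypothetical Python sketch represents an object concept as a small set of innate affordance primitives ('nature') that is extended by affordances acquired through interaction ('nurture'). All class and variable names here (`Affordance`, `ObjectConcept`, `observe`) are assumptions for illustration, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Affordance:
    """An action possibility offered by an object (e.g. 'graspable')."""
    name: str

@dataclass
class ObjectConcept:
    """Symbolic object entry: innate ('nature') affordance primitives
    plus affordances acquired at runtime ('nurture')."""
    label: str
    innate: frozenset = frozenset()
    learned: set = field(default_factory=set)

    def observe(self, affordance: Affordance) -> None:
        # Runtime learning: record affordances not covered by the innate set.
        if affordance not in self.innate:
            self.learned.add(affordance)

    def affordances(self) -> set:
        # The concept's full affordance profile combines both modalities.
        return set(self.innate) | self.learned

# 'Nature': the robot starts with a small set of affordance primitives.
GRASPABLE = Affordance("graspable")
CONTAINABLE = Affordance("containable")

mug = ObjectConcept("mug", innate=frozenset({GRASPABLE}))
# 'Nurture': interaction with the environment adds a new affordance.
mug.observe(CONTAINABLE)
```

Under this sketch, recognition queries would match against the combined affordance profile rather than an exhaustive 2D/3D appearance model, which is the scalability argument the abstract makes.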