Dynamic behaviour conception for EmI companion robot

Conference: ISR/ROBOTIK 2010 - ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics)
7–9 June 2010, Munich, Germany

Proceedings: ISR/ROBOTIK 2010

Pages: 8
Language: English
Type: PDF

Authors:
Saint-Aimé, Sébastien; Jost, Céline; Le-Pévédic, Brigitte; Duhaut, Dominique (Valoria, University of Bretagne Sud, Vannes, France)

Abstract:
This article presents research work carried out in the domain of nonverbal emotional interaction for the EmotiRob project. EmotiRob is a component of the MAPH project, whose objective is to comfort vulnerable children and/or children undergoing long-term hospitalisation with the help of an emotional robot companion. It is important to note that we are not trying to reproduce human emotion and behaviour, but to make a robot emotionally expressive. The studies carried out on perception and emotional synthesis have allowed us to develop our emotional model of interaction, iGrace. At present, iGrace allows a system to display emotions in a static mode. This mode has already been evaluated with ArtE, an avatar we developed in Flash. The satisfaction rate obtained in this evaluation (86%) led us to embed the static mode on our robotic platform EmI (Emotional Model of Interaction). We now want to add dynamics to the EmI robot and make its reactions more lifelike. This paper presents the different hypotheses used for the iGrace emotional model, the algorithm for behaviour dynamics, the evaluation of the static and dynamic modes, and the design of the EmI robot. We begin the article with a presentation of the MAPH and EmotiRob projects. We then briefly describe the computational model of emotional experience, iGrace, that we have created, the integration of dynamics, and the evaluation of iGrace. We conclude with a description of the architecture of EmI, as well as the improvements to be made to its next generation.
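
Illustrative note: the abstract does not detail the behaviour-dynamics algorithm, so the code below is only a minimal sketch of the general idea of turning a static emotional expression into a time-varying one. It assumes hypothetical joint names and a cosine-eased interpolation from a neutral pose to a target pose; it is not the authors' iGrace algorithm or the actual EmI control interface.

import math

# Hypothetical joint targets (degrees) for a "joy" expression; the real EmI
# head has its own degrees of freedom and value ranges.
NEUTRAL = {"mouth_left": 0.0, "mouth_right": 0.0, "eyebrow_left": 0.0, "eyebrow_right": 0.0}
JOY = {"mouth_left": 25.0, "mouth_right": 25.0, "eyebrow_left": 10.0, "eyebrow_right": 10.0}

def eased_pose(start, target, t):
    """Interpolate joint angles with cosine ease-in/ease-out, t in [0, 1]."""
    alpha = (1.0 - math.cos(math.pi * t)) / 2.0
    return {joint: start[joint] + alpha * (target[joint] - start[joint]) for joint in start}

def play_expression(start, target, duration_s=1.0, rate_hz=20):
    """Yield intermediate poses so the expression unfolds over time instead of snapping to it."""
    steps = int(duration_s * rate_hz)
    for i in range(steps + 1):
        yield eased_pose(start, target, i / steps)

for pose in play_expression(NEUTRAL, JOY):
    print(pose)  # in practice, each pose would be sent to the robot's servo controller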