Immersive Teleoperation of the Eye Gaze of Social Robots – Assessing Gaze-Contingent Control of Vergence, Yaw and Pitch of Robotic Eyes

Conference: ISR 2018 - 50th International Symposium on Robotics
06/20/2018 - 06/21/2018 at München, Germany

Proceedings: ISR 2018

Pages: 8
Language: English
Type: PDF


Cambuzat, Rémi (GIPSA-lab, Univ. Grenoble-Alpes, CNRS, Grenoble INP, Grenoble, France & CITI-lab, INSA Lyon, INRIA, Lyon, France)
Elisei, Frédéric; Bailly, Gérard (GIPSA-lab, Univ. Grenoble-Alpes, CNRS, Grenoble INP, Grenoble, France)
Simonin, Olivier (CITI-lab, INSA Lyon, INRIA, Lyon, France)
Spalanzani, Anne (Université Grenoble-Alpes, INRIA, Grenoble, France)

This paper presents a new teleoperation system, called stereo gaze-contingent steering (SGCS), that seamlessly controls the vergence, yaw and pitch of the eyes of a humanoid robot (here an iCub robot) from the actual gaze direction of a remote pilot. The video stream captured by the cameras embedded in the mobile eyes of the iCub is fed into an HTC Vive head-mounted display equipped with an SMI binocular eye-tracker. SGCS achieves an effective coupling between the eye-tracked gaze of the pilot and the robot's eye movements. It both ensures a faithful reproduction of the pilot's eye movements, which is a prerequisite for the readability of the robot's gaze patterns by its interlocutor, and preserves the pilot's oculomotor visual cues, which avoids the fatigue and sickness caused by sensorimotor conflicts. We here assess the precision of this servo-control by asking several pilots to gaze at known objects positioned in the remote environment. We demonstrate that vergence can be controlled with a precision similar to that of the eyes' azimuth and elevation. This system opens the way for robot-mediated human interactions in the personal space, notably when objects in the shared working space are involved.
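The coupling described in the abstract maps the pilot's binocular gaze onto three angular commands (yaw, pitch, vergence). As an illustration only, here is a minimal sketch of such a mapping, assuming the eye-tracker provides a 3-D fixation point in a head-centered frame and assuming a hypothetical interocular distance; neither the function name nor the parameter values come from the paper.

```python
import math

def gaze_to_eye_angles(x, y, z, ipd=0.065):
    """Map a fixation point (x, y, z) in a head-centered frame
    (x right, y up, z forward, in metres) to yaw, pitch and
    vergence angles in radians. The 6.5 cm interocular distance
    (ipd) is an illustrative assumption, not a value from the paper."""
    # Cyclopean yaw (azimuth) and pitch (elevation) towards the target.
    yaw = math.atan2(x, z)
    pitch = math.atan2(y, math.hypot(x, z))
    # Vergence: angle between the two lines of sight, approximated
    # from the interocular distance and the target distance.
    distance = math.sqrt(x * x + y * y + z * z)
    vergence = 2.0 * math.atan2(ipd / 2.0, distance)
    return yaw, pitch, vergence

# A target straight ahead at 1 m: zero yaw and pitch, small vergence.
yaw, pitch, vergence = gaze_to_eye_angles(0.0, 0.0, 1.0)
```

In a real gaze-contingent servo-loop, these three angles would be sent as set-points to the robot's eye controllers at the eye-tracker's sampling rate.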