Developing Enabling Technologies for Ambient Assisted Living: Natural Language Interfaces, Automatic Focus Detection and User State Recognition

Conference: Ambient Assisted Living - AAL - 1. Deutscher AAL-Kongress mit Ausstellung / Technologien - Anwendungen - Management
30.01.2008 - 01.02.2008 in Berlin, Germany

Proceedings: Ambient Assisted Living - AAL

Pages: 5 | Language: English | Type: PDF


Authors:
Hönig, Florian (Institute of Pattern Recognition (LME), University of Erlangen-Nuremberg, Germany)
Hacker, Christian; Nöth, Elmar; Hornegger, Joachim (LME, University of Erlangen-Nuremberg, Germany)
Warnke, Volker (Sympalog Voice Solutions GmbH, Erlangen, Germany)
Kornhuber, Johannes (Department of Psychiatry and Psychotherapy, University Hospital Erlangen, Germany)

Abstract:
Parallel to the demographic change, research on technologies for assisted living that support the "silver generation" is growing. Intuitive and practical user interfaces are indispensable for such technology. We take an approach based on automatic speech recognition that satisfies these requirements and present a prototype system that, for example, controls household appliances and initiates and accepts telephone calls. It has a natural-language interface: spontaneous speech is allowed, and the user does not have to learn special commands. Reminder functions for taking medicine and an emergency call are also implemented. To avoid a push-to-talk button, we provide algorithms that automatically recognize the intended addressee of an utterance. Additionally, we classify the user's focus of attention from video recordings, which is useful if the system's interface uses an avatar. Another goal in assisted living is to monitor the user's health automatically, which can be accomplished by measuring body functions. Sensors for acquiring these physiological signals can nowadays be integrated into clothing, and the signals can be transmitted wirelessly. In this paper, generic algorithms for automatically distinguishing between affective user states are presented. In the future, we will apply these solutions to health monitoring.
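The abstract only names the recognition tasks; the features and classifiers are not specified on this page. As a minimal sketch, assuming a conventional feature-vector-plus-classifier setup (fixed-length statistics of prosodic or physiological measurements fed to an SVM) and using synthetic stand-in data, a two-class user-state recognizer could be prototyped as follows:

import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data, purely illustrative: 200 utterances or signal
# windows, each described by 12 features (e.g. pitch/energy statistics or
# heart-rate descriptors). The paper's actual feature set is not given here.
X = rng.normal(size=(200, 12))
y = rng.integers(0, 2, size=200)  # 0 = neutral, 1 = e.g. agitated

# Standardize features, then train an RBF-kernel SVM; report 5-fold accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.2f}")

The same pipeline shape would plausibly carry over to the addressee-detection task (on-talk vs. off-talk) by swapping in prosodic features per utterance; this is an assumption for illustration, not the authors' documented method.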