A gaze-based attention model for spatially-aware hearing aids

Conference: Speech Communication - 13. ITG-Fachtagung Sprachkommunikation
10.10.2018 - 12.10.2018 in Oldenburg, Germany

Proceedings: Speech Communication

Pages: 5 | Language: English | Type: PDF


Authors:
Grimm, Giso; Kayser, Hendrik; Hendrikse, Maartje; Hohmann, Volker (Medizinische Physik and Cluster of Excellence "Hearing4all", Department of Medical Physics and Acoustics, University of Oldenburg, Germany)

Abstract:
Spatial filtering and the decomposition of sounds into acoustic source objects are increasingly investigated for speech enhancement in hearing aids. However, with the increasing performance and availability of these 'space-aware' algorithms, knowledge of the user's personal listening preferences and of the attended sources becomes crucial. In the approach presented here, eye gaze direction is combined with an acoustic analysis of sound source positions to identify the attended source in a mixture of sources in an audiovisual scene. Gaze direction is recorded by electrooculography and a head-tracking system, which would also be feasible in hearing aids. The spatio-temporal distribution of source positions is estimated from the input signals of a binaural hearing device. The gaze direction of 14 listeners was recorded in a multi-talker steered-attention task to estimate the attended source. Simulated spatial filtering based on individual gaze demonstrated an SNR benefit of up to 7 dB.
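
To illustrate the core idea of the abstract, the following Python sketch shows how a gaze azimuth could be matched against acoustically estimated source azimuths to select the attended source, and how a very simple spatial filter could then be steered toward it. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the two-microphone delay-and-sum beamformer, and the assumed microphone spacing are all hypothetical.

# Minimal sketch (not the authors' method): pick the source closest to the
# current gaze direction and steer a simple two-microphone delay-and-sum
# beamformer toward it. All names and parameters are illustrative assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
MIC_SPACING = 0.15      # m, assumed spacing between left/right device microphones


def attended_source(gaze_azimuth_deg, source_azimuths_deg):
    """Return the index of the source whose azimuth is closest to the gaze azimuth."""
    diffs = np.abs(((np.asarray(source_azimuths_deg) - gaze_azimuth_deg) + 180.0) % 360.0 - 180.0)
    return int(np.argmin(diffs))


def delay_and_sum(left, right, azimuth_deg, fs):
    """Align the right channel to the left for a plane wave from the given azimuth and average."""
    # Simplified free-field interaural time difference model.
    itd = MIC_SPACING * np.sin(np.deg2rad(azimuth_deg)) / SPEED_OF_SOUND
    shift = int(round(itd * fs))
    right_aligned = np.roll(right, shift)
    return 0.5 * (left + right_aligned)


if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    # Synthetic binaural input standing in for the hearing-device signals.
    left = np.sin(2 * np.pi * 440 * t)
    right = np.roll(left, 3)

    # Acoustically estimated candidate azimuths and the current gaze azimuth (degrees).
    sources = [-60.0, 0.0, 45.0]
    gaze = 40.0

    idx = attended_source(gaze, sources)
    enhanced = delay_and_sum(left, right, sources[idx], fs)
    print(f"Attended source index: {idx}, azimuth: {sources[idx]} deg")

In the paper's setting, the gaze azimuth would come from electrooculography combined with head tracking, and the candidate azimuths from the acoustic analysis of the binaural input; the spatial filter used there is a simulated one, for which the delay-and-sum stage above is only a stand-in.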