Hand gesture based control with multi-modality data - towards surgical applications
Conference: BIBE 2019 - The Third International Conference on Biological Information and Biomedical Engineering
20.06.2019 - 22.06.2019 in Hangzhou, China
Proceedings: BIBE 2019
Pages: 4 | Language: English | Type: PDF
Sun, Yu; Miao, Lijie; Yuan, Zhenming (School of Information Science and Engineering, Hangzhou Normal University, Hangzhou, Zhejiang, China)
Sun, Xiaoyan (School of Information Science and Engineering, Hangzhou Normal University, Hangzhou, Zhejiang, China & Engineering Research Center of Mobile Health Management System, Ministry of Education, Hangzhou, Zhejiang, China)
Image-guided surgery provides the surgeon with additional information to perform the operation more accurately. However, visualizing the preoperative plan in an interactive way remains a challenge. Hand-gesture-based control provides a touchless user interface and is therefore a promising fit for the surgical environment. Compared to static hand gestures, dynamic gestures offer a more natural means of control; however, their complexity makes it difficult to extract sufficient features. In this paper, a multi-modality system was proposed for dynamic hand gesture recognition. Two types of features were defined, finger pose and hand motion, which were extracted by combining information from a Leap Motion controller and depth images. A Long Short-Term Memory (LSTM) network was trained on the extracted feature sequences for hand gesture recognition. Experiments were performed to evaluate the feasibility of the system on a 16-class dynamic gesture dataset, and an average accuracy of 94.10% was achieved. The results demonstrated that depth images can effectively compensate for features missed when the Leap Motion loses tracking, suggesting the system's potential for surgical applications.
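The recognition pipeline the abstract describes, a per-frame feature vector (finger pose plus hand motion) fed through an LSTM and classified into 16 gesture classes, can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the pure-numpy LSTM cell, the feature dimension, hidden size, and sequence length are all assumptions for demonstration.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step. W: (4H, D) input weights, U: (4H, H) recurrent
    weights, b: (4H,) bias; gates are stacked in order i, f, o, g."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = 1.0 / (1.0 + np.exp(-z[0:H]))      # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))    # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H]))  # output gate
    g = np.tanh(z[3*H:4*H])                # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def classify_gesture(frames, W, U, b, W_out, b_out):
    """Run the LSTM over a gesture's per-frame feature sequence and
    return softmax scores over the gesture classes."""
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for x in frames:                        # one feature vector per frame
        h, c = lstm_step(x, h, c, W, U, b)
    logits = W_out @ h + b_out              # classify from final hidden state
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Illustrative dimensions (assumed, not from the paper): D = 8 per-frame
# features (e.g. finger-pose angles + a hand-motion vector), H = 16 hidden
# units, 16 gesture classes, a 30-frame gesture sequence.
rng = np.random.default_rng(0)
D, H, C = 8, 16, 16
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
W_out = rng.standard_normal((C, H)) * 0.1
b_out = np.zeros(C)
frames = rng.standard_normal((30, D))       # one recorded gesture (untrained demo)
probs = classify_gesture(frames, W, U, b, W_out, b_out)
```

In the paper's system the weights would of course come from training on the 16-class gesture dataset; here they are random, so the sketch only shows the data flow from feature sequence to class probabilities.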