Abstract
Developing a user interface (UI) suited to headset environments is one of the open challenges in augmented reality (AR) technology. This study proposes a hands-free UI for an AR headset that recognizes user intentions from the wearer's facial gestures. The gestures are detected by a custom-designed sensor that measures skin deformation based on the infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user's intended gestures from the skin-deformation data, and the recognized gestures serve as input commands for the proposed UI system. The classifier combines a spatiotemporal autoencoder with a deep embedded clustering algorithm and is trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments on online (real-time) sensor data verified its operation. In tests with participants, the resulting hands-free UI recognized user commands with an average accuracy of 95.4%.
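The paper itself does not include code, but the classifier described in the abstract (a spatiotemporal autoencoder paired with deep embedded clustering, trained without labels) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the channel count, window length, layer sizes, and number of gesture clusters are placeholders, not values from the paper, and the DEC head follows the generic formulation of Xie et al. (2016) rather than the authors' exact design.

```python
# Minimal sketch (not the authors' code): a spatiotemporal autoencoder with a
# deep embedded clustering (DEC) head, trained unsupervised on windows of
# multi-channel skin-deformation signals. All sizes below are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

C, T, LATENT, N_GESTURES = 4, 64, 16, 5  # assumed: 4 IR channels, 64-sample windows


class SpatioTemporalAE(nn.Module):
    def __init__(self):
        super().__init__()
        # "Spatial" mixing across sensor channels via a 1D conv,
        # "temporal" structure via a GRU over the conv features.
        self.conv = nn.Conv1d(C, 32, kernel_size=5, padding=2)
        self.enc_rnn = nn.GRU(32, LATENT, batch_first=True)
        self.dec_rnn = nn.GRU(LATENT, 32, batch_first=True)
        self.out = nn.Conv1d(32, C, kernel_size=5, padding=2)

    def encode(self, x):                            # x: (B, C, T)
        h = F.relu(self.conv(x)).transpose(1, 2)    # (B, T, 32)
        _, z = self.enc_rnn(h)                      # final hidden state
        return z.squeeze(0)                         # (B, LATENT)

    def forward(self, x):
        z = self.encode(x)
        zt = z.unsqueeze(1).expand(-1, T, -1)       # repeat latent per timestep
        h, _ = self.dec_rnn(zt)
        return self.out(h.transpose(1, 2)), z       # reconstruction, latent


class DECHead(nn.Module):
    """Soft cluster assignments via a Student's t kernel over latent codes."""
    def __init__(self):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(N_GESTURES, LATENT))

    def forward(self, z):
        d2 = torch.cdist(z, self.centroids) ** 2
        q = (1.0 + d2) ** -1
        return q / q.sum(dim=1, keepdim=True)       # (B, N_GESTURES)


def target_distribution(q):
    # Sharpen the soft assignments; self-training target for the KL loss.
    p = q ** 2 / q.sum(dim=0)
    return p / p.sum(dim=1, keepdim=True)


if __name__ == "__main__":
    ae, dec = SpatioTemporalAE(), DECHead()
    opt = torch.optim.Adam(list(ae.parameters()) + list(dec.parameters()), lr=1e-3)
    x = torch.randn(32, C, T)                       # stand-in for a sensor batch
    for _ in range(5):                              # toy joint training loop
        recon, z = ae(x)
        q = dec(z)
        p = target_distribution(q).detach()
        loss = F.mse_loss(recon, x) + F.kl_div(q.log(), p, reduction="batchmean")
        opt.zero_grad()
        loss.backward()
        opt.step()
    print("predicted gesture ids:", q.argmax(dim=1)[:8].tolist())
```

In practice the random batch would be replaced by sliding windows over the streamed IR sensor readings, and a DEC pipeline typically pretrains the autoencoder on reconstruction alone and initializes the centroids with k-means on the latent codes before the joint clustering phase; the cluster index with the highest soft assignment then acts as the recognized command.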
| Original language | English |
| --- | --- |
| Article number | 4441 |
| Journal | Sensors (Switzerland) |
| Volume | 19 |
| Issue number | 20 |
| DOIs | |
| Publication status | Published - 2019 Oct 2 |
Bibliographical note
Publisher Copyright: © 2019 by the authors. Licensee MDPI, Basel, Switzerland.
All Science Journal Classification (ASJC) codes
- Analytical Chemistry
- Information Systems
- Instrumentation
- Atomic and Molecular Physics, and Optics
- Electrical and Electronic Engineering
- Biochemistry