Hands-free user interface for AR/VR devices exploiting wearer’s facial gestures using unsupervised deep learning

Jaekwang Cha, Jinhyuk Kim, Shiho Kim

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits the wearer's facial gestures to recognize user intentions. The facial gestures of the headset wearer are captured by a custom-designed sensor that measures skin deformation based on the infrared diffusion characteristics of human skin. We designed a deep neural network classifier that determines the user's intended gestures from the skin-deformation data, and these gestures serve as input commands for the proposed UI system. The classifier is composed of a spatiotemporal autoencoder and a deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on online (real-time) sensor data to verify the operation of the device. The resulting hands-free UI achieved an average user-command recognition accuracy of 95.4% in tests with participants.
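The listing does not reproduce the implementation, but the classifier described above, a spatiotemporal autoencoder whose latent codes are grouped by deep embedded clustering (DEC, with the Student's t-distribution soft assignment of Xie et al.), can be sketched roughly as follows. This PyTorch sketch is illustrative only: the sensor channel count (4), window length (64 samples), latent size, cluster count, and all layer shapes are assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatiotemporalAutoencoder(nn.Module):
    """Encodes a window of multi-channel skin-deformation samples
    (batch x channels x time) into a low-dimensional latent code.
    All shapes below are illustrative, not the authors' values."""
    def __init__(self, n_channels=4, latent_dim=10):
        super().__init__()
        # 1-D convolutions over time capture spatiotemporal structure
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )
        self.to_latent = nn.Linear(32 * 16, latent_dim)    # assumes 64-sample windows
        self.from_latent = nn.Linear(latent_dim, 32 * 16)
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(32, 16, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(16, n_channels, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
        )

    def forward(self, x):
        h = self.encoder(x)                      # (B, 32, 16)
        z = self.to_latent(h.flatten(1))         # latent code (B, latent_dim)
        h_rec = self.from_latent(z).view(-1, 32, 16)
        return z, self.decoder(h_rec)            # code and reconstruction

class DECHead(nn.Module):
    """DEC soft assignment: similarity of each latent code to learnable
    cluster centroids via a Student's t kernel (alpha = 1)."""
    def __init__(self, n_clusters=5, latent_dim=10):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(n_clusters, latent_dim))

    def forward(self, z):
        dist_sq = torch.cdist(z, self.centroids).pow(2)
        q = 1.0 / (1.0 + dist_sq)                # q_ij before normalization
        return q / q.sum(dim=1, keepdim=True)    # rows sum to 1

def target_distribution(q):
    """Sharpened target P used in the DEC KL-divergence loss."""
    weight = q.pow(2) / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

# Unsupervised training, per the abstract: pre-train on reconstruction,
# then fine-tune with the DEC clustering loss added.
model, head = SpatiotemporalAutoencoder(), DECHead()
x = torch.randn(8, 4, 64)                        # a batch of sensor windows
z, x_rec = model(x)
q = head(z)
loss = F.mse_loss(x_rec, x) + F.kl_div(
    q.log(), target_distribution(q).detach(), reduction="batchmean")
```

At inference time, each gesture window would be assigned to the cluster with the highest soft-assignment probability, and each cluster mapped to one UI command; this mapping step is assumed here, as the abstract does not detail it.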

Original language: English
Article number: 4441
Journal: Sensors (Switzerland)
Volume: 19
Issue number: 20
DOIs
Publication status: Published - 2019 Oct 2

Bibliographical note

Publisher Copyright:
© 2019 by the authors. Licensee MDPI, Basel, Switzerland.

All Science Journal Classification (ASJC) codes

  • Analytical Chemistry
  • Information Systems
  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Electrical and Electronic Engineering
  • Biochemistry
