Enhancing emotion recognition using multimodal fusion of physiological, environmental, personal data

Hakpyeong Kim, Taehoon Hong

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Human emotion recognition, crucial for interpersonal relations and human-building interaction, identifies emotions from various behavioral signals to improve user interactions. To enhance the performance of emotion recognition, this study proposed a novel model that fuses physiological, environmental, and personal data. A unique dataset was created via experiments conducted in an environmental chamber, and an emotion recognition model was subsequently developed using a multimodal fusion approach. The model transforms physiological data into 2D images to capture both time-series and spatial features, and it uniquely incorporates metadata, including environmental and personal data. The model's generalizability was validated using a leave-one-sample-out approach. The results showed a 31.6% reduction in the error of the predicted area when physiological, environmental, and personal data were fused in the emotion recognition model, suggesting that incorporating various contextual factors beyond physiological changes, such as the surrounding environment and inherent or acquired individual traits, can significantly enhance the model's understanding of emotions. Furthermore, the model was found to be robust to individual differences, offering consistent emotion recognition across different subjects. These findings suggest that the proposed model can serve as a potent tool for emotion recognition in built-environment applications.
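For readers curious how such a fusion might be wired up, below is a minimal sketch (not the authors' implementation) of the two-branch architecture the abstract describes: a CNN over 2D-encoded physiological signals and a small MLP over environmental/personal metadata, fused by concatenation. The 32x32 image size, 8-dimensional metadata vector, layer widths, and 4-dimensional output are all illustrative assumptions; the abstract specifies neither the particular 2D encoding (a Gramian angular field or recurrence plot is a common choice) nor the output space.

import torch
import torch.nn as nn

class FusionEmotionModel(nn.Module):
    """Two-branch fusion: CNN over 2D-encoded physiological signals + MLP over metadata.
    All dimensions below are illustrative assumptions, not values from the paper."""
    def __init__(self, meta_dim: int = 8, n_outputs: int = 4):
        super().__init__()
        # Branch 1: 2D images derived from physiological time series
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),  # 32 channels * 8 * 8 spatial = 2048 features for a 32x32 input
        )
        # Branch 2: environmental (e.g., air temperature) and personal (e.g., age) metadata
        self.meta = nn.Sequential(nn.Linear(meta_dim, 32), nn.ReLU())
        # Fusion head: concatenated features from both branches -> emotion outputs
        self.head = nn.Sequential(
            nn.Linear(2048 + 32, 64), nn.ReLU(), nn.Linear(64, n_outputs)
        )

    def forward(self, image: torch.Tensor, metadata: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.cnn(image), self.meta(metadata)], dim=1)
        return self.head(fused)

# Dummy forward pass: a batch of 4 single-channel 32x32 images plus 8-dim metadata
model = FusionEmotionModel()
out = model(torch.randn(4, 1, 32, 32), torch.randn(4, 8))
print(out.shape)  # torch.Size([4, 4])

Evaluating such a model with the leave-one-sample-out scheme mentioned in the abstract amounts to training on all samples but one, testing on the held-out sample, and repeating for every sample; scikit-learn's LeaveOneOut iterator produces exactly this split.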

Original language: English
Article number: 123723
Journal: Expert Systems with Applications
Volume: 249
DOIs
Publication status: Published - 2024 Sept 1

Bibliographical note

Publisher Copyright: © 2024 Elsevier Ltd

All Science Journal Classification (ASJC) codes

  • General Engineering
  • Computer Science Applications
  • Artificial Intelligence
