Assessing Consumer Attention and Arousal Using Eye-Tracking Technology in Virtual Retail Environment

Nayeon Kim, Hyunsoo Lee

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)


This study investigates how consumers visually experience the retail environment, establishing a foundation for deeper insights into visual merchandising strategies. Specifically, we experimentally recorded and analyzed consumers' visual attention and emotional arousal in a test setting and examined the influence of various design elements, as well as gender differences, on the recorded responses. We conducted an experiment combining eye-tracking and virtual reality to analyze visual attention and emotional arousal in response to spatial and design elements in an immersive retail environment, examining real-time measures of consumer interest and emotional response during the retail experience. Valid gaze data from 24 male and 22 female participants were used to analyze total dwell time (TDT), total fixation count (TFC), and average pupil diameter (APD). Consumers' visual attention and emotional arousal differed across specific areas of interest according to the spatial arrangement of the sales and service areas. We statistically analyzed gender differences in consumer responses and performed a correlation analysis between visual attention and emotional arousal. Our findings provide insight into tailoring retail environment design to target consumers and contribute to building visual merchandising strategies.
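The three gaze metrics named above can be aggregated per area of interest (AOI) from fixation-level data: TDT is the summed fixation duration inside an AOI, TFC the number of fixations, and APD the mean pupil diameter across those fixations. A minimal sketch, assuming fixations arrive as dicts with hypothetical field names (`aoi`, `duration_ms`, `pupil_mm` — real eye-tracker exports vary):

```python
from collections import defaultdict

def aoi_metrics(fixations):
    """Aggregate per-AOI gaze metrics: total dwell time (TDT),
    total fixation count (TFC), average pupil diameter (APD).

    `fixations` is a list of dicts with keys 'aoi', 'duration_ms',
    'pupil_mm' (hypothetical field names for illustration).
    """
    acc = defaultdict(lambda: {"TDT_ms": 0.0, "TFC": 0, "pupil_sum": 0.0})
    for f in fixations:
        a = acc[f["aoi"]]
        a["TDT_ms"] += f["duration_ms"]   # dwell time accumulates over fixations
        a["TFC"] += 1                     # one count per fixation
        a["pupil_sum"] += f["pupil_mm"]   # summed for the APD mean below
    return {
        aoi: {
            "TDT_ms": v["TDT_ms"],
            "TFC": v["TFC"],
            "APD_mm": v["pupil_sum"] / v["TFC"],
        }
        for aoi, v in acc.items()
    }

# Illustrative input: three fixations over two AOIs.
fixations = [
    {"aoi": "display_shelf", "duration_ms": 220, "pupil_mm": 3.1},
    {"aoi": "display_shelf", "duration_ms": 180, "pupil_mm": 3.3},
    {"aoi": "checkout", "duration_ms": 300, "pupil_mm": 2.9},
]
print(aoi_metrics(fixations))
```

In practice the raw gaze stream must first be segmented into fixations (e.g. by a dispersion- or velocity-based event filter) and mapped onto AOIs; the sketch only covers the aggregation step.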

Original language: English
Article number: 665658
Journal: Frontiers in Psychology
Publication status: Published - 2021 Aug 9

Bibliographical note

Publisher Copyright:
© 2021 Kim and Lee.

All Science Journal Classification (ASJC) codes

  • General Psychology

