
Immersion Measurement in Watching Videos Using Eye-tracking Data

Abstract
Immersion plays a crucial role in video watching, leading viewers to a positive experience such as increased engagement and decreased fatigue. However, few studies measure immersion during video watching, and questionnaires are typically used to measure immersion in other applications. These methods may rely on the viewer's memory and produce biased results. Therefore, we propose an objective immersion detection model that leverages people's gaze behavior while watching videos. In a lab study with 30 participants, an in-depth analysis is carried out on a number of gaze features and machine learning (ML) models to identify the immersion state. Several gaze features are highly indicative of immersion, and ML models using these features are able to detect the immersion state of video watchers. Post-hoc interviews demonstrate that our approach is applicable to measuring immersion in the middle of watching a video, and some practical issues are discussed as well.
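The abstract describes extracting gaze features from eye-tracking data and feeding them to ML models that classify the viewer's immersion state. Below is a minimal illustrative sketch of such a pipeline, assuming hypothetical window-level gaze features (fixation duration, saccade amplitude, pupil diameter) and a generic scikit-learn classifier; the specific features, labels, and models used in the paper may differ.

```python
# Illustrative sketch (not the authors' released code): classifying immersion
# from window-level gaze features with a generic scikit-learn pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def gaze_features(window):
    """Summarize one time window of gaze samples into a feature vector.

    `window` is assumed to be a dict of per-window arrays; the feature
    names below are common gaze metrics, not necessarily the paper's set.
    """
    fix = np.asarray(window["fixation_durations"])  # seconds per fixation
    sac = np.asarray(window["saccade_amplitudes"])  # degrees per saccade
    pup = np.asarray(window["pupil_diameters"])     # millimetres per sample
    return np.array([
        fix.mean(), fix.std(),            # fixation duration statistics
        sac.mean(), sac.std(),            # saccade amplitude statistics
        len(sac) / window["duration_s"],  # saccade rate (per second)
        pup.mean(),                       # mean pupil diameter
    ])

def evaluate_immersion_detector(windows, labels):
    """Cross-validate a binary immersed / not-immersed classifier."""
    X = np.vstack([gaze_features(w) for w in windows])
    y = np.asarray(labels)  # 1 = immersed, 0 = not immersed (e.g., self-report)
    clf = make_pipeline(
        StandardScaler(),
        RandomForestClassifier(n_estimators=200, random_state=0),
    )
    # Mean F1 score over 5-fold cross-validation.
    return cross_val_score(clf, X, y, cv=5, scoring="f1").mean()
```

In practice, labels would come from questionnaire or interview responses aligned to video segments, and subject-wise splits (rather than plain k-fold) would be preferable to avoid leaking a participant's data across folds.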
Author(s)
Choi, Youjin; Kim, JooYeong; Hong, Jin-Hyuk
Issued Date
2022-10
Type
Article
DOI
10.1109/taffc.2022.3209311
URI
https://scholar.gist.ac.kr/handle/local/10568
Publisher
Institute of Electrical and Electronics Engineers
Citation
IEEE Transactions on Affective Computing, v.13, no.4, pp.1759-1770
ISSN
1949-3045
Appears in Collections:
Department of AI Convergence > 1. Journal Articles
Access & License
  • Access type: Open
File List
  • No related files are available.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.