OAK

Gaze-Head Input: Examining Potential Interaction with Immediate Experience Sampling in an Autonomous Vehicle

Abstract
Autonomous vehicles (AVs) increasingly allow drivers to engage in secondary tasks such as eating or working on a laptop, and thus require easy and reliable interaction inputs to facilitate communication between the driver and the vehicle. However, drivers report feeling less in control when driving is no longer the primary task, which suggests that novel approaches are needed for assessing satisfaction with AV decision-making. Therefore, we propose an immediate experience sampling method (IESM) that learns driver preferences for AV actions. We also suggest gaze-head input (G-HI) as a novel input modality in an AV. G-HI is hands-free, remote, and intuitive, allowing drivers to interact with the AV while continuing to engage in non-driving-related tasks. We compare G-HI with voice and touch inputs via IESM in two simulated driving scenarios. Our results report the differences among the three inputs in terms of system usability, reaction time, and perceived workload. They also reveal that G-HI is a promising candidate for AV input interaction, one that could substitute for voice or touch inputs in situations where those inputs cannot be utilized. Variation in driver satisfaction and expectations for AV actions confirms the effectiveness of using IESM to increase drivers' sense of control.
Author(s)
Ataya, Aya; Kim, Won; Elsharkawy, Ahmed; Kim, SeungJun
Issued Date
2020-12
Type
Article
DOI
10.3390/app10249011
URI
https://scholar.gist.ac.kr/handle/local/8746
Publisher
MDPI
Citation
Applied Sciences-Basel, v.10, no.24, pp.1 - 17
ISSN
2076-3417
Appears in Collections:
Department of AI Convergence > 1. Journal Articles
Access & License
  • Access status: Open
File List
  • No files are associated with this item.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.