
TimelyTale: A Multimodal Dataset Approach to Assessing Passengers’ Explanation Demands in Highly Automated Vehicles

Abstract
Explanations in automated vehicles enhance passengers’ understanding of vehicle decision-making, mitigating negative experiences by increasing their sense of control. These explanations help maintain situation awareness, even when passengers are not actively driving, and calibrate trust to match vehicle capabilities, enabling safe engagement in non-driving-related tasks. While design studies emphasize timing as a crucial factor affecting trust, machine learning practices for explanation generation primarily focus on content rather than delivery timing. This discrepancy could lead to mistimed explanations, causing misunderstandings or unnecessary interruptions. This gap is partly due to a lack of datasets capturing passengers’ real-world demands and experiences with in-vehicle explanations. We introduce TimelyTale, an approach that records passengers’ demands for explanations in automated vehicles. The dataset includes environmental, driving-related, and passenger-specific sensor data for context-aware explanations. Our machine learning analysis identifies proprioceptive and physiological data as key features for predicting passengers’ explanation demands, suggesting their potential for generating timely, context-aware explanations. The TimelyTale dataset is available at https://doi.org/10.7910/DVN/CQ8UB0. © 2024 Copyright held by the owner/author(s).
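The abstract frames explanation-demand prediction as a supervised classification problem over sensor features. The following is a minimal sketch of such a pipeline, not the authors' method: it assumes the dataset can be exported as a flat CSV, and the file name timelytale.csv, the label column demand, and all feature column names are hypothetical placeholders; consult the actual schema at https://doi.org/10.7910/DVN/CQ8UB0 before use.

# Hypothetical sketch: classify moments where a passenger demands an explanation
# from environmental, driving-related, and passenger-specific sensor features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# "timelytale.csv" and the "demand" column are placeholder names, not the
# dataset's published schema.
df = pd.read_csv("timelytale.csv")

X = df.drop(columns=["demand"])  # sensor features
y = df["demand"]                 # 1 = passenger requested an explanation

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

clf = RandomForestClassifier(n_estimators=300, random_state=42)
clf.fit(X_train, y_train)
print("F1:", f1_score(y_test, clf.predict(X_test)))

# Rank features by importance to check whether proprioceptive and
# physiological signals dominate, as the abstract reports.
for name, imp in sorted(zip(X.columns, clf.feature_importances_),
                        key=lambda t: -t[1])[:10]:
    print(f"{name}: {imp:.3f}")

A tree-ensemble classifier is used here only because its built-in feature importances make the abstract's feature-ranking claim easy to inspect; any classifier paired with a feature-attribution method would serve the same illustrative purpose.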
Author(s)
Kim, Gwangbin; Hwang, Seokhyun; Seong, Minwoo; Yeo, Dohyeon; Rus, Daniela; Kim, Seungjun
Issued Date
2024-09
Type
Article
DOI
10.1145/3678544
URI
https://scholar.gist.ac.kr/handle/local/9356
Publisher
Association for Computing Machinery
Citation
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, v.8, no.3
ISSN
2474-9567
Appears in Collections:
Department of AI Convergence > 1. Journal Articles
Access and License
  • Access status: Open
File List
  • No associated files are available.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.