OAK

Multi-modal interruptions on primary task performance

Metadata
Author(s)
Bovard, Pooja P.; Sprehn, Kelly A.; Cunha, Meredith G.; Chun, Jaemin; Kim, SeungJun; Schwartz, Jana L.; Garver, Sara K.; Dey, Anind K.
Type
Conference Paper
Citation
12th International Conference on Augmented Cognition, AC 2018, Held as Part of HCI International 2018, pp. 3-14
Issued Date
2018-07
Abstract
In this paper, we investigate a range of multi-modal displays (visual, auditory, haptic) to understand how interruptions in different modalities affect response times. Understanding these effects is particularly relevant in complex tasks that demand perceptual attention, such as driving, where pertinent information must be delivered to the user. Multi-modal signal presentation, grounded in the Multiple Resource Theory framework, is a potential solution. To explore it, we conducted a study in which participants perceived and responded to a secondary task while performing visual, auditory, and haptic vigilance tasks during a driving scenario. We analyzed response times, errors, misses, and subjective responses; the results indicate that haptic interruptions of a primarily haptic task elicit the fastest responses, and that visual interruptions are not the preferred modality in a driving scenario. These results let us define logic for a context-based framework that better determines how to deliver incoming information in a driving scenario.
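The abstract's closing point about context-based delivery logic could be sketched as a simple rule-based modality selector. This is a minimal illustrative sketch, not the authors' system: the function name and specific rules are assumptions, loosely based on the two reported findings (haptic interruptions of a haptic task drew the fastest responses; visual interruptions were dispreferred while driving).

```python
def choose_notification_modality(primary_task_modality: str, driving: bool) -> str:
    """Pick a modality for an incoming interruption.

    Hypothetical rules inspired by the study's findings; real delivery
    logic would also weigh urgency, workload, and driving context.
    """
    if driving and primary_task_modality == "visual":
        # Avoid competing for visual attention while the eyes are on the road.
        return "auditory"
    if primary_task_modality == "haptic":
        # Haptic interruptions of a primarily haptic task were responded to fastest.
        return "haptic"
    # Default fallback for other combinations.
    return "auditory"
```

For example, an interruption arriving during a visual driving task would be routed to the auditory channel rather than adding to the visual load.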
Publisher
Springer-Verlag
Conference Place
US
URI
https://scholar.gist.ac.kr/handle/local/8499
Access and License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.