Seeing and Sensing Intention: Resolving Midas Touch with Vision-Motion Fusion in Mixed Reality
- Author(s)
- Lee, Daeho; Lee, Chungha; Hong, Jin-hyuk
- Type
- Conference Paper
- Citation
- 38th Annual ACM Symposium on User Interface Software and Technology, UIST 2025
- Issued Date
- 2025-09-28
- Abstract
- The Midas Touch problem in XR (Extended Reality) environments is a significant factor that can degrade user experience. Vision-based interaction often leads to false alarms, in which unintended user gestures are recognized as input. Particularly in MR environments, users interact with virtual and real objects alternately, posing a risk that gestures intended for real objects are misrecognized as intentions toward virtual objects. To analyze and address the Midas Touch problem, we collected two kinds of data during interactions across multiple MR environments: (1) false alarm data from interactions with real objects and (2) true pinch data from interactions with virtual objects. We present a multimodal model and analysis to understand the Midas Touch problem in MR environments.
- Publisher
- Association for Computing Machinery, Inc
- Conference Place
- Busan, Republic of Korea
- URI
- https://scholar.gist.ac.kr/handle/local/32304
- Access & License
-
- File List
-
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.