A mixed reality-based remote collaboration framework using improved pose estimation

Author(s)
Oh, Inyoung; Jang, Gilsang; Song, Jinho; Son, Moongu; Kim, Daewoon; Yun, Junsang; Ko, Kwanghee
Type
Article
Citation
Computers in Industry, v.174
Issued Date
2026-01
Abstract
Mixed Reality (MR) technology integrates digital content with the real world to enable a cohesive user experience. Accurate pose estimation is crucial for aligning virtual content with physical surroundings, ensuring the virtual elements appear naturally in the user’s environment. This paper proposes a learning-based approach for accurate pose estimation using a monocular RGB (Red-Green-Blue) image, eliminating the need for markers and depth sensors. The method leverages YOLO6D (You Only Look Once Six-Dimensional) and a RoI (Region of Interest)-based color augmentation technique combined with Principal Component Analysis to enhance the accuracy of 6-DoF (Degrees of Freedom) pose estimation, while mitigating the effects of background variations and lighting changes. The proposed pose estimation method is incorporated into an MR-based remote collaboration framework, ensuring consistent and robust information rendering onto target objects across various devices. This integration enhances the reliability and effectiveness of MR-based remote collaboration. Experimental results demonstrate the superior performance of the proposed method, establishing it as a strong foundation for future MR-based remote collaboration frameworks. © 2025 Published by Elsevier B.V.
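The abstract's RoI-based color augmentation combined with Principal Component Analysis can be illustrated with a minimal sketch in the spirit of classic PCA color jitter, restricted to a bounding box. The paper's actual implementation details are not given here; the function and parameter names below (`pca_color_augment_roi`, `scale`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

def pca_color_augment_roi(image, roi, scale=0.1, rng=None):
    """Sketch: PCA-based color jitter applied only inside a region of interest.

    image: HxWx3 uint8 RGB image
    roi:   (x, y, w, h) bounding box of the target object
    scale: standard deviation of the random component weights (assumed)
    """
    rng = rng if rng is not None else np.random.default_rng()
    x, y, w, h = roi
    patch = image[y:y + h, x:x + w].astype(np.float64) / 255.0

    # PCA over the RGB values of the RoI pixels only, so the color shift
    # follows the target object's own color statistics rather than the
    # (possibly cluttered) background's.
    pixels = patch.reshape(-1, 3)
    pixels_centered = pixels - pixels.mean(axis=0)
    cov = np.cov(pixels_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)

    # Random shift along the principal color axes, weighted by eigenvalues.
    alphas = rng.normal(0.0, scale, size=3)
    shift = eigvecs @ (alphas * eigvals)

    # Write the jittered patch back; pixels outside the RoI are untouched.
    out = image.copy()
    out[y:y + h, x:x + w] = np.clip((patch + shift) * 255.0, 0, 255).astype(np.uint8)
    return out
```

Restricting the statistics to the RoI is one plausible reading of the abstract's motivation: it simulates lighting variation on the object itself while leaving background pixels unchanged during training-data generation.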
Publisher
Elsevier B.V.
ISSN
0166-3615
DOI
10.1016/j.compind.2025.104414
URI
https://scholar.gist.ac.kr/handle/local/32343
Access and License
  • Access type: Open
File List
  • No associated files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.