
Multiple Projector Camera Calibration by Fiducial Marker Detection

Abstract
Projection mapping has been used for various purposes in everyday situations. A key step in projection mapping is to project images and videos without distortion through calibration, which is typically performed manually. Calibration becomes more challenging when multiple projectors are involved. To address this issue, a fully automated calibration method for a multi-projector-camera system is proposed in this paper. The projectors and cameras are assumed to be uncalibrated, and a projection surface of arbitrary geometric shape is considered. Without using checkerboards or user-provided parameters, the proposed method can automatically estimate calibration parameters for the cameras and projectors and generate compensated content for distortion-free projection within a reasonable amount of time. The proposed method utilizes AprilTag markers and a modified YOLOv8 with deformable convolution for robust marker detection and correspondence estimation between the projectors and cameras, providing an automatic process for completing calibration and distortion correction. Various experiments have demonstrated that the proposed method outperforms existing checkerboard-based methods in terms of calibration accuracy and processing time across various camera-projector configurations. The proposed method can minimize the difficulty of projection mapping, allowing it to be used in everyday situations without requiring specialized knowledge of projection mapping theory and related hardware.
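The abstract describes establishing correspondences between projector and camera views via detected AprilTag markers. The paper's own pipeline is not reproduced here, but the core geometric step behind such correspondence-based calibration, fitting a planar homography from four or more matched marker corners, can be sketched with the standard DLT algorithm in NumPy (function name and test values below are illustrative, not from the paper):

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (dst ~ H @ src in
    homogeneous coordinates) via the Direct Linear Transform (DLT).
    src, dst: (N, 2) arrays of corresponding 2D points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the constraint A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: the last right-singular vector.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity

# Illustrative check: recover a known homography from synthetic marker corners.
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.05, 0.9, -3.0],
                   [5e-4, 2e-4, 1.0]])
src = np.array([[0, 0], [100, 0], [100, 100], [0, 100], [50, 25]], dtype=float)
src_h = np.c_[src, np.ones(len(src))] @ H_true.T
dst = src_h[:, :2] / src_h[:, 2:]
H_est = estimate_homography(src, dst)
```

In a real projector-camera system this step would be applied per surface plane, using corners reported by an AprilTag detector in the camera image and the known marker positions in the projected image; non-planar surfaces require more general models than a single homography.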
Author(s)
Son, Moongu; Ko, Kwanghee
Issued Date
2023-07
Type
Article
DOI
10.1109/ACCESS.2023.3299857
URI
https://scholar.gist.ac.kr/handle/local/10115
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation
IEEE ACCESS, v.11, pp.78945-78955
ISSN
2169-3536
Appears in Collections:
Department of Mechanical and Robotics Engineering > 1. Journal Articles
Access & License
  • Access status: Open
File List

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.