
DenSFA-PU: Learning to unwrap phase in severe noisy conditions

Abstract
In optics, phase measurement techniques face challenges because measured phase values are confined to a 2π interval, giving rise to the phase unwrapping problem. Many methods, including deep learning-based approaches, have been proposed to address this issue. However, high noise in a wrapped phase image often causes these techniques to fail, resulting in error accumulation and long computation times. To overcome these challenges, we propose a robust and fast deep learning-based method called DenSFA-PU (Densely Connected Spatial Feature Aggregator for Phase Unwrapping), which treats phase unwrapping as a regression task. Our approach uses an encoder-decoder architecture with densely connected neural networks and a spatial feature aggregator module for noise reduction and robust feature representation. Comparative analysis on both synthetic data and real-world data obtained through digital holography demonstrates that our method outperforms existing techniques, achieving greater computational efficiency with an average unwrapping time of 29.31 ms, significantly faster than other methods. It also shows superior accuracy, with consistently good NRMSE, PSNR, and SSIM values across all cases, highlighting its robustness in handling highly noisy wrapped phase images. Additionally, its ability to operate with minimal training data makes it highly suitable for applications requiring fast and accurate phase unwrapping with a limited dataset. © 2025 The Authors
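
For readers unfamiliar with the setup the abstract describes, the sketch below illustrates phase unwrapping posed as an image-to-image regression problem: a smooth phase surface is wrapped into the (−π, π] interval, corrupted with noise, and a small convolutional encoder-decoder with dense (concatenative) skip connections is trained to regress the continuous phase. All names (wrap, DenseBlock, ToyUnwrapNet), the Gaussian noise model, and the toy network layers are illustrative assumptions; this is not the authors' DenSFA-PU architecture, whose details are given in the full paper.

```python
# Minimal sketch: phase unwrapping as image-to-image regression.
# The network is a generic toy encoder-decoder with dense skip
# connections, NOT the authors' DenSFA-PU model.
import numpy as np
import torch
import torch.nn as nn

def wrap(phi):
    """Wrap an unwrapped phase map into the principal interval (-pi, pi]."""
    return np.angle(np.exp(1j * phi))

# Synthetic training pair: a smooth "true" phase surface and its wrapped,
# noise-corrupted counterpart (additive Gaussian noise is an assumption).
y, x = np.mgrid[0:128, 0:128] / 128.0
true_phase = 12.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.08)
wrapped = wrap(true_phase + np.random.normal(0.0, 0.6, true_phase.shape))

class DenseBlock(nn.Module):
    """Two conv layers whose outputs are concatenated with the input
    (dense connectivity in the DenseNet sense)."""
    def __init__(self, ch):
        super().__init__()
        self.c1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.c2 = nn.Conv2d(2 * ch, ch, 3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        f1 = self.act(self.c1(x))
        f2 = self.act(self.c2(torch.cat([x, f1], dim=1)))
        return torch.cat([x, f1, f2], dim=1)

class ToyUnwrapNet(nn.Module):
    """Encoder-decoder regressor: wrapped phase in, continuous phase out."""
    def __init__(self, ch=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            DenseBlock(ch), nn.MaxPool2d(2))
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(3 * ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1))

    def forward(self, x):
        return self.dec(self.enc(x))

net = ToyUnwrapNet()
inp = torch.from_numpy(wrapped).float()[None, None]
tgt = torch.from_numpy(true_phase).float()[None, None]
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(net(inp), tgt)  # regression loss on the phase map
loss.backward()
opt.step()
print(f"single-step MSE on the toy pair: {loss.item():.3f}")
```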
Author(s)
Awais, Muhammad; Yoon, Taeil; Hwang, Chi-Ok; Lee, Byeongha
Issued Date
2025-09
Type
Article
DOI
10.1016/j.optlastec.2025.112757
URI
https://scholar.gist.ac.kr/handle/local/8949
Publisher
Elsevier Ltd
Citation
Optics and Laser Technology, v.187
ISSN
0030-3992
Appears in Collections:
Department of Electrical Engineering and Computer Science > 1. Journal Articles
Department of Mathematical Sciences > 1. Journal Articles
Disclosure and License
  • Disclosure status: Public
File List
  • No related files exist.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.