OAK

Spectral-invariant matching network

Metadata
Author(s)
Ko, Y.; Jang, Y.-J.; Dinh, V.Q.; Jeon, H.-G.; Jeon, M.
Type
Article
Citation
Information Fusion, v.91, pp. 623–632
Issued Date
2023-03
Abstract
As the need for sensor fusion systems has grown, finding correspondences between images with different spectral ranges has become increasingly important. Since such images rarely share low-level information, such as textures and edges, existing matching approaches fail even with convolutional neural networks (CNNs). In this paper, we propose an end-to-end metric learning method, called SPIMNet (SPectral-Invariant Matching Network), for robust cross- and multi-spectral image patch matching. While existing CNN-based methods learn matching features directly from cross- and multi-spectral image patches, SPIMNet transforms features across spectral bands and then discriminates similarity in three steps. First, (1) SPIMNet aligns the input to a common feature domain via a domain translation network; then (2) two Siamese networks learn to match the adjusted features within the same spectral domain; and (3) the matching features are fed to fully-connected layers that decide the identity of the patches as a classification task. By effectively combining these steps, SPIMNet achieved competitive results on a variety of challenging datasets, including both VIS–NIR and VIS–Thermal image pairs. Our code is available at https://github.com/koyeongmin/SPIMNet.
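The three-step pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked GitHub repository for that); all weights, dimensions, and function names below are hypothetical stand-ins, with simple linear maps playing the role of the translation network, the shared Siamese encoder, and the fully-connected classifier:

```python
import numpy as np

rng = np.random.default_rng(0)

D = 64                                     # assumed patch-feature dimension
W_trans = rng.normal(size=(D, D)) * 0.1    # step 1: domain translation (stand-in)
W_feat  = rng.normal(size=(D, D)) * 0.1    # step 2: shared Siamese encoder (stand-in)
w_cls   = rng.normal(size=(2 * D,)) * 0.1  # step 3: fully-connected classifier (stand-in)

def translate(x_other):
    # Step 1: map the NIR/thermal patch into a VIS-like feature domain.
    return np.tanh(x_other @ W_trans)

def encode(x):
    # Step 2: shared-weight (Siamese) encoder applied to both patches.
    return np.tanh(x @ W_feat)

def match_score(x_vis, x_other):
    # Step 3: concatenate both embeddings and classify match vs. non-match.
    f_vis = encode(x_vis)
    f_other = encode(translate(x_other))
    logit = np.concatenate([f_vis, f_other]) @ w_cls
    return 1.0 / (1.0 + np.exp(-logit))    # sigmoid -> match probability

vis_patch = rng.normal(size=(D,))
nir_patch = rng.normal(size=(D,))
p = match_score(vis_patch, nir_patch)
print(0.0 <= p <= 1.0)  # sigmoid output always lies in [0, 1]
```

The key structural point mirrored here is that the encoder weights are shared between both branches (the Siamese constraint), while only the non-VIS branch passes through the translation step first.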
Publisher
Elsevier BV
ISSN
1566-2535
DOI
10.1016/j.inffus.2022.10.033
URI
https://scholar.gist.ac.kr/handle/local/10330
Access & License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.