OAK

Continual Learning for Simultaneous Localization and Mapping

Abstract
Simultaneous localization and mapping (SLAM) is essential for a wide range of autonomous robot systems. Tremendous effort has been devoted to SLAM using cameras or LiDAR, and traditional methods are designed around sparse features or dense maps. With advances in deep learning, learning-based SLAM has been proposed; however, deep learning methods suffer from catastrophic forgetting, and sensor fusion is difficult to apply to them. To address these problems, this thesis proposes continual learning for SLAM with a sensor fusion method. The 3D LiDAR point cloud is converted into a 2D depth map and concatenated with RGB images. DepthNet and PoseNet estimate the pose from the RGB-D frames and IMU data, and each estimate is saved as a graph node. To improve adaptation performance, online meta-learning and a replay buffer are applied. A feature is extracted from every frame by LoopNet, and cosine similarity between features is computed for loop closure detection, after which pose graph optimization is conducted. The KITTI and Oxford RobotCar datasets are used for experiments. The proposed method not only improves adaptation performance but is also more accurate than some state-of-the-art methods.
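The loop-closure step described above compares per-frame feature vectors by cosine similarity. A minimal sketch of that comparison follows; the function names, the flat NumPy feature vectors, and the 0.9 threshold are illustrative assumptions, not details taken from the thesis:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def detect_loop_candidates(query_feat, keyframe_feats, threshold=0.9):
    """Return indices of keyframes whose feature similarity to the
    query frame exceeds the threshold (loop-closure candidates)."""
    return [i for i, f in enumerate(keyframe_feats)
            if cosine_similarity(query_feat, f) >= threshold]
```

In a full pipeline, each returned index would trigger a relative-pose constraint between the query node and the matched keyframe node before pose graph optimization.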
Author(s)
Hyeonsoo Jang
Issued Date
2023
Type
Thesis
URI
https://scholar.gist.ac.kr/handle/local/19033
Access and License
  • Access: Open
File List
  • No related files are available.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.