OAK

Event-based Stereo Depth Estimation

Metadata
Author(s)
Yeongwoo Nam
Type
Thesis
Degree
Master
Department
Graduate School, School of Electrical Engineering and Computer Science
Advisor
Choi, Jonghyun
Abstract
Neuromorphic cameras, or event cameras, mimic human vision by reporting changes in intensity in a scene, rather than capturing the whole scene at once as a single frame, as conventional cameras do. Events arrive as a stream that becomes dense whenever the scene changes or the camera moves rapidly; during such rapid motion, events can be overwritten or missed when the stream is converted into a tensor for learning. Here, we propose to learn to concentrate on the dense events to produce a sharp, compact event representation with high detail for depth estimation. Specifically, we train a model on events from both the past and the future, but at inference time use only past data together with a predicted future. We first estimate depth in an event-only setting, and then further incorporate images and events through a hierarchical event and intensity combination network to predict higher-quality depth. Through experiments in challenging real-world scenarios, we validate the superiority of our method over prior art.
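The abstract refers to converting an event stream into a tensor for learning. The thesis's own representation is not detailed here, but a common baseline encoding for event cameras is a voxel grid: each event (timestamp, pixel coordinates, polarity) is spread across two temporal bins with bilinear weights, so rapid bursts of events accumulate rather than overwrite one another. The sketch below illustrates that generic encoding; the function name and interface are hypothetical, not taken from the thesis.

```python
import numpy as np

def events_to_voxel_grid(t, x, y, p, num_bins, height, width):
    """Accumulate events into a (num_bins, height, width) tensor.

    t, x, y, p are 1-D arrays: timestamps, pixel coordinates, and
    polarities. Each event is split between its two nearest temporal
    bins with bilinear weights, so no event is simply dropped.
    This is a generic baseline encoding, not the thesis's method.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if t.size == 0:
        return voxel
    # Normalize timestamps to the continuous bin axis [0, num_bins - 1].
    t_norm = (num_bins - 1) * (t - t[0]) / max(float(t[-1] - t[0]), 1e-9)
    pol = np.where(p > 0, 1.0, -1.0)  # signed polarity contribution
    left = np.floor(t_norm).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t_norm - left           # weight toward the later bin
    w_left = 1.0 - w_right            # weight toward the earlier bin
    # Scatter-add so multiple events hitting the same cell accumulate.
    np.add.at(voxel, (left, y, x), w_left * pol)
    np.add.at(voxel, (right, y, x), w_right * pol)
    return voxel
```

Using `np.add.at` (rather than plain fancy-indexed assignment) is what makes repeated events at the same pixel and bin sum up instead of overwriting, which is exactly the failure mode the abstract describes for dense event streams.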
URI
https://scholar.gist.ac.kr/handle/local/19265
Fulltext
http://gist.dcollection.net/common/orgView/200000884825
Access and License
  • Access type: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.