Person Re-Identification and Model Compression for Visual Surveillance Systems
- Author(s)
- Hyunguk Choi
- Type
- Thesis
- Degree
- Doctor
- Department
- School of Electrical Engineering and Computer Science, Graduate School
- Advisor
- Jeon, Moongu
- Abstract
- Multi-target tracking across a non-overlapping camera network is an active research field, and person re-identification is one of its key problems.
For person re-identification, we propose a method that improves the performance of a backbone model. Our method focuses on training a fusion model together with a shallow model, and on mining hard triplets quickly and efficiently using relationship matrices.
The proposed method is simple, yet it improves the performance of the backbone. In addition, the hard triplet mining in our process is much faster than the conventional method. Experimental evaluation, including result tables and feature visualizations, shows that the proposed method improves the performance of the backbone model.
The proposed method improves rank-1 and mAP performance by more than 12.54% and 15.44%, respectively, over the backbone models on the Market1501 and DukeMTMC-reID datasets. It also achieves competitive performance compared with state-of-the-art methods.
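The thesis's relationship-matrix mining is not spelled out in this abstract, but the general idea of selecting hard triplets directly from a pairwise matrix rather than enumerating all triplets can be sketched as follows. This is a generic batch-hard mining sketch in NumPy, not the author's exact algorithm; the function name and the assumption that every identity has at least two samples in the batch are illustrative.

```python
import numpy as np

def batch_hard_triplets(features, labels):
    """For each anchor, pick the hardest positive (farthest same-label
    sample) and hardest negative (closest different-label sample)
    directly from the pairwise distance matrix, instead of scoring
    every possible triplet individually.

    Assumes each identity appears at least twice in the batch.
    """
    # Pairwise Euclidean distance matrix, shape (n, n).
    diff = features[:, None, :] - features[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    # Boolean matrix: same[i, j] is True when i and j share a label.
    same = labels[:, None] == labels[None, :]
    n = len(labels)
    pos_idx = np.empty(n, dtype=int)
    neg_idx = np.empty(n, dtype=int)
    for i in range(n):
        pos_mask = same[i].copy()
        pos_mask[i] = False  # the anchor is not its own positive
        candidates = np.where(pos_mask)[0]
        pos_idx[i] = candidates[np.argmax(dist[i][pos_mask])]
        candidates = np.where(~same[i])[0]
        neg_idx[i] = candidates[np.argmin(dist[i][~same[i]])]
    return pos_idx, neg_idx
```

Because the mining reduces to row-wise argmax/argmin over one n-by-n matrix, it scales with the batch size squared rather than cubed, which is the usual reason matrix-based mining is much faster than exhaustive triplet search.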
Deep neural networks perform well, but deploying them in real-world environments requires high-performance hardware.
Knowledge distillation is a well-known approach to this problem.
Knowledge distillation is a simple method for improving the performance of a small network by using the knowledge of a large, complex network.
The small and large networks are referred to as the student and teacher models, respectively.
Previous knowledge distillation methods perform well with relatively small teacher networks (20-30 layers) but poorly with large teacher networks (50 layers).
Here, we propose a method called block change learning that performs local and global knowledge distillation by changing blocks composed of layers.
The method focuses on transferring knowledge without losing information in a large teacher model, as it considers intra-relationships between layers using local knowledge distillation and inter-relationships between corresponding blocks.
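The block change learning procedure itself is not detailed in this abstract; what can be sketched is the standard soft-target distillation loss that such teacher-student methods build on. The sketch below is the classic temperature-softened KL divergence (Hinton-style), not the thesis's block-level variant, and the function names and temperature value are illustrative.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T yields a softer distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the softened teacher and student distributions.

    The T**2 factor keeps the gradient magnitude roughly constant as the
    temperature changes, so the soft-target term can be balanced against
    a hard-label cross-entropy term.
    """
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's softened predictions
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()
```

The loss is zero when the student reproduces the teacher's logits exactly and grows as the two output distributions diverge; block-level schemes apply a similar matching loss to intermediate representations rather than only to the final logits.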
- URI
- https://scholar.gist.ac.kr/handle/local/32942
- Fulltext
- http://gist.dcollection.net/common/orgView/200000907916
- Access & License
-
- File List
-
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.