
Rethinking Softmax Cross Entropy for Imbalanced Image Classification

Author(s)
Yechan Kim
Type
Thesis
Degree
Master
Department
Graduate School of Electrical Engineering and Computer Science
Advisor
Jeon, Moongu
Abstract
Deep learning models have recently achieved great success in computer vision applications, relying on large-scale, class-balanced datasets. However, imbalanced class distributions still limit the broad applicability of these models because of the resulting degradation in performance. To address this problem, this thesis studies softmax cross entropy, which largely ignores the output scores of incorrect classes. This work finds that neutralizing the predicted probabilities of incorrect classes improves prediction accuracy for imbalanced image classification. Based on this finding, the thesis proposes a simple but effective loss named complement cross entropy. The proposed loss makes the ground-truth class dominate the other classes in terms of softmax probability by neutralizing the probabilities of incorrect classes, without any additional training procedures. In addition, the loss encourages models to learn key information, especially from samples of minority classes, which leads to more accurate and robust classification on imbalanced distributions. Extensive experiments on imbalanced datasets demonstrate the effectiveness of the proposed method.
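
To make the idea concrete, below is a minimal PyTorch-style sketch of a loss in the spirit of complement cross entropy, based only on the abstract's description: standard cross entropy plus a term that flattens ("neutralizes") the softmax probabilities of the incorrect classes. The function name complement_cross_entropy, the hyperparameter gamma, and the exact scaling are illustrative assumptions, not necessarily the thesis's exact formulation.

    import torch
    import torch.nn.functional as F

    def complement_cross_entropy(logits, targets, gamma=-1.0):
        """Sketch of a complement cross entropy-style loss.

        Adds gamma times the entropy of the softmax distribution
        restricted (and renormalized) to the incorrect classes; with
        gamma < 0, minimizing the loss *maximizes* that entropy,
        i.e. flattens the incorrect-class probabilities.
        """
        num_classes = logits.size(1)
        probs = F.softmax(logits, dim=1)                   # (N, K)
        eps = 1e-7

        # Probability assigned to the ground-truth class, per sample.
        p_true = probs.gather(1, targets.unsqueeze(1))     # (N, 1)

        # Renormalize incorrect-class probabilities: p_j / (1 - p_g).
        comp = probs / (1.0 - p_true + eps)

        # Zero out the ground-truth entry so only incorrect classes count.
        mask = torch.ones_like(probs).scatter_(1, targets.unsqueeze(1), 0.0)
        comp = comp * mask

        # Entropy of the complement distribution, averaged over the batch
        # and scaled by 1 / (K - 1).
        complement_entropy = -(comp * torch.log(comp + eps)).sum(dim=1).mean()
        complement_entropy = complement_entropy / (num_classes - 1)

        ce = F.cross_entropy(logits, targets)
        return ce + gamma * complement_entropy

With gamma < 0, the added term rewards a flat (near-uniform) distribution over the incorrect classes, so the ground-truth class stands out in softmax probability, matching the behavior the abstract describes.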
URI
https://scholar.gist.ac.kr/handle/local/33384
Fulltext
http://gist.dcollection.net/common/orgView/200000905445
Access & License
  • Access type: Open
File List
  • No associated files.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.