AMA: Asymptotic Midpoint Augmentation for Margin Balancing and Moderate Broadening
- Abstract
- Feature augmentation in neural networks is an effective regularization method for adjusting the margin in feature space. However, a related approach that directly repositions features, contrastive learning, has been reported to suffer from inter-class and intra-class feature collapse. Feature augmentation methods are also related to these issues but have rarely been analyzed in this respect. In this paper, we show that feature augmentation methods are also affected by the collapse problems and address them by proposing a novel method, called asymptotic midpoint augmentation (AMA), which generates augmented features that gradually approach the midpoint of inter-class feature pairs. The method induces two effects: 1) balancing the margin across all classes and 2) broadening the margin only moderately, until it reaches maximal confidence. We empirically analyze alignment and uniformity to show vulnerability to these problems in a toy task. We then validate its impact on original, long-tailed, and coarse-to-fine transfer tasks on CIFAR-10 and CIFAR-100. To enhance generality, we additionally analyze its relation to a representative input-level augmentation, Mixup.
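The core operation the abstract describes, augmented features that gradually approach the midpoint of an inter-class feature pair, can be sketched as follows. This is a minimal illustration assuming linear interpolation toward the midpoint with a scheduled coefficient; the function name, the coefficient schedule, and the interpolation form are assumptions for illustration, not taken from the thesis itself.

```python
import numpy as np

def ama_augment(f_a, f_b, lam):
    """Move feature f_a toward the midpoint of the inter-class pair (f_a, f_b).

    lam in [0, 1]: 0 returns f_a unchanged; 1 returns the exact midpoint.
    In an asymptotic scheme, lam would grow over training so augmented
    features approach the midpoint gradually (the schedule here is a
    hypothetical choice, not the one used in the thesis).
    """
    midpoint = 0.5 * (f_a + f_b)
    return (1.0 - lam) * f_a + lam * midpoint

# Example: two 2-D features from different classes.
f_a = np.array([1.0, 0.0])
f_b = np.array([0.0, 1.0])

print(ama_augment(f_a, f_b, 0.0))  # -> [1. 0.]  (no augmentation)
print(ama_augment(f_a, f_b, 1.0))  # -> [0.5 0.5] (exact midpoint)
```

Because the augmented feature sits between the two class clusters, training on it with the original label plausibly widens the decision margin symmetrically for both classes, which matches the abstract's stated goal of margin balancing.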
- Author(s)
- Semi Lee
- Issued Date
- 2023
- Type
- Thesis
- URI
- https://scholar.gist.ac.kr/handle/local/18858
- Access & License
-
- File List
-
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.