Activation by Interval-wise Dropout: A Simple Way to Prevent Neural Networks from Plasticity Loss
- Author(s)
- Park, Sangyeon; Han, Isaac; Oh, Seungwon; Kim, Kyungjoong
- Type
- Conference Paper
- Citation
- 42nd International Conference on Machine Learning, ICML 2025, pp. 47991–48026
- Issued Date
- 2025-07-19
- Abstract
- Plasticity loss, a critical challenge in neural network training, limits a model’s ability to adapt to new tasks or shifts in data distribution. This paper introduces AID (Activation by Interval-wise Dropout), a novel method inspired by Dropout, designed to address plasticity loss. Unlike Dropout, AID generates subnetworks by applying Dropout with a different probability to each preactivation interval. Theoretical analysis reveals that AID regularizes the network, promoting behavior analogous to that of deep linear networks, which do not suffer from plasticity loss. We validate the effectiveness of AID in maintaining plasticity across various benchmarks, including continual learning tasks on standard image classification datasets such as CIFAR10, CIFAR100, and TinyImageNet. Furthermore, we show that AID enhances reinforcement learning performance in the Arcade Learning Environment benchmark. © 2025, ML Research Press. All rights reserved.
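- The abstract describes AID as applying Dropout with an interval-dependent probability to each pre-activation value, so that the dropout mask itself acts as the activation. The sketch below illustrates only that idea; the interval boundaries, the per-interval drop probabilities, and the omission of any rescaling of surviving units are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of interval-wise dropout on pre-activations, based only on the
# abstract. Boundaries and per-interval drop probabilities are assumed values
# chosen for illustration.
import torch

def interval_wise_dropout(preact: torch.Tensor,
                          boundaries=(0.0,),       # assumed: one split at zero
                          drop_probs=(0.9, 0.1),   # assumed: one prob per interval
                          training: bool = True) -> torch.Tensor:
    """Zero each pre-activation with a probability that depends on the
    interval its value falls into; pass the surviving values through."""
    if not training:
        return preact
    # Assign each element to an interval index using the boundary list.
    interval_idx = torch.bucketize(
        preact, torch.tensor(boundaries, device=preact.device))
    # Look up the drop probability for each element's interval.
    p = torch.tensor(drop_probs, device=preact.device)[interval_idx]
    # Keep each element independently with probability 1 - p.
    keep_mask = (torch.rand_like(preact) >= p).to(preact.dtype)
    return preact * keep_mask

# Usage: applied in place of a fixed nonlinearity inside a forward pass,
# e.g. h = interval_wise_dropout(linear(x)).
x = torch.randn(4, 8)
h = interval_wise_dropout(x)
```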
- Publisher
- ML Research Press
- Conference Place
- Vancouver, Canada
- URI
- https://scholar.gist.ac.kr/handle/local/32430
- Access and License
-
- File List
-
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.