OAK

Activation by Interval-wise Dropout: A Simple Way to Prevent Neural Networks from Plasticity Loss

Metadata
Author(s)
Sangyeon Park
Type
Thesis
Degree
Master
Department
College of Information and Computing, Department of AI Convergence
Advisor
Kim, KyungJoong
Abstract
Plasticity loss, a critical challenge in neural network training, limits a model's ability to adapt to new tasks or shifts in data distribution. This paper introduces AID (Activation by Interval-wise Dropout), a novel method inspired by Dropout, designed to address plasticity loss. Unlike Dropout, AID generates subnetworks by applying Dropout with a different probability to each preactivation interval. Theoretical analysis reveals that AID regularizes the network, promoting behavior analogous to that of deep linear networks, which do not suffer from plasticity loss. We validate the effectiveness of AID in maintaining plasticity across various benchmarks, including continual learning tasks on standard image classification datasets such as CIFAR10, CIFAR100, and TinyImageNet. Furthermore, we show that AID enhances reinforcement learning performance in the Arcade Learning Environment benchmark.
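The core idea in the abstract — dropout applied with a different probability per preactivation interval — can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the thesis's implementation: it uses just two intervals (negative vs. nonnegative preactivations), and the parameter names `p_neg` and `p_pos` are hypothetical; the actual method may partition the preactivation range more finely and parameterize the probabilities differently.

```python
import random

def aid_activation(z, p_neg=0.9, p_pos=0.1, rng=None):
    """Toy interval-wise dropout on a list of preactivations.

    Sketch only: two intervals (x < 0 and x >= 0), each with its own drop
    probability; survivors are rescaled as in inverted dropout. The thesis
    may use more intervals and a different parameterization.
    """
    rng = rng or random.Random()
    out = []
    for x in z:
        p_drop = p_neg if x < 0 else p_pos  # interval-dependent drop probability
        if p_drop >= 1.0 or rng.random() < p_drop:
            out.append(0.0)                 # element dropped
        else:
            out.append(x / (1.0 - p_drop))  # inverted-dropout rescaling
    return out
```

Note the limiting cases: with `p_neg=1.0` and `p_pos=0.0` every negative preactivation is zeroed and every nonnegative one passes through unchanged, recovering a ReLU; with `p_neg == p_pos` the per-interval distinction disappears and the rule behaves like ordinary dropout on the preactivations.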
URI
https://scholar.gist.ac.kr/handle/local/33667
Fulltext
http://gist.dcollection.net/common/orgView/200000946311
Alternative Author(s)
박상연
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Availability and License
  • Access: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.