Revisiting Dropout: Escaping Pressure for Training Neural Networks with Multiple Costs
- Abstract
- A common approach to jointly learning multiple tasks with a shared structure is to optimize the model over a combined landscape of multiple sub-costs. However, gradients derived from each sub-cost often conflict on cost plateaus, resulting in a subpar optimum. In this work, we shed light on this gradient-conflict challenge and propose a solution named Cost-Out, which randomly drops sub-costs at each iteration. We provide theoretical and empirical evidence for the existence of escaping pressure induced by the Cost-Out mechanism. While simple, the empirical results indicate that the proposed method can enhance performance on multi-task learning problems, including two-digit image classification sampled from the MNIST dataset and machine translation between English and French, Spanish, and German on WMT14 datasets.
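- The abstract describes Cost-Out as randomly dropping sub-costs at each training iteration. A minimal sketch of that idea, assuming a setting where per-task gradients are already computed (the function name, `keep_prob` parameter, and renormalization choice are illustrative, not from the paper):

```python
import numpy as np

def cost_out_step(params, grads, keep_prob=0.5, lr=0.1, rng=None):
    """One gradient step that randomly drops each sub-cost's gradient
    with probability 1 - keep_prob, then averages the survivors
    (a sketch of the Cost-Out mechanism; details are hypothetical)."""
    rng = rng or np.random.default_rng()
    mask = rng.random(len(grads)) < keep_prob
    if not mask.any():
        # Keep at least one sub-cost so the step is never a no-op.
        mask[rng.integers(len(grads))] = True
    combined = sum(g for g, m in zip(grads, mask) if m) / mask.sum()
    return params - lr * combined
```

Because a different random subset of sub-costs survives each iteration, conflicting gradients are no longer forced to cancel on a plateau, which is the "escaping pressure" the abstract refers to.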
- Author(s)
- Woo, Sangmin; Kim, Kangil; Noh, Junhyug; Shin, Jong-Hun; Na, Seung-Hoon
- Issued Date
- 2021-05
- Type
- Article
- DOI
- 10.3390/electronics10090989
- URI
- https://scholar.gist.ac.kr/handle/local/11526
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.