
Efficient Neural Network Space with Genetic Search

Abstract
We present a novel neural architecture search space and a search strategy based on an evolutionary algorithm. The search aims to find a set of inverted-bottleneck blocks, each of which takes a low-dimensional input representation and ends with a compressing layer. Primitive operation layers constitute flexible inverted-bottleneck blocks and can be assembled through evolutionary operations. Because the bottleneck structure confines the search space, the proposed evolutionary search algorithm can find a competitive neural network even with a small population size. During the search process, the model evaluation was designed to avoid local minima; this helped the algorithm discard locally optimal candidates and find better models. We conducted experiments on image classification of Fashion-MNIST and discovered an efficiently optimized neural network achieving an error rate of 6.76% with 356K parameters. © 2020, Springer Nature Singapore Pte Ltd.
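As a rough illustration of the kind of block the search operates over, the following is a minimal PyTorch-style sketch of an inverted-bottleneck block (expand, depthwise convolve, compress). The expansion ratio, kernel size, and use of a depthwise convolution are illustrative assumptions drawn from standard inverted-bottleneck designs, not the exact primitive operation set or encoding used in the paper.

```python
# Minimal sketch of an inverted-bottleneck block (assumed structure, not the
# paper's exact primitive set): expand channels, apply a depthwise conv,
# then compress back to a low-dimensional representation.
import torch
import torch.nn as nn

class InvertedBottleneck(nn.Module):
    def __init__(self, in_ch, out_ch, expand_ratio=4, kernel_size=3, stride=1):
        super().__init__()
        hidden = in_ch * expand_ratio
        self.block = nn.Sequential(
            # 1x1 expansion from the low-dimensional input
            nn.Conv2d(in_ch, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # depthwise spatial convolution on the expanded representation
            nn.Conv2d(hidden, hidden, kernel_size, stride=stride,
                      padding=kernel_size // 2, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 1x1 compressing (projection) layer back to out_ch
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        self.use_residual = stride == 1 and in_ch == out_ch

    def forward(self, x):
        y = self.block(x)
        return x + y if self.use_residual else y

# Example: a 28x28 single-channel input (Fashion-MNIST-sized)
x = torch.randn(1, 1, 28, 28)
block = InvertedBottleneck(in_ch=1, out_ch=16, expand_ratio=4)
print(block(x).shape)  # torch.Size([1, 16, 28, 28])
```

In a search of this kind, the per-block hyperparameters (expansion ratio, kernel size, stride, choice of primitive layers) would be the quantities assembled and mutated by the evolutionary operations; the fixed values above are placeholders for illustration only.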
Author(s)
Kang, Dongseok; Ahn, Chang Wook
Issued Date
2019-11-22
Type
Conference Paper
DOI
10.1007/978-981-15-3415-7_54
URI
https://scholar.gist.ac.kr/handle/local/22838
Publisher
Springer
Citation
14th International Conference on Bio-inspired Computing: Theories and Applications, BIC-TA 2019, pp.638 - 646
ISSN
1865-0929
Conference Place
GE
Appears in Collections:
Department of AI Convergence > 2. Conference Papers
Access & License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.