
EGNAS: Efficient Graph Neural Architecture Search through Evolutionary Algorithm

Abstract
The integration of Neural Architecture Search (NAS) techniques into Graph Neural Networks (GNNs) has demonstrated promising outcomes. In this paper, we present the Efficient Graph Neural Architecture Search (EGNAS) algorithm, which aims to tackle the challenges of NAS for GNNs. EGNAS introduces novel techniques, including inherited parameter sharing and the half-epochs technique, to improve the efficiency and diversity of the architecture search. Additionally, an evolutionary search combined with hyperparameter optimization is incorporated to explore a wider range of GNN candidates. Experimental results demonstrate that EGNAS outperforms handcrafted methods and other GNN NAS approaches in terms of both test accuracy and search time. By striking a balance between performance and computational efficiency, EGNAS offers an effective solution for discovering optimal graph neural architectures.
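To make the three mechanisms named in the abstract concrete, the sketch below shows one way an evolutionary loop could combine inherited parameter sharing (children start from parent weights), half-epoch evaluation (candidates are trained for only half the usual budget), and joint hyperparameter search (the learning rate is mutated alongside the architecture). This is a minimal illustration under assumed names and choices (ARCH_CHOICES, LR_CHOICES, the surrogate fitness), not the thesis implementation; a real version would train an actual GNN and return validation accuracy.

import random
from dataclasses import dataclass, field

# Hypothetical search space: per-layer aggregator/activation plus a learning rate.
ARCH_CHOICES = {
    "aggregator": ["gcn", "gat", "sage", "gin"],
    "activation": ["relu", "elu", "tanh"],
}
LR_CHOICES = [1e-2, 5e-3, 1e-3]

FULL_EPOCHS = 200                 # budget a handcrafted GNN would normally get
HALF_EPOCHS = FULL_EPOCHS // 2    # "half epochs" evaluation budget

@dataclass
class Candidate:
    arch: dict                    # architecture genes
    lr: float                     # jointly searched hyperparameter
    weights: dict = field(default_factory=dict)   # inherited parameters

def random_candidate() -> Candidate:
    arch = {k: random.choice(v) for k, v in ARCH_CHOICES.items()}
    return Candidate(arch=arch, lr=random.choice(LR_CHOICES))

def mutate(parent: Candidate) -> Candidate:
    child_arch = dict(parent.arch)
    gene = random.choice(list(ARCH_CHOICES))
    child_arch[gene] = random.choice(ARCH_CHOICES[gene])
    # Inherited parameter sharing: the child starts from the parent's weights
    # instead of being trained from scratch.
    return Candidate(arch=child_arch,
                     lr=random.choice(LR_CHOICES),
                     weights=dict(parent.weights))

def train_and_score(cand: Candidate, epochs: int) -> float:
    """Placeholder for GNN training.

    A real implementation would build the model from cand.arch, load
    cand.weights where layer shapes match, train for `epochs`, and return
    validation accuracy. A random surrogate keeps this sketch runnable.
    """
    warm_start_bonus = 0.05 if cand.weights else 0.0
    cand.weights = {"trained_for": epochs}           # pretend we trained
    return random.random() * 0.1 + warm_start_bonus  # surrogate fitness

def evolve(pop_size: int = 10, generations: int = 5) -> Candidate:
    population = [random_candidate() for _ in range(pop_size)]
    scores = [train_and_score(c, HALF_EPOCHS) for c in population]
    for _ in range(generations):
        # Tournament selection, then mutation with inherited weights.
        parent = max(random.sample(list(zip(population, scores)), 3),
                     key=lambda t: t[1])[0]
        child = mutate(parent)
        child_score = train_and_score(child, HALF_EPOCHS)   # half-epoch evaluation
        # Replace the current worst candidate (steady-state evolution).
        worst = min(range(pop_size), key=lambda i: scores[i])
        population[worst], scores[worst] = child, child_score
    return max(zip(population, scores), key=lambda t: t[1])[0]

if __name__ == "__main__":
    best = evolve()
    print("best architecture:", best.arch, "lr:", best.lr)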
Author(s)
Younkyung Jwa
Issued Date
2023
Type
Thesis
URI
https://scholar.gist.ac.kr/handle/local/19228
Alternative Author(s)
좌윤경
Department
AI Graduate School (Graduate School)
Advisor
Ahn, Chang Wook
Degree
Master
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Access and License
  • Access type: Open
File List
  • No related files are available.
