
EGNAS: Efficient Graph Neural Architecture Search Through Evolutionary Algorithm

Abstract
The primary objective of our research is to enhance the efficiency and effectiveness of Neural Architecture Search (NAS) for Graph Neural Networks (GNNs). GNNs have emerged as powerful tools for learning from unstructured network data, compensating for several known limitations of Convolutional Neural Networks (CNNs). However, the automatic search for optimal GNN architectures has seen relatively little progress so far. To address this gap, we introduce Efficient Graph Neural Architecture Search (EGNAS), a method that leverages the advantages of evolutionary search strategies. EGNAS incorporates inherited parameter sharing, allowing offspring to inherit parameters from their parents, and uses half-epoch training to improve optimization stability. In addition, EGNAS employs a combined evolutionary search that explores both the model structure and the hyperparameters within a large search space, resulting in improved performance. Our experimental results demonstrate that EGNAS outperforms state-of-the-art methods on node classification tasks on the Cora, Citeseer, and PubMed datasets while maintaining a high degree of computational efficiency. Notably, EGNAS is the fastest GNN architecture search method in terms of search time, running up to 40 times faster than previously proposed evolutionary search strategies. © 2024 by the authors.
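
To make the two mechanisms named in the abstract concrete, the following is a minimal, illustrative Python sketch of an evolutionary loop that searches a combined structure-and-hyperparameter space and lets offspring inherit their parent's parameters. Everything here (SEARCH_SPACE, the toy fitness function, the parameter store) is a hypothetical stand-in rather than the authors' implementation; a real EGNAS run would train GNN candidates on the citation datasets instead of faking a score.

```python
import copy
import random

# Hypothetical combined search space over both structure and hyperparameters;
# the gene names and values are illustrative, not EGNAS's actual space.
SEARCH_SPACE = {
    "aggregator": ["mean", "max", "sum"],
    "hidden_dim": [16, 32, 64, 128],
    "activation": ["relu", "elu", "tanh"],
    "learning_rate": [0.01, 0.005, 0.001],
}

def random_candidate():
    # A candidate couples an architecture with a (toy) parameter store.
    return {"arch": {k: random.choice(v) for k, v in SEARCH_SPACE.items()},
            "params": {}}

def mutate(parent):
    # Offspring inherit the parent's trained parameters (inherited parameter
    # sharing) instead of starting from scratch; only one gene is changed.
    child = copy.deepcopy(parent)
    gene = random.choice(list(SEARCH_SPACE))
    child["arch"][gene] = random.choice(SEARCH_SPACE[gene])
    return child

def evaluate(candidate):
    # Stand-in fitness. A real implementation would train the GNN briefly
    # (e.g., for half the usual epochs, as in EGNAS) and return validation
    # accuracy; here a seeded RNG fakes a deterministic score per architecture.
    rng = random.Random(str(sorted(candidate["arch"].items())))
    head_start = 0.01 * len(candidate["params"])  # toy benefit of inheritance
    return rng.random() + head_start

def evolutionary_search(generations=10, population_size=8):
    population = [random_candidate() for _ in range(population_size)]
    for gen in range(generations):
        population.sort(key=evaluate, reverse=True)
        parents = population[: population_size // 2]  # truncation selection
        offspring = []
        for _ in range(population_size - len(parents)):
            child = mutate(random.choice(parents))
            child["params"][f"gen{gen}"] = "inherited"  # toy shared weights
            offspring.append(child)
        population = parents + offspring
    return max(population, key=evaluate)["arch"]

if __name__ == "__main__":
    print(evolutionary_search())
```

Because offspring reuse parent weights rather than retraining from scratch, each generation requires far less training, which is the intuition behind the large search-time reductions the abstract reports.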
Author(s)
Jwa, Younkyung; Ahn, Chang Wook; Kim, Man-Je
Issued Date
2024-12
Type
Article
DOI
10.3390/math12233828
URI
https://scholar.gist.ac.kr/handle/local/9168
Publisher
Multidisciplinary Digital Publishing Institute (MDPI)
Citation
Mathematics, v.12, no.23
ISSN
2227-7390
Appears in Collections:
Department of AI Convergence > 1. Journal Articles
Access and License
  • Access type: Open
File list
