
Online federated learning based short-term load prediction through anomaly detection using incremental PCA

Abstract
In this paper, we propose an incremental principal component analysis (IPCA)-based federated learning (FL) framework for day-ahead load forecasting, where IPCA is used to reflect seasonal variation and to detect anomalies when training models on local devices. FL is a promising learning scheme for leveraging distributed data and computational resources while preserving user privacy. However, FL is susceptible to anomalies in local devices, which can degrade the overall performance of the global model. These anomalies can include corrupted data, adversarial attacks, or simply unreliable data. In particular, for power consumption profiles, seasonal variations can mask such anomalies because the deviation caused by anomalies can be smaller than that caused by seasonal variations. To address this issue, in our framework, mini-batches used for training models on local devices are projected by IPCA matrices into low-dimensional spaces, where the Mahalanobis distance between the previous and current mini-batch is calculated to determine whether re-initialization is needed due to seasonal variation or an anomaly is declared for the current mini-batch. To further improve prediction performance, we customized IPCA-FL by adding an online learning scheme with a greedy selection algorithm for model updates. We evaluated the two proposed schemes (offline IPCA-FL and online IPCA-FL with greedy selection) on the Gwangju Institute of Science and Technology (GIST) building electricity load dataset (i.e., research bldg. 8). The results showed that our schemes outperform traditional centralized learning and existing FL methods in terms of root mean square error (RMSE) and mean absolute percentage error (MAPE). Moreover, under various scenarios with less than one year of data, federated learning showed superior performance compared to individual learning. However, individual learning outperformed our methods when the training data was long enough to include seasonal variations.
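
As a rough illustration of the batch-screening step described in the abstract (projecting each local mini-batch with incremental PCA and comparing it to the previous mini-batch via Mahalanobis distance), the Python sketch below may be helpful. It is not the thesis implementation: the class name, the two thresholds, and the rule "moderate shift triggers re-initialization, large shift declares an anomaly" are illustrative assumptions, since the abstract does not specify the exact decision rule.

# Minimal sketch, assuming a two-threshold rule (not specified in the thesis abstract):
# project each local mini-batch with incremental PCA and compare it to the previous
# mini-batch via Mahalanobis distance; a moderate shift is treated as seasonal drift
# (re-initialize), a large shift as an anomaly (skip the batch).
import numpy as np
from sklearn.decomposition import IncrementalPCA
from scipy.spatial.distance import mahalanobis

class BatchScreener:
    def __init__(self, n_components=3, drift_thr=3.0, anomaly_thr=6.0):
        self.ipca = IncrementalPCA(n_components=n_components)
        self.prev_z = None              # previous mini-batch in the PCA subspace
        self.drift_thr = drift_thr      # assumed threshold for seasonal drift
        self.anomaly_thr = anomaly_thr  # assumed threshold for anomalies

    def screen(self, batch):
        """Return 'train', 'reinit', or 'anomaly' for a mini-batch of load profiles."""
        self.ipca.partial_fit(batch)    # update the IPCA basis incrementally
        z = self.ipca.transform(batch)  # low-dimensional representation of the batch
        if self.prev_z is None:
            self.prev_z = z
            return "train"
        # Mahalanobis distance of the current batch mean from the previous batch
        cov = np.cov(self.prev_z, rowvar=False) + 1e-6 * np.eye(z.shape[1])
        d = mahalanobis(z.mean(axis=0), self.prev_z.mean(axis=0), np.linalg.inv(cov))
        self.prev_z = z
        if d > self.anomaly_thr:
            return "anomaly"            # skip this batch for local training
        if d > self.drift_thr:
            return "reinit"             # seasonal change: re-initialize the local model
        return "train"

Scikit-learn's IncrementalPCA is used here only because its partial_fit/transform interface matches the per-batch, streaming setting the abstract describes; the actual IPCA update in the thesis may differ.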
Author(s)
Park Seong-Woo
Issued Date
2024
Type
Thesis
URI
https://scholar.gist.ac.kr/handle/local/19540
Alternative Author(s)
박성우
Department
AI Graduate School
Advisor
Hwang, Eui Seok
Degree
Master
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Access and License
  • Access: Open
File List
  • No associated files.
