
Federated Distillation with Dataset Distillation Using Model Trajectory Matching

Abstract
Federated Learning (FL) is a machine learning technique that trains a global model on a server by leveraging the knowledge of clients without sharing their private data. In federated learning, knowledge distillation, widely known as federated distillation, is used to combine models and address the data-heterogeneity problems arising from the non-IID distribution of private data across clients. However, existing federated distillation methods require additional training data or models to distill the knowledge of local clients: they rely either on public data, whose distribution differs from that of the private data, or on artificial data created by a generator adversarially trained against the local models. We argue that artificial data created by a generator cannot precisely represent the semantic information of the private data. To address this issue, we propose a new federated distillation framework based on dataset distillation with model trajectory matching. To create effective artificial training data, we optimize the artificial data by matching the trajectory of the local model parameters before and after training on the private data. We show that artificial data generated by dataset distillation with model trajectory matching yields better results than generator-based data. Additionally, we conduct a verification experiment to compare how much semantic information of the private data is captured by the artificial data produced by the generator and by dataset distillation.
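
The central idea described above is a dataset-distillation objective: synthetic data are optimized so that a few gradient steps on them move a model's parameters from the "before private training" checkpoint toward the "after private training" checkpoint. The sketch below illustrates such a trajectory-matching loss; it is a minimal illustration assuming PyTorch 2.x (for torch.func.functional_call), and the model, hyperparameters, and variable names are illustrative assumptions rather than the thesis's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

class SmallNet(nn.Module):
    # Tiny classifier used only to illustrate the procedure.
    def __init__(self, in_dim=32, n_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, 64)
        self.fc2 = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

def trajectory_matching_loss(model, syn_x, syn_y, params_start, params_target,
                             inner_steps=5, inner_lr=0.01):
    # Unroll a few SGD steps on the synthetic data starting from the
    # "before private training" checkpoint, then measure how far the
    # reached parameters are from the "after private training" checkpoint.
    params = {k: v.clone().requires_grad_(True) for k, v in params_start.items()}
    for _ in range(inner_steps):
        logits = functional_call(model, params, (syn_x,))
        ce = F.cross_entropy(logits, syn_y)
        grads = torch.autograd.grad(ce, list(params.values()), create_graph=True)
        params = {k: p - inner_lr * g
                  for (k, p), g in zip(params.items(), grads)}
    # Normalized squared parameter distance, as in trajectory-matching
    # dataset distillation.
    num = sum(((params[k] - params_target[k]) ** 2).sum() for k in params)
    den = sum(((params_start[k] - params_target[k]) ** 2).sum() for k in params) + 1e-12
    return num / den

# Usage: optimize the synthetic inputs so that training on them reproduces
# the parameter change the local model underwent on its private data.
model = SmallNet()
params_start = {k: v.detach().clone() for k, v in model.named_parameters()}
# In the federated setting, params_target would come from the client after
# local training on private data; here a perturbed copy stands in for illustration.
params_target = {k: v + 0.05 * torch.randn_like(v) for k, v in params_start.items()}

syn_x = torch.randn(20, 32, requires_grad=True)   # learnable synthetic inputs
syn_y = torch.randint(0, 10, (20,))               # fixed synthetic labels
optimizer = torch.optim.Adam([syn_x], lr=0.1)

for step in range(100):
    optimizer.zero_grad()
    loss = trajectory_matching_loss(model, syn_x, syn_y, params_start, params_target)
    loss.backward()
    optimizer.step()

Because the inner SGD steps are unrolled with create_graph=True, the gradient of the matching loss flows back through the parameter updates to the synthetic inputs, which is what allows the artificial data themselves to be trained.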
Author(s)
Hong Mingi
Issued Date
2024
Type
Thesis
URI
https://scholar.gist.ac.kr/handle/local/19302
Alternative Author(s)
홍민기
Department
Graduate School, AI Graduate School
Advisor
Kim, Kangil
Degree
Master
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Access and License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.