Abstract
Federated Learning (FL) has been proposed as a decentralized machine learning system in which multiple clients jointly train a model without sharing private data. In FL, statistical heterogeneity across devices is a crucial challenge that can degrade generalization performance. Previous FL approaches have shown that applying proximal regularization during local training can alleviate the divergence of the aggregated parameters from biased local models. In this work, to address the heterogeneity issue in conventional FL, we propose a layer-wise knowledge distillation method for federated learning, namely FedLKD, which regularizes the local training step via knowledge distillation between the global and local models using a small proxy dataset. FedLKD thus deploys layer-wise knowledge distillation between the multiple devices and the global server as the clients' regularization loss. The layer-wise distillation mechanism updates the local model so that it exploits the common representations captured at different layers. Through extensive experiments, we demonstrate that FedLKD outperforms vanilla FedAvg and FedProx on three federated datasets.
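The abstract describes the local objective only at a high level. The following PyTorch sketch illustrates one plausible form of such a layer-wise distillation regularizer added to a client's task loss on a small proxy batch; the names (`features`, `layerwise_kd_loss`), the MSE-based layer matching, and the weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def layerwise_kd_loss(local_feats, global_feats):
    """Average a distillation loss over matched intermediate layers.

    local_feats / global_feats: lists of per-layer activations from the
    local (student) and global (teacher) models on the same proxy batch.
    The MSE matching here is an assumption for illustration.
    """
    loss = 0.0
    for h_l, h_g in zip(local_feats, global_feats):
        loss = loss + F.mse_loss(h_l.flatten(1), h_g.flatten(1).detach())
    return loss / len(local_feats)

def local_step(local_model, global_model, task_batch, proxy_batch,
               optimizer, lam=0.1):
    """One regularized local update: task loss + layer-wise KD on proxy data."""
    x, y = task_batch
    task_loss = F.cross_entropy(local_model(x), y)

    # Layer-wise distillation against the frozen global model on proxy data.
    # `features(...)` is a hypothetical method returning per-layer outputs.
    xp, _ = proxy_batch
    with torch.no_grad():
        g_feats = global_model.features(xp)
    l_feats = local_model.features(xp)
    kd_loss = layerwise_kd_loss(l_feats, g_feats)

    loss = task_loss + lam * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this sketch, each client would run `local_step` for its local epochs before sending updated parameters to the server for standard FedAvg-style aggregation.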
Original language | English |
---|---|
Title of host publication | 37th International Conference on Information Networking, ICOIN 2023 |
Publisher | IEEE Computer Society |
Pages | 526-529 |
Number of pages | 4 |
ISBN (Electronic) | 9781665462686 |
Publication status | Published - 2023 |
Event | 37th International Conference on Information Networking, ICOIN 2023 - Bangkok, Thailand. Duration: 11 Jan 2023 → 14 Jan 2023 |
Publication series
Name | International Conference on Information Networking |
---|---|
Volume | 2023-January |
ISSN (Print) | 1976-7684 |
Conference
Conference | 37th International Conference on Information Networking, ICOIN 2023 |
---|---|
Country/Territory | Thailand |
City | Bangkok |
Period | 11/01/23 → 14/01/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- Federated Learning
- Knowledge Distillation