Layer-wise Knowledge Distillation for Cross-Device Federated Learning

Huy Q. Le, Loc X. Nguyen, Seong Bae Park, Choong Seon Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

Federated Learning (FL) has been proposed as a decentralized machine learning system in which multiple clients jointly train a model without sharing their private data. In FL, statistical heterogeneity across devices is a crucial challenge that can degrade generalization performance. Previous FL approaches have shown that applying proximal regularization during local training can alleviate the divergence of the aggregated parameters caused by biased local models. In this work, to address the heterogeneity issues of conventional FL, we propose a layer-wise knowledge distillation method for federated learning, namely FedLKD, which regularizes the local training step via knowledge distillation between the global and local models on a small proxy dataset. FedLKD thus deploys layer-wise knowledge distillation between the multiple devices and the global server as the clients' regularization loss. This layer-wise mechanism updates the local model so that it exploits the common representations learned at different layers. Through extensive experiments, we demonstrate that FedLKD outperforms vanilla FedAvg and FedProx on three federated datasets.
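For illustration only, the sketch below shows one possible form of the layer-wise distillation regularizer the abstract describes, written in PyTorch. The toy model, the MSE-based layer matching, and the kd_weight coefficient are assumptions made for this sketch, not the paper's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Toy client model that also returns its intermediate layer features."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.block2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        h1 = self.block1(x)                      # layer-1 feature map
        h2 = self.block2(h1)                     # layer-2 feature map
        logits = self.head(h2.mean(dim=(2, 3)))  # global average pooling
        return logits, [h1, h2]

def layerwise_kd_loss(local_feats, global_feats):
    """Average distance between matching layer representations (MSE assumed)."""
    return sum(F.mse_loss(l, g) for l, g in zip(local_feats, global_feats)) / len(local_feats)

def local_loss(local_model, global_model, private_batch, proxy_batch, kd_weight=0.1):
    # Ordinary supervised loss on the client's private data.
    x, y = private_batch
    logits, _ = local_model(x)
    task_loss = F.cross_entropy(logits, y)

    # Layer-wise distillation on the shared proxy data: the frozen global
    # model teaches the local model at every matched layer, not only at
    # the output, regularizing local training against client drift.
    xp, _ = proxy_batch
    _, local_feats = local_model(xp)
    with torch.no_grad():
        _, global_feats = global_model(xp)
    return task_loss + kd_weight * layerwise_kd_loss(local_feats, global_feats)
```

In a federated round, each client would minimize this combined loss locally before the server aggregates the updated parameters, as in FedAvg.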

Original language: English
Title of host publication: 37th International Conference on Information Networking, ICOIN 2023
Publisher: IEEE Computer Society
Pages: 526-529
Number of pages: 4
ISBN (Electronic): 9781665462686
Publication status: Published - 2023
Event: 37th International Conference on Information Networking, ICOIN 2023 - Bangkok, Thailand
Duration: 11 Jan 2023 - 14 Jan 2023

Publication series

Name: International Conference on Information Networking
Volume: 2023-January
ISSN (Print): 1976-7684

Conference

Conference: 37th International Conference on Information Networking, ICOIN 2023
Country/Territory: Thailand
City: Bangkok
Period: 11/01/23 - 14/01/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • Federated Learning
  • Knowledge Distillation
