Knowledge Distillation in Federated Learning: Where and How to Distill?

Yu Qiao, Chaoning Zhang, Huy Q. Le, Avi Deb Raha, Apurba Adhikary, Choong Seon Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Federated learning (FL) advances distributed machine learning by enabling privacy-preserving collaboration between edge devices and a central server. However, the data on edge devices is largely non-IID (not independent and identically distributed), which makes achieving edge intelligence challenging. In this work, we mitigate this issue through knowledge distillation (KD), sharing knowledge between the central server and edge devices. Specifically, we first investigate where (i.e., at which feature layer) to conduct the distillation for knowledge sharing, and find that distilling at the feature layer immediately before the classification head yields superior performance. We then investigate how to conduct the KD in terms of loss choice: testing various losses for enhancing knowledge sharing, we find that Centered Kernel Alignment (CKA) performs best among the investigated loss metrics. Overall, this work sheds new light on where and how to perform KD in FL. Experimental results on MNIST and Fashion-MNIST demonstrate that our findings yield a performance gain of at least 4%.
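
The paper's code is not reproduced here; the following PyTorch sketch illustrates one plausible reading of the abstract's recipe: compute linear CKA between client and server features taken just before the classification head, and penalize misalignment alongside the task loss. The function names (`linear_cka`, `local_kd_loss`), the feature tensors `feat_local`/`feat_global`, and the weighting coefficient `beta` are illustrative assumptions, not from the paper, which may also use a kernelized CKA variant.

```python
# A minimal sketch (not the authors' released code), assuming PyTorch.
import torch
import torch.nn.functional as F

def linear_cka(X: torch.Tensor, Y: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Linear Centered Kernel Alignment between two (batch, dim) feature matrices."""
    X = X - X.mean(dim=0, keepdim=True)  # center each feature dimension over the batch
    Y = Y - Y.mean(dim=0, keepdim=True)
    # CKA(X, Y) = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    cross = torch.linalg.norm(Y.t() @ X) ** 2
    return cross / (torch.linalg.norm(X.t() @ X) * torch.linalg.norm(Y.t() @ Y) + eps)

def local_kd_loss(logits, labels, feat_local, feat_global, beta=1.0):
    """Task loss plus a CKA alignment term on penultimate-layer features.

    feat_local / feat_global are assumed to be activations taken just before
    the classification heads of the client and server (global) models.
    beta is a hypothetical weighting coefficient, not specified in the paper.
    """
    ce = F.cross_entropy(logits, labels)
    alignment = 1.0 - linear_cka(feat_local, feat_global)  # minimizing this maximizes CKA
    return ce + beta * alignment
```

Since CKA is invariant to orthogonal transformations and isotropic scaling of the features, aligning penultimate-layer representations this way is less brittle than an L2 feature-matching loss when client and server models drift apart under non-IID data.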

Original language: English
Title of host publication: APNOMS 2023 - 24th Asia-Pacific Network Operations and Management Symposium
Subtitle of host publication: Intelligent Management for Enabling the Digital Transformation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 18-23
Number of pages: 6
ISBN (Electronic): 9788995004395
Publication status: Published - 2023
Event: 24th Asia-Pacific Network Operations and Management Symposium, APNOMS 2023 - Sejong, Korea, Republic of
Duration: 6 Sept 2023 - 8 Sept 2023

Publication series

Name: APNOMS 2023 - 24th Asia-Pacific Network Operations and Management Symposium: Intelligent Management for Enabling the Digital Transformation

Conference

Conference: 24th Asia-Pacific Network Operations and Management Symposium, APNOMS 2023
Country/Territory: Korea, Republic of
City: Sejong
Period: 6/09/23 - 8/09/23

Bibliographical note

Publisher Copyright:
Copyright 2023 KICS.

Keywords

  • Edge intelligence
  • communication efficiency
  • federated learning
  • knowledge distillation