Knowledge Distillation Assisted Robust Federated Learning: Towards Edge Intelligence

Yu Qiao, Apurba Adhikary, Ki Tae Kim, Chaoning Zhang, Choong Seon Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Federated learning (FL) makes it possible to advance towards edge intelligence by enabling collaborative and privacy-preserving model training across distributed edge devices. One of the main challenges in FL is the non-IID (not Independent and Identically Distributed) nature of the data distribution across edge devices, which leads to inconsistent update directions between local and global models and thus hinders convergence. Moreover, recent studies have shown that the performance of FL models can degrade significantly under adversarial attacks, which further complicates deployment at the edge. In this work, we attempt to improve the robustness of FL models under adversarial attacks in non-IID settings by sharing knowledge between a central server and edge devices via knowledge distillation. Specifically, we propose a new knowledge distillation-based federated adversarial training (FAT) framework, termed FedAdv (Federated Adversarial), in which an edge server builds global prototypes by aggregating the local prototypes that participating devices obtain after adversarial training (AT). These global prototypes are then distributed to the edge devices for regularization, encouraging each device to align its local representations with the corresponding global prototypes and thereby preventing local model updates from deviating significantly from the global model. Experimental results on MNIST and Fashion-MNIST show that our strategy yields comparable or superior gains in both natural and robust accuracy relative to several baselines.
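The abstract describes the FedAdv loop only in prose. Below is a minimal sketch of that loop, assuming PyTorch, an FGSM attack as the adversarial-example generator (the abstract does not name the attack), and a model whose forward pass returns both logits and feature embeddings; the helper names (`fgsm_attack`, `local_adversarial_round`, `aggregate_prototypes`) and the regularization weight `lam` are illustrative placeholders, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=0.1):
    """One-step FGSM: perturb inputs along the sign of the loss gradient."""
    x_adv = x.clone().detach().requires_grad_(True)
    logits, _ = model(x_adv)
    grad = torch.autograd.grad(F.cross_entropy(logits, y), x_adv)[0]
    return (x_adv + eps * grad.sign()).clamp(0.0, 1.0).detach()

def local_adversarial_round(model, loader, optimizer, global_protos,
                            num_classes, lam=1.0):
    """One local round of AT plus prototype-alignment regularization.

    Returns this device's per-class prototypes (mean adversarial embeddings).
    """
    sums = {c: None for c in range(num_classes)}
    counts = {c: 0 for c in range(num_classes)}
    for x, y in loader:
        x_adv = fgsm_attack(model, x, y)
        logits, feats = model(x_adv)          # forward pass on adversarial inputs
        loss = F.cross_entropy(logits, y)
        if global_protos:
            # Pull each embedding toward the global prototype of its class
            # (assumes every class in this batch has a global prototype).
            target = torch.stack([global_protos[int(c)] for c in y])
            loss = loss + lam * F.mse_loss(feats, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        for c in y.unique().tolist():         # accumulate local prototype stats
            m = (y == c)
            s = feats[m].sum(dim=0).detach()
            sums[c] = s if sums[c] is None else sums[c] + s
            counts[c] += int(m.sum())
    return {c: sums[c] / counts[c] for c in range(num_classes) if counts[c] > 0}

def aggregate_prototypes(local_protos_list):
    """Server side: average each class prototype across participating devices."""
    merged = {}
    for protos in local_protos_list:
        for c, p in protos.items():
            merged.setdefault(c, []).append(p)
    return {c: torch.stack(ps).mean(dim=0) for c, ps in merged.items()}
```

In each communication round, every device would run `local_adversarial_round` and upload its returned class prototypes; the server averages them with `aggregate_prototypes` and broadcasts the result, so the MSE term pulls each device's representations toward a shared, attack-hardened embedding per class.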

Original language: English
Title of host publication: ICC 2024 - IEEE International Conference on Communications
Editors: Matthew Valenti, David Reed, Melissa Torres
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 843-848
Number of pages: 6
ISBN (Electronic): 9781728190549
DOIs
Publication status: Published - 2024
Event: 59th Annual IEEE International Conference on Communications, ICC 2024 - Denver, United States
Duration: 9 Jun 2024 – 13 Jun 2024

Publication series

Name: IEEE International Conference on Communications
ISSN (Print): 1550-3607

Conference

Conference: 59th Annual IEEE International Conference on Communications, ICC 2024
Country/Territory: United States
City: Denver
Period: 9/06/24 – 13/06/24

Bibliographical note

Publisher Copyright:
© 2024 IEEE.

Keywords

  • adversarial attack
  • edge intelligence
  • federated learning
  • knowledge distillation
  • non-IID
