Abstract
Federated learning (FL) advances distributed machine learning by enabling privacy-preserving collaboration between edge devices and a central server. However, the majority of data on edge devices is non-IID (not independent and identically distributed), which makes edge intelligence challenging to achieve. In this work, we attempt to mitigate this issue through knowledge distillation (KD), sharing knowledge between the central server and edge devices. Specifically, we first investigate where (i.e., at which feature layer) to conduct the distillation for knowledge sharing, and find that distilling at the feature layer immediately before the classification head yields superior performance. We then investigate how to conduct KD in terms of loss choice: we test various losses for enhancing knowledge sharing and find that Centered Kernel Alignment (CKA) performs best among the investigated loss metrics. Overall, this work sheds new light on where and how to perform KD in FL. Experimental results on MNIST and Fashion-MNIST demonstrate that our findings yield a performance gain of at least 4%.
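The paper itself does not include code; the sketch below is a minimal illustration of a linear CKA similarity used as a feature-level KD loss on pre-classifier activations, as the abstract describes. The function names, the PyTorch framing, and the student/teacher roles are our illustrative assumptions, not the authors' implementation.

```python
import torch

def linear_cka(x: torch.Tensor, y: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Linear Centered Kernel Alignment between two feature batches.

    x, y: (batch, features) activations taken from the layer just
    before the classification head. Returns a similarity in [0, 1].
    """
    # Center each feature dimension over the batch.
    x = x - x.mean(dim=0, keepdim=True)
    y = y - y.mean(dim=0, keepdim=True)
    # Linear CKA: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F).
    cross = (y.t() @ x).norm(p="fro") ** 2
    self_x = (x.t() @ x).norm(p="fro")
    self_y = (y.t() @ y).norm(p="fro")
    return cross / (self_x * self_y + eps)

def cka_kd_loss(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    # Minimizing 1 - CKA pulls the student's representation structure
    # toward the teacher's; the teacher side is detached so gradients
    # flow only into the student.
    return 1.0 - linear_cka(student_feat, teacher_feat.detach())

# Hypothetical usage: penultimate-layer features of shape (batch, dim).
student_feat = torch.randn(32, 128, requires_grad=True)
teacher_feat = torch.randn(32, 128)
loss = cka_kd_loss(student_feat, teacher_feat)
loss.backward()
```

In an FL round, the student features would typically come from a device model and the teacher features from the server's global model (or the reverse, depending on the direction of knowledge sharing), both taken at the penultimate layer as the abstract suggests.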
Original language | English |
---|---|
Title of host publication | APNOMS 2023 - 24th Asia-Pacific Network Operations and Management Symposium |
Subtitle of host publication | Intelligent Management for Enabling the Digital Transformation |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 18-23 |
Number of pages | 6 |
ISBN (Electronic) | 9788995004395 |
Publication status | Published - 2023 |
Event | 24th Asia-Pacific Network Operations and Management Symposium, APNOMS 2023 - Sejong, Korea, Republic of. Duration: 6 Sept 2023 → 8 Sept 2023 |
Publication series
Name | APNOMS 2023 - 24th Asia-Pacific Network Operations and Management Symposium: Intelligent Management for Enabling the Digital Transformation |
---|---|
Conference
Conference | 24th Asia-Pacific Network Operations and Management Symposium, APNOMS 2023 |
---|---|
Country/Territory | Korea, Republic of |
City | Sejong |
Period | 6/09/23 → 8/09/23 |
Bibliographical note
Publisher Copyright: Copyright 2023 KICS.
Keywords
- Edge intelligence
- communication efficiency
- federated learning
- knowledge distillation