Abstract
The federated learning paradigm provides basic data privacy for local clients through an iterative aggregation of model parameters. The success of the global model in federated learning depends on local models that are trained on self-labeled client data. However, each participating client has its own personal bias and level of expertise, which leads to label noise; a federated learning model should therefore be robust to such noise and deliver consistent output. To address this issue, we propose a robust federated learning approach that focuses on the local model training phase of the clients. We simultaneously train two deep networks on normal and augmented inputs and mix their predicted classes to minimize entropy before applying a noise-tolerant loss function. We further add a simple knowledge distillation technique to enhance the performance of the network. We evaluate the proposed method on the CIFAR-10 and Fashion-MNIST datasets under both IID and non-IID data distributions to demonstrate its robustness to noise.
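The abstract outlines a local training step built from four pieces: two co-trained networks fed clean and augmented views, prediction mixing with entropy minimization, a noise-tolerant loss, and knowledge distillation. The sketch below illustrates one plausible reading of that step in PyTorch; the choice of generalized cross-entropy as the noise-tolerant loss, averaging as the mixing rule, mutual KL distillation, and all weights (`lambda_ent`, `lambda_kd`, `temp`) are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of the local client training step, under the assumptions
# named above. net_a, net_b are any two classification networks; x_aug is
# an augmented view of x; y holds possibly noisy labels.
import torch
import torch.nn.functional as F

def gce_loss(logits, targets, q=0.7):
    # Generalized cross-entropy (1 - p_y^q) / q: a common noise-tolerant
    # stand-in for the unnamed loss in the abstract (assumption).
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    return ((1.0 - p_y.pow(q)) / q).mean()

def local_training_step(net_a, net_b, optimizer, x, x_aug, y,
                        lambda_ent=0.1, lambda_kd=0.1, temp=2.0):
    # One network sees the clean view, the other the augmented view.
    logits_a = net_a(x)
    logits_b = net_b(x_aug)

    # Mix the two networks' predicted class distributions and minimize
    # the entropy of the mixture to sharpen the joint prediction.
    mixed = 0.5 * (F.softmax(logits_a, dim=1) + F.softmax(logits_b, dim=1))
    entropy = -(mixed * torch.log(mixed + 1e-8)).sum(dim=1).mean()

    # Noise-tolerant supervised loss on the self-labeled data.
    sup = gce_loss(logits_a, y) + gce_loss(logits_b, y)

    # Simple knowledge distillation: net_a matches net_b's softened
    # predictions (one direction shown; mutual KD is also plausible).
    kd = F.kl_div(F.log_softmax(logits_a / temp, dim=1),
                  F.softmax(logits_b / temp, dim=1).detach(),
                  reduction="batchmean") * temp ** 2

    loss = sup + lambda_ent * entropy + lambda_kd * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a federated round, each client would run this step over its local batches before sending its model parameters to the server for aggregation.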
Original language | English |
---|---|
Title of host publication | 37th International Conference on Information Networking, ICOIN 2023 |
Publisher | IEEE Computer Society |
Pages | 277-281 |
Number of pages | 5 |
ISBN (Electronic) | 9781665462686 |
DOIs | |
Publication status | Published - 2023 |
Event | 37th International Conference on Information Networking, ICOIN 2023 - Bangkok, Thailand. Duration: 11 Jan 2023 → 14 Jan 2023 |
Publication series
Name | International Conference on Information Networking |
---|---|
Volume | 2023-January |
ISSN (Print) | 1976-7684 |
Conference
Conference | 37th International Conference on Information Networking, ICOIN 2023 |
---|---|
Country/Territory | Thailand |
City | Bangkok |
Period | 11/01/23 → 14/01/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- federated learning
- noisy labels
- robust federated learning