Abstract
Federated Learning enables collaborative machine learning while guaranteeing data privacy for all participating clients. However, one of Federated Learning's major challenges is the poor generalization of local client models caused by data heterogeneity, which leads to slow model convergence and communication latency. In this study, we apply contrastive learning techniques that have proven effective in both centralized and federated settings. To give local models a stronger capacity for generalization, we propose adopting contrastive loss at both the model and data levels. We evaluate our proposed approach against other standard approaches on the CIFAR-10 and CIFAR-100 datasets.
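The abstract does not give the exact formulation of the model-level contrastive term, but a common instantiation in the federated setting (in the style of MOON) contrasts a sample's representation under the current local model (anchor) against the global model's representation (positive) and the previous local model's representation (negative). A minimal sketch, with all function names and the choice of cosine similarity and temperature being illustrative assumptions:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def model_contrastive_loss(z, z_glob, z_prev, tau=0.5):
    """Illustrative model-level contrastive loss for one sample.

    z      -- representation from the current local model (anchor)
    z_glob -- representation from the global model (positive)
    z_prev -- representation from the previous local model (negative)
    tau    -- temperature (hypothetical default)

    The loss pulls the local representation toward the global model's
    and pushes it away from the stale local model's.
    """
    pos = np.exp(cosine_sim(z, z_glob) / tau)
    neg = np.exp(cosine_sim(z, z_prev) / tau)
    return float(-np.log(pos / (pos + neg)))
```

Intuitively, the loss is small when the local representation agrees with the global model and large when it still resembles the previous (drifted) local model, which is how such a term counteracts client drift under non-IID data.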
Original language | English |
---|---|
Title of host publication | Proceedings - 16th International Conference on Advanced Technologies for Communications, ATC 2023 |
Editors | Tran The Son |
Publisher | IEEE Computer Society |
Pages | 301-304 |
Number of pages | 4 |
ISBN (Electronic) | 9798350301328 |
DOIs | |
Publication status | Published - 2023 |
Event | 16th International Conference on Advanced Technologies for Communications, ATC 2023 - Da Nang, Viet Nam Duration: 19 Oct 2023 → 21 Oct 2023 |
Publication series
Name | International Conference on Advanced Technologies for Communications |
---|---|
ISSN (Print) | 2162-1039 |
ISSN (Electronic) | 2162-1020 |
Conference
Conference | 16th International Conference on Advanced Technologies for Communications, ATC 2023 |
---|---|
Country/Territory | Viet Nam |
City | Da Nang |
Period | 19/10/23 → 21/10/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- Federated learning
- contrastive learning
- federated optimization
- non-IID