Effectiveness of Model and Data Scale Contrastive Learning in Non-IID Federated Learning

Girum Fitihamlak Ejigu, Ye Lin Tun, Apurba Adhikary, Sun Moo Kang, Choong Seon Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Federated Learning enables collaborative machine learning while guaranteeing data privacy for all participating clients. However, one of its major challenges is the weak generalization of local client models caused by data heterogeneity, which leads to slow model convergence and communication latency. In this study, we employ contrastive learning techniques that have proven effective in both centralized and federated settings. To give local models a stronger capacity for generalization, we propose adopting contrastive loss at both the model and data scales. We evaluate the proposed approach against standard baseline methods on the CIFAR-10 and CIFAR-100 datasets.
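
The abstract names contrastive terms at two scales but does not include code. As an illustrative sketch only (not the authors' released implementation), the PyTorch snippet below shows what such terms commonly look like: a MOON-style model-scale loss that contrasts a local representation against the global and previous-local models' representations, and a SimCLR-style NT-Xent data-scale loss over two augmented views. The function names, temperature `tau`, and toy dimensions are assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Model-scale term (MOON-style): attract the local representation to
    the global model's representation, repel it from the previous local one."""
    pos = F.cosine_similarity(z_local, z_global, dim=-1) / tau
    neg = F.cosine_similarity(z_local, z_prev, dim=-1) / tau
    logits = torch.stack([pos, neg], dim=1)  # positive pair sits at index 0
    labels = torch.zeros(z_local.size(0), dtype=torch.long, device=z_local.device)
    return F.cross_entropy(logits, labels)

def data_contrastive_loss(z1, z2, tau=0.5):
    """Data-scale term (SimCLR-style NT-Xent) over two augmented views of
    the same batch; z1[i] and z2[i] form the positive pair."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)
    sim = (z @ z.t()) / tau
    sim.fill_diagonal_(float('-inf'))  # exclude self-similarity from the softmax
    n = z1.size(0)
    labels = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, labels)

# Toy usage: 8 samples with hypothetical 128-d representations.
z_local, z_global, z_prev = (torch.randn(8, 128) for _ in range(3))
z_view1, z_view2 = torch.randn(8, 128), torch.randn(8, 128)
loss = model_contrastive_loss(z_local, z_global, z_prev) \
     + data_contrastive_loss(z_view1, z_view2)
```

In a typical non-IID federated setup, such terms would be added to each client's usual cross-entropy objective during local updates, weighted by a tunable coefficient.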

Original language: English
Title of host publication: Proceedings - 16th International Conference on Advanced Technologies for Communications, ATC 2023
Editors: Tran The Son
Publisher: IEEE Computer Society
Pages: 301-304
Number of pages: 4
ISBN (Electronic): 9798350301328
Publication status: Published - 2023
Event: 16th International Conference on Advanced Technologies for Communications, ATC 2023 - Da Nang, Viet Nam
Duration: 19 Oct 2023 – 21 Oct 2023

Publication series

Name: International Conference on Advanced Technologies for Communications
ISSN (Print): 2162-1039
ISSN (Electronic): 2162-1020

Conference

Conference: 16th International Conference on Advanced Technologies for Communications, ATC 2023
Country/Territory: Viet Nam
City: Da Nang
Period: 19/10/23 – 21/10/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • Federated learning
  • contrastive learning
  • federated optimization
  • non-IID
