Optimizing Multi-User Semantic Communication via Transfer Learning and Knowledge Distillation

Loc X. Nguyen, Kitae Kim, Ye Lin Tun, Sheikh Salman Hassan, Yan Kyaw Tun, Zhu Han, Choong Seon Hong

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Semantic Communication (SemCom), notable for ensuring quality of service by jointly optimizing source and channel coding, effectively extracts data semantics, eliminates redundant information, and mitigates noise effects from the wireless channel. However, most studies overlook multi-user scenarios and resource availability, limiting real-world applicability. This letter addresses this gap by focusing on downlink communication from a base station to multiple users with varying computing capacities. Users employ variants of the Swin Transformer for source decoding and a simple architecture for channel decoding. We propose a novel training procedure, FRENCA, that incorporates transfer learning and knowledge distillation to improve the performance of low-computing users. Extensive simulations validate the proposed methods.
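The letter's FRENCA procedure is not detailed in this record, but its knowledge-distillation component presumably transfers knowledge from a high-capacity (teacher) decoder to a low-computing user's (student) decoder. A minimal, generic sketch of a temperature-scaled distillation loss in the style of Hinton et al. (all function names and parameters here are illustrative, not from the paper) might look like:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" across classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between temperature-softened teacher and student
    # outputs, scaled by T^2 so gradients stay comparable across T.
    p = softmax(teacher_logits, T)    # soft targets from the teacher
    q = softmax(student_logits, T)    # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term would be combined with the student's ordinary reconstruction loss, with a weighting factor balancing the two; the loss is zero when the student exactly matches the teacher's soft output.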

Original language: English
Pages (from-to): 90-94
Number of pages: 5
Journal: IEEE Communications Letters
Volume: 29
Issue number: 1
DOIs
Publication status: Published - 2025

Bibliographical note

Publisher Copyright:
© 1997-2012 IEEE.

Keywords

  • Multiple users in SemCom
  • joint source-channel coding
  • knowledge distillation
  • transfer learning
