Iterative Semi-Supervised Learning Using Softmax Probability

Heewon Chung, Jinseok Lee

Research output: Contribution to journal › Article › peer-review


For classification problems in practice, one of the challenging issues is obtaining enough labeled data for training. Moreover, even when such labeled data has been sufficiently accumulated, most datasets exhibit a long-tailed distribution with heavy class imbalance, which biases the model towards the majority classes. To alleviate such class imbalance, semi-supervised learning methods that use additional unlabeled data have been considered. However, their accuracy is typically much lower than that of supervised learning. In this study, under the assumption that additional unlabeled data is available, we propose iterative semi-supervised learning algorithms that iteratively correct the labels assigned to the extra unlabeled data based on softmax probabilities. The results show that the proposed algorithms achieve accuracy as high as that of supervised learning. To validate the proposed algorithms, we tested two scenarios: one with a balanced unlabeled dataset and one with an imbalanced unlabeled dataset. Under both scenarios, the proposed semi-supervised learning algorithms provided higher accuracy than previous state-of-the-art methods.
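The general idea described in the abstract — repeatedly assign or correct labels on the unlabeled pool using the model's softmax confidence, then retrain on the enlarged labeled set — can be sketched as follows. This is a minimal illustration of threshold-based iterative pseudo-labeling, not the authors' exact algorithm: the nearest-centroid scorer, the `threshold` parameter, and the toy softmax-over-negative-distances logits are all stand-ins for a trained neural network.

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def iterative_pseudo_label(X_lab, y_lab, X_unlab, n_classes,
                           threshold=0.9, n_iters=10):
    """Iteratively (re)label unlabeled samples whose softmax confidence
    exceeds `threshold`; -1 marks samples left unlabeled. A nearest-centroid
    scorer stands in for the trained classifier (an assumption for
    illustration, not the paper's model)."""
    X, y = X_lab.copy(), y_lab.copy()
    pseudo = np.full(len(X_unlab), -1)
    for _ in range(n_iters):
        # "Train" on the current labeled pool: one centroid per class.
        centroids = np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])
        # Logits = negative squared distance to each centroid.
        d = ((X_unlab[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        probs = softmax(-d)
        conf, pred = probs.max(axis=1), probs.argmax(axis=1)
        # Correct the labeling: confident samples get (possibly new) labels.
        new_pseudo = np.where(conf >= threshold, pred, -1)
        if np.array_equal(new_pseudo, pseudo):
            break  # labels stable; stop iterating
        pseudo = new_pseudo
        mask = pseudo != -1
        X = np.vstack([X_lab, X_unlab[mask]])
        y = np.concatenate([y_lab, pseudo[mask]])
    return pseudo
```

Because labels are re-derived from the softmax output on every pass rather than frozen once assigned, earlier mistakes can be corrected as the labeled pool grows, which is the mechanism the abstract emphasizes.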

Original language: English
Pages (from-to): 5607-5628
Number of pages: 22
Journal: Computers, Materials and Continua
Issue number: 3
Publication status: Published - 2022


  • Semi-supervised learning
  • class imbalance
  • iterative learning
  • unlabeled data


