Academy & Industry Research Collaboration Center (AIRCC)

Volume 13, Number 05, March 2023

Addressing Class Variable Imbalance in Federated Semi-Supervised Learning


Zehui Dong, Wenjing Liu, Siyuan Liu and Xingzhi Chen, Inner Mongolia University of Technology, China


Federated Semi-supervised Learning (FSSL) combines techniques from federated learning and semi-supervised learning to improve the accuracy and performance of models in a distributed environment, using a small fraction of labeled data and a large amount of unlabeled data. Because devices train models locally and only model updates are collected, there is no need to centralize all data in one place, which protects the privacy of user data. However, during federated training, some devices fail to collect enough data for local training, while new devices join the group training. This leads to an imbalanced global data distribution and thus degrades the performance of the global model. Most current research focuses on class imbalance with a fixed number of classes, while little attention is paid to data imbalance with a variable number of classes. Therefore, in this paper, we propose Federated Semi-supervised Learning for Class Variable Imbalance (FCVI) to address class variable imbalance. A class-variable learning algorithm is used to mitigate the data imbalance caused by changes in the number of classes. Our scheme is shown to be significantly better than baseline methods, while maintaining client privacy.
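The abstract does not specify the FCVI algorithm itself, but the underlying problem can be illustrated with a minimal sketch: when a server aggregates per-client label summaries across rounds, a newly joined client holding a previously unseen class changes both the number of classes and the balance of the global distribution. All data values and the helper name below are hypothetical, chosen only to demonstrate the shift.

```python
from collections import Counter

def global_class_distribution(client_label_counts):
    """Aggregate per-client label counts into a global class distribution,
    as a server might after collecting privacy-preserving summaries."""
    total = Counter()
    for counts in client_label_counts:
        total.update(counts)
    return total

# Hypothetical round 1: two clients, both holding only classes 0 and 1.
round1 = [Counter({0: 50, 1: 50}), Counter({0: 40, 1: 60})]

# Hypothetical round 2: a new client joins with a previously unseen
# class 2, so the number of classes changes mid-training -- the
# "class variable imbalance" setting the paper targets.
round2 = round1 + [Counter({2: 30})]

g1 = global_class_distribution(round1)  # classes {0, 1}
g2 = global_class_distribution(round2)  # classes {0, 1, 2}, class 2 is rare
```

In this toy setting class 2 makes up only 30 of 230 global samples in round 2, so a model trained under a fixed-class imbalance scheme would have no mechanism to account for its late arrival.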


Federated semi-supervised learning, Federated learning, Semi-supervised learning, Class variable imbalance