Citation: Zhi-Yuan Wu, Tian-Liu He, Sheng Sun, Yu-Wei Wang, Min Liu, Bo Gao, Xue-Feng Jiang. Federated Class-Incremental Learning with New-Class Augmented Self-Distillation[J]. Journal of Computer Science and Technology. DOI: 10.1007/s11390-025-5186-5

Federated Class-Incremental Learning with New-Class Augmented Self-Distillation

  • Federated Learning (FL) enables collaborative model training among participants without sharing raw data. However, mainstream FL methods overlook the dynamic nature of real-world data, particularly its tendency to grow in volume and diversify in classes over time. This oversight causes FL methods to suffer from catastrophic forgetting, where trained models inadvertently discard previously learned information upon assimilating new data. In response to this challenge, we propose a novel Federated Class-Incremental Learning (FCIL) method, named Federated Class-Incremental Learning with New-Class Augmented Self-DiStillation (FedCLASS). The core of FedCLASS is to enrich the class scores of historical models with the new-class scores predicted by current models and to use the combined knowledge for self-distillation, enabling more thorough and precise knowledge transfer from historical models to current models (a sketch of this score combination follows the abstract). Theoretical analyses demonstrate that FedCLASS stands on reliable foundations: it treats the scores of old classes predicted by historical models as conditional probabilities in the absence of new classes, and the scores of new classes predicted by current models as the probabilities that condition the class scores derived from historical models. Empirical experiments demonstrate the superiority of FedCLASS over four baseline algorithms in reducing the average forgetting rate and boosting global accuracy.
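As a rough illustration of the score-combination idea described in the abstract (not the authors' reference implementation), the sketch below assumes PyTorch-style logits and hypothetical names (old_logits, new_logits, temperature). The historical model's old-class probabilities are read as conditional probabilities given that a sample does not belong to a new class, rescaled by the current model's estimated new-class mass, and concatenated with the current model's new-class probabilities to form the self-distillation target.

```python
import torch
import torch.nn.functional as F

def fedclass_self_distillation_loss(old_logits, new_logits, num_old_classes, temperature=2.0):
    """Hedged sketch of new-class augmented self-distillation.

    old_logits: [B, C_old] scores from the frozen historical model.
    new_logits: [B, C_old + C_new] scores from the current model.
    temperature: hypothetical softening temperature for distillation.
    """
    with torch.no_grad():
        # Current model's softened distribution over all classes.
        cur_probs = F.softmax(new_logits / temperature, dim=1)
        new_class_probs = cur_probs[:, num_old_classes:]        # [B, C_new]
        new_mass = new_class_probs.sum(dim=1, keepdim=True)     # estimated prob. of being a new class

        # Historical model's distribution over old classes, treated as
        # a conditional distribution in the absence of new classes.
        old_probs = F.softmax(old_logits / temperature, dim=1)  # [B, C_old]

        # Augmented teacher: rescale old-class probabilities by (1 - new_mass)
        # and append the current model's new-class probabilities.
        teacher = torch.cat([old_probs * (1.0 - new_mass), new_class_probs], dim=1)

    # Student side: the current model distills toward the augmented teacher.
    student_log_probs = F.log_softmax(new_logits / temperature, dim=1)
    return F.kl_div(student_log_probs, teacher, reduction="batchmean") * (temperature ** 2)
```

In this reading, the combined target is a valid probability distribution over all classes, so the current model receives supervision on old classes from the historical model while retaining its own predictions on new classes.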
