Paper Title

Label driven Knowledge Distillation for Federated Learning with non-IID Data

Authors

Minh-Duong Nguyen, Quoc-Viet Pham, Dinh Thai Hoang, Long Tran-Thanh, Diep N. Nguyen, Won-Joo Hwang

Abstract

In real-world applications, Federated Learning (FL) faces two challenges: (1) scalability, especially when applied to massive IoT networks, and (2) robustness in environments with heterogeneous data. To address the first problem, we design a novel FL framework named Full-stack FL (F2L). Specifically, F2L utilizes a hierarchical network architecture, which makes it possible to extend the FL network without reconstructing the whole network system. Moreover, leveraging the advantages of this hierarchical design, we propose a new label-driven knowledge distillation (LKD) technique at the global server to address the second problem. In contrast to existing knowledge distillation techniques, LKD can train a student model that consolidates the salient knowledge of all teacher models. Our proposed algorithm can therefore effectively distill the knowledge of the regions' data distributions (i.e., the regional aggregated models) to reduce the divergence between clients' models when the FL system operates on non-independent and identically distributed (non-IID) data. Extensive experimental results reveal that (i) our F2L method significantly improves overall FL efficiency across all global distillations, and (ii) F2L converges rapidly through the global distillation stages rather than improving gradually over each communication round.
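The abstract only sketches LKD at a high level: a global server distills several regional teacher models into one student, mixing the teachers' knowledge on a per-label basis. As a rough illustration of what such a label-wise multi-teacher distillation loss can look like, here is a minimal PyTorch sketch. The function name `label_driven_distillation_loss`, the per-label reliability matrix `label_weights`, and the blending scheme are assumptions made for illustration only, not the paper's actual LKD formulation.

```python
import torch
import torch.nn.functional as F

def label_driven_distillation_loss(student_logits, teacher_logits_list,
                                   label_weights, temperature=2.0):
    """Illustrative multi-teacher, label-wise distillation loss (an
    assumption for this sketch, not the paper's exact method).

    student_logits:      [batch, num_classes] logits of the global student.
    teacher_logits_list: list of [batch, num_classes] logits, one tensor
                         per regional teacher model.
    label_weights:       [num_teachers, num_classes] hypothetical per-label
                         reliability scores for each teacher.
    """
    # Normalize the weights across teachers for every class label, so the
    # teachers' soft predictions are mixed label by label.
    w = label_weights / label_weights.sum(dim=0, keepdim=True)  # [T, C]

    # Blend the teachers' temperature-softened predictions per class label.
    soft_targets = torch.zeros_like(student_logits)
    for t, teacher_logits in enumerate(teacher_logits_list):
        probs = F.softmax(teacher_logits / temperature, dim=-1)
        soft_targets = soft_targets + w[t] * probs  # broadcast [C] over batch

    # Renormalize rows so the blended target is a proper distribution.
    soft_targets = soft_targets / soft_targets.sum(dim=-1, keepdim=True)

    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between the blended teacher targets and the student,
    # scaled by T^2 as in standard knowledge distillation.
    return F.kl_div(log_student, soft_targets,
                    reduction="batchmean") * temperature ** 2

if __name__ == "__main__":
    # Toy check: 2 regional teachers, 4 samples, 3 classes (all synthetic).
    student = torch.randn(4, 3)
    teachers = [torch.randn(4, 3), torch.randn(4, 3)]
    weights = torch.rand(2, 3)  # hypothetical per-label reliability scores
    print(label_driven_distillation_loss(student, teachers, weights))
```

The per-label mixing is what distinguishes this sketch from vanilla multi-teacher distillation: a teacher trained mostly on a given region's data can be weighted more heavily only on the labels that region actually covers, which matches the abstract's motivation of extracting each regional model's good knowledge under non-IID data.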
