Paper Title
Addressing Client Drift in Federated Continual Learning with Adaptive Optimization
Paper Authors
Paper Abstract
Federated learning has been extensively studied and is the prevalent method for privacy-preserving distributed learning on edge devices. Correspondingly, continual learning is an emerging field targeted at learning multiple tasks sequentially. However, little attention has been paid to the additional challenges that emerge when federated aggregation is performed in a continual learning system. We identify \textit{client drift} as one of the key weaknesses that arise when vanilla federated averaging is applied in such a system, especially since each client can independently have a different order of tasks. We outline a framework for performing Federated Continual Learning (FCL) using NetTailor as a candidate continual learning approach and show the extent of the client drift problem. We show that adaptive federated optimization can reduce the adverse impact of client drift and demonstrate its effectiveness on the CIFAR100, MiniImagenet, and Decathlon benchmarks. Further, we provide an empirical analysis highlighting the interplay between different hyperparameters, such as client and server learning rates, the number of local training iterations, and the number of communication rounds. Finally, we evaluate our framework on useful characteristics of federated learning systems, such as scalability, robustness to skew in clients' data distributions, and tolerance to stragglers.
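As a rough illustration of the server-side mechanism the abstract refers to, the sketch below shows a FedAdam-style update, one member of the adaptive federated optimization family (Reddi et al., 2021): the averaged client update is treated as a pseudo-gradient and an Adam-style step is applied on the server. This is a minimal sketch under that assumption; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def server_update_fedadam(weights, client_deltas, state,
                          server_lr=1e-2, beta1=0.9, beta2=0.99, tau=1e-3):
    """One hypothetical FedAdam-style server step.

    weights: current global model parameters (1-D array)
    client_deltas: list of per-client updates (new_local - old_global)
    state: dict with first/second moment estimates "m" and "v"
    """
    # Average the per-client model deltas (the vanilla FedAvg direction).
    delta = np.mean(client_deltas, axis=0)
    # Adam-style moment estimates, kept on the server across rounds.
    state["m"] = beta1 * state["m"] + (1 - beta1) * delta
    state["v"] = beta2 * state["v"] + (1 - beta2) * delta**2
    # Adaptive server step; tau controls the degree of adaptivity.
    new_weights = weights + server_lr * state["m"] / (np.sqrt(state["v"]) + tau)
    return new_weights, state

# Usage sketch: initialize moments to zero, then apply once per round.
w = np.zeros(10)
state = {"m": np.zeros_like(w), "v": np.zeros_like(w)}
deltas = [np.random.randn(10) * 0.01 for _ in range(4)]  # fake client updates
w, state = server_update_fedadam(w, deltas, state)
```

Compared with plain federated averaging (which would simply apply the mean delta), the per-coordinate scaling by the second moment dampens directions where client updates disagree, which is one intuition for why adaptive server optimization can mitigate client drift.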