Paper Title


Multi-consensus Decentralized Accelerated Gradient Descent

Paper Authors

Haishan Ye, Luo Luo, Ziang Zhou, Tong Zhang

Paper Abstract


This paper considers the decentralized convex optimization problem, which has a wide range of applications in large-scale machine learning, sensor networks, and control theory. We propose novel algorithms that achieve optimal computation complexity and near-optimal communication complexity. Our theoretical results give an affirmative answer to the open problem of whether there exists an algorithm whose communication complexity (nearly) matches the lower bound depending on the global condition number instead of the local one. Furthermore, the linear convergence of our algorithms only depends on the strong convexity of the global objective and does \emph{not} require the local functions to be convex. The design of our methods relies on a novel integration of well-known techniques including Nesterov's acceleration, multi-consensus, and gradient tracking. Empirical studies show that our methods outperform existing approaches in machine learning applications.
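The abstract names three building blocks: Nesterov's acceleration, multi-consensus (several communication rounds per iteration), and gradient tracking. The snippet below is a minimal Python/NumPy sketch of how these pieces typically fit together on a toy decentralized least-squares problem; it is not the algorithm proposed in the paper, and the network topology, mixing matrix, data, step size, and momentum parameter are all illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm) combining three ingredients:
# Nesterov-style acceleration, gradient tracking, and multi-consensus
# (several gossip rounds per iteration). All problem data, step sizes, and
# the ring-network mixing matrix are illustrative assumptions.
import numpy as np

n_agents, dim = 5, 3
rng = np.random.default_rng(0)

# Local objectives f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed toy data).
A = [rng.normal(size=(10, dim)) for _ in range(n_agents)]
b = [rng.normal(size=10) for _ in range(n_agents)]

def grad(i, x):
    """Gradient of the i-th local objective at x."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring network (assumed topology).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

def multi_consensus(X, rounds=3):
    """Several gossip (averaging) rounds; more rounds give tighter consensus."""
    for _ in range(rounds):
        X = W @ X
    return X

eta, beta = 0.01, 0.9           # step size and momentum (illustrative values)
X = np.zeros((n_agents, dim))   # local iterates, one row per agent
Y = X.copy()                    # extrapolated (momentum) points
S = np.array([grad(i, Y[i]) for i in range(n_agents)])  # gradient trackers

for k in range(200):
    X_new = multi_consensus(Y) - eta * S            # consensus + tracked-gradient step
    Y_new = X_new + beta * (X_new - X)              # Nesterov-style extrapolation
    G_old = np.array([grad(i, Y[i]) for i in range(n_agents)])
    G_new = np.array([grad(i, Y_new[i]) for i in range(n_agents)])
    S = multi_consensus(S) + G_new - G_old          # gradient-tracking update
    X, Y = X_new, Y_new

print("disagreement across agents:", np.linalg.norm(X - X.mean(axis=0)))
```

In this kind of scheme, the gradient trackers S let each agent estimate the gradient of the global average objective using only local gradients and neighbor communication, while the extra consensus rounds control how far local iterates drift apart between updates.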
