Paper Title

Momentum Accelerates Evolutionary Dynamics

Authors

Marc Harper, Joshua Safyan

Abstract

We combine momentum from machine learning with evolutionary dynamics, where momentum can be viewed as a simple mechanism of intergenerational memory. Using information divergences as Lyapunov functions, we show that momentum accelerates the convergence of evolutionary dynamics including the replicator equation and Euclidean gradient descent on populations. When evolutionarily stable states are present, these methods prove convergence for small learning rates or small momentum, and yield an analytic determination of the relative decrease in time to converge that agrees well with computations. The main results apply even when the evolutionary dynamic is not a gradient flow. We also show that momentum can alter the convergence properties of these dynamics, for example by breaking the cycling associated to the rock-paper-scissors landscape, leading to either convergence to the ordinarily non-absorbing equilibrium, or divergence, depending on the value and mechanism of momentum.
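To make the idea concrete, here is a minimal sketch of discrete replicator dynamics with a heavy-ball-style momentum term, where the velocity acts as the "intergenerational memory" the abstract describes. The payoff matrix, step size, momentum coefficient, and the clip-and-renormalize simplex projection are illustrative assumptions, not the paper's exact discretization.

```python
import numpy as np

def replicator_momentum(A, x0, lr=0.01, beta=0.5, steps=2000):
    """Sketch: replicator dynamics with heavy-ball momentum.

    A    : payoff matrix (assumed square, matching len(x0))
    x0   : initial population distribution on the simplex
    lr   : learning rate (step size)
    beta : momentum coefficient; beta=0 recovers the plain dynamic
    """
    x = x0.copy()
    v = np.zeros_like(x0)              # momentum buffer: memory of past updates
    for _ in range(steps):
        f = A @ x                      # fitness of each strategy
        dx = x * (f - x @ f)           # replicator vector field x_i (f_i - f̄)
        v = beta * v + dx              # accumulate velocity (heavy-ball style)
        x = x + lr * v
        x = np.clip(x, 1e-12, None)    # crude projection back to the simplex
        x = x / x.sum()
    return x

# Rock-paper-scissors payoff matrix (zero-sum), mentioned in the abstract
A = np.array([[0., -1.,  1.],
              [ 1., 0., -1.],
              [-1., 1.,  0.]])
x = replicator_momentum(A, np.array([0.6, 0.3, 0.1]))
```

With `beta = 0` this reduces to a forward-Euler step of the ordinary replicator equation; nonzero `beta` illustrates the momentum mechanism whose convergence and cycling behavior the paper analyzes.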
