Paper Title

Improved Convergence Speed of Fully Symmetric Learning Rules for Principal Component Analysis

Author

Möller, Ralf

Abstract

Fully symmetric learning rules for principal component analysis can be derived from a novel objective function suggested in our previous work. We observed that these learning rules suffer from slow convergence for covariance matrices where some principal eigenvalues are close to each other. Here we describe a modified objective function with an additional term which mitigates this convergence problem. We show that the learning rule derived from the modified objective function inherits all fixed points from the original learning rule (but may introduce additional ones). Also the stability of the inherited fixed points remains unchanged. Only the steepness of the objective function is increased in some directions. Simulations confirm that the convergence speed can be noticeably improved, depending on the weight factor of the additional term.
