Paper Title
VectorAdam for Rotation Equivariant Geometry Optimization
Paper Authors
Paper Abstract
The Adam optimization algorithm has proven remarkably effective for optimization problems across machine learning and even traditional tasks in geometry processing. At the same time, the development of equivariant methods, which preserve their output under the action of rotation or some other transformation, has proven to be important for geometry problems across these domains. In this work, we observe that Adam, when treated as a function that maps initial conditions to optimized results, is not rotation equivariant for vector-valued parameters due to per-coordinate moment updates. This leads to significant artifacts and biases in practice. We propose to resolve this deficiency with VectorAdam, a simple modification which makes Adam rotation equivariant by accounting for the vector structure of optimization variables. We demonstrate this approach on problems in machine learning and traditional geometric optimization, showing that equivariant VectorAdam resolves the artifacts and biases of traditional Adam when applied to vector-valued data, with equivalent or even improved rates of convergence.
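To illustrate the core idea in the abstract, here is a minimal sketch of one optimizer step in NumPy. The function name, shapes, and hyperparameter defaults are illustrative assumptions, not the authors' reference implementation: the first moment stays vector-valued (it already rotates with the gradient), while the second moment tracks the squared norm of each parameter vector's gradient, a rotation-invariant scalar, instead of Adam's per-coordinate squared entries.

```python
import numpy as np

def vector_adam_step(x, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One sketched VectorAdam step on an (N, 3) array of vector parameters.

    Assumed interface (hypothetical): x are the parameters, g the gradient,
    m the vector-valued first moment, v the per-vector scalar second moment,
    and t the 1-based step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * g                # vector first moment (equivariant)
    g2 = np.sum(g * g, axis=-1, keepdims=True)     # squared norm per vector (invariant)
    v = beta2 * v + (1 - beta2) * g2               # one scalar second moment per vector
    m_hat = m / (1 - beta1 ** t)                   # standard Adam bias correction
    v_hat = v / (1 - beta2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)    # scale whole vectors, not coordinates
    return x, m, v
```

Because `v_hat` is a single scalar per vector, the update direction is a scaled copy of the (bias-corrected) first moment, so rotating the initial parameters and gradients rotates the result: `step(x @ R.T, g @ R.T)` equals `step(x, g) @ R.T`, which is exactly the equivariance property per-coordinate Adam lacks.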