Paper Title

A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer

Authors

Curtis, Frank E., Dai, Yutong, Robinson, Daniel P.

Abstract

We consider the problem of minimizing an objective function that is the sum of a convex function and a group sparsity-inducing regularizer. Problems that integrate such regularizers arise in modern machine learning applications, often for the purpose of obtaining models that are easier to interpret and that have higher predictive accuracy. We present a new method for solving such problems that utilizes subspace acceleration, domain decomposition, and support identification. Our analysis shows, under common assumptions, that the iterate sequence generated by our framework is globally convergent, converges to an $ε$-approximate solution in at most $O(ε^{-(1+p)})$ (respectively, $O(ε^{-(2+p)})$) iterations for all $ε$ bounded above and large enough (respectively, all $ε$ bounded above), where $p > 0$ is an algorithm parameter, and exhibits superlinear local convergence. Preliminary numerical results for the task of binary classification based on regularized logistic regression show that our approach is efficient and robust, with the ability to outperform a state-of-the-art method.
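For context, the problem class described in the abstract has the form: minimize a smooth convex loss plus a group-$\ell_2$ penalty. The sketch below is a minimal illustration of that problem structure in the regularized logistic regression setting mentioned in the experiments; it is not the authors' subspace-acceleration algorithm. The helper names (`logistic_loss`, `prox_group_l2`), the non-overlapping group layout, and the weight `lam` are all illustrative assumptions. The blockwise soft-thresholding proximal operator shows how the penalty zeroes out entire groups, which is the structure that support identification exploits.

```python
import numpy as np

def logistic_loss(w, X, y):
    """Convex piece f(w): average logistic loss for labels y in {-1, +1}."""
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

def group_l2_penalty(w, groups, lam):
    """Group sparsity-inducing regularizer: lam * sum_g ||w_[g]||_2."""
    return lam * sum(np.linalg.norm(w[g]) for g in groups)

def prox_group_l2(w, groups, step_lam):
    """Proximal operator of the group-l2 penalty: blockwise soft-thresholding.
    Each group is either shrunk toward zero or set exactly to zero."""
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= step_lam else (1.0 - step_lam / norm) * w[g]
    return out

# Tiny usage example on synthetic data (hypothetical sizes and groups).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
y = np.sign(rng.standard_normal(100))
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]  # two non-overlapping groups
w = rng.standard_normal(6)
lam = 0.1
obj = logistic_loss(w, X, y) + group_l2_penalty(w, groups, lam)
w_prox = prox_group_l2(w, groups, step_lam=0.05 * lam)
print(f"objective = {obj:.4f}, "
      f"groups zeroed = {[bool(np.allclose(w_prox[g], 0)) for g in groups]}")
```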
