Paper Title
Doubly iteratively reweighted algorithm for constrained compressed sensing models
Paper Authors
Paper Abstract
We propose a new algorithmic framework for constrained compressed sensing models that admit nonconvex sparsity-inducing regularizers, such as the log-penalty function, as objectives, and nonconvex loss functions, such as the Cauchy loss function and the Tukey biweight loss function, in the constraint. Our framework employs iteratively reweighted $\ell_1$ and $\ell_2$ schemes to construct subproblems that can be efficiently solved by well-developed solvers for basis pursuit denoising, such as SPGL1 [6]. We propose a new termination criterion for the subproblem solvers that allows them to return an infeasible solution, together with a suitably constructed feasible point satisfying a descent condition. The feasible-point construction step is key to establishing the well-definedness of our proposed algorithm, and we also prove that, under suitable assumptions, any accumulation point of the sequence of feasible points is a stationary point of the constrained compressed sensing model. Finally, we numerically compare our algorithm (with subproblems solved by SPGL1 or the alternating direction method of multipliers) against the SCP$_{\rm ls}$ algorithm of [41] on constrained compressed sensing models with the log-penalty function as the objective and the Cauchy loss function in the constraint, for badly scaled measurement matrices. Our computational results show that our approaches return solutions with better recovery errors and are consistently faster.
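For concreteness, commonly used forms of the nonconvex functions named above are given below; the paper's exact parametrizations may differ, and $\varepsilon, \gamma > 0$ are model parameters:

$$
P_{\log}(x) = \sum_{i=1}^{n} \log\!\Big(1 + \frac{|x_i|}{\varepsilon}\Big), \qquad
\ell_{\mathrm{Cauchy}}(r) = \sum_{j=1}^{m} \log\!\Big(1 + \frac{r_j^2}{\gamma^2}\Big),
$$

$$
\ell_{\mathrm{Tukey}}(r) = \sum_{j=1}^{m} \rho(r_j), \qquad
\rho(t) =
\begin{cases}
\dfrac{\gamma^2}{6}\Big[1 - \big(1 - (t/\gamma)^2\big)^3\Big], & |t| \le \gamma,\\[4pt]
\dfrac{\gamma^2}{6}, & |t| > \gamma.
\end{cases}
$$

The sketch below illustrates the doubly reweighted idea in Python under simplifying assumptions, and is not the authors' algorithm: the constrained subproblem is replaced by a penalized surrogate solved with plain ISTA (a stand-in for a weighted basis pursuit denoising solver such as SPGL1), and the paper's termination criterion and feasible-point construction step are omitted. All names and parameter values (`eps`, `gamma`, `lam`, iteration counts) are illustrative assumptions.

```python
# Minimal sketch of a doubly iteratively reweighted loop: IRL1 weights from
# the log-penalty regularizer, IRL2 weights from the Cauchy loss.  The inner
# problem is a *penalized* weighted surrogate solved by plain ISTA, standing
# in for the paper's constrained subproblems solved by SPGL1.
import numpy as np

def soft_threshold(z, t):
    """Entrywise soft-thresholding: prox of t * ||.||_1 (t may be a vector)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def doubly_reweighted(A, b, lam=0.1, eps=1e-2, gamma=1.0,
                      outer_iters=20, inner_iters=200):
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(outer_iters):
        # IRL1 weights: linearizing sum_i log(1 + |x_i|/eps) at the current
        # iterate gives per-coordinate l1 weights 1/(eps + |x_i|).
        w1 = 1.0 / (eps + np.abs(x))
        # IRL2 weights: majorizing the Cauchy loss log(1 + r^2/gamma^2) by a
        # weighted least-squares term gives residual weights 1/(gamma^2 + r^2).
        r = A @ x - b
        w2 = 1.0 / (gamma**2 + r**2)
        # Inner surrogate:
        #   min_x  sum_i w1_i |x_i| + (lam/2) ||sqrt(w2) * (A x - b)||_2^2,
        # solved by ISTA, warm-started at the previous iterate.
        Aw = A * np.sqrt(w2)[:, None]
        bw = b * np.sqrt(w2)
        L = lam * np.linalg.norm(Aw, 2) ** 2  # Lipschitz const. of smooth part
        step = 1.0 / L
        for _ in range(inner_iters):
            grad = lam * (Aw.T @ (Aw @ x - bw))
            x = soft_threshold(x - step * grad, step * w1)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((80, 200))
    x_true = np.zeros(200)
    x_true[rng.choice(200, 8, replace=False)] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(80)
    x_hat = doubly_reweighted(A, b)
    print("relative recovery error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```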