Paper Title


Robust Regularized Low-Rank Matrix Models for Regression and Classification

Authors

Huang, Hsin-Hsiung, Yu, Feng, Fan, Xing, Zhang, Teng

Abstract


While matrix variate regression models have been studied in many existing works, classical statistical and computational methods for the analysis of the regression coefficient estimation are highly affected by high-dimensional and noisy matrix-valued predictors. To address these issues, this paper proposes a framework of matrix variate regression models based on a rank constraint, vector regularization (e.g., sparsity), and a general loss function, with three special cases considered: ordinary matrix regression, robust matrix regression, and matrix logistic regression. We also propose an alternating projected gradient descent algorithm. By analyzing our objective functions on manifolds with bounded curvature, we show that the algorithm is guaranteed to converge and that all accumulation points of the iterates have estimation errors of order $O(1/\sqrt{n})$ asymptotically, substantially attaining the minimax rate. Our theoretical analysis applies to general optimization problems on manifolds with bounded curvature and can be considered an important technical contribution of this work. We validate the proposed method through simulation studies and real image data examples.
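The alternating projected gradient descent idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a squared loss (the "ordinary matrix regression" special case), realizes the rank constraint by truncated-SVD projection, and handles the sparsity regularizer with an elementwise soft-thresholding (proximal) step. All function names and the step-size choice are this sketch's own assumptions.

```python
import numpy as np

def soft_threshold(B, tau):
    # Elementwise soft-thresholding: proximal step for an l1 (sparsity) penalty.
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

def project_rank(B, r):
    # Project onto the set of matrices of rank at most r via truncated SVD.
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def matrix_regression_apgd(Xs, y, r, lam=0.0, step=None, n_iter=500):
    """Alternating projected gradient descent (illustrative sketch) for
        min_B (1/2n) sum_i (y_i - <X_i, B>)^2 + lam * ||B||_1
        s.t.  rank(B) <= r,
    where Xs has shape (n, p, q) and y has shape (n,)."""
    n, p, q = Xs.shape
    B = np.zeros((p, q))
    Xmat = Xs.reshape(n, -1)  # flatten each matrix predictor to a row vector
    if step is None:
        # 1/L, with L an upper bound on the gradient's Lipschitz constant.
        step = n / (np.linalg.norm(Xmat, 2) ** 2)
    for _ in range(n_iter):
        resid = Xmat @ B.ravel() - y                       # <X_i, B> - y_i
        grad = (Xmat.T @ resid).reshape(p, q) / n          # gradient of the loss
        B = soft_threshold(B - step * grad, step * lam)    # gradient + prox step
        B = project_rank(B, r)                             # rank projection
    return B
```

On a noiseless simulated problem with a rank-one coefficient matrix, the iterates recover the true coefficient matrix up to small numerical error, mirroring the kind of simulation study the paper reports.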
