Paper Title

Risk Bounds for Quantile Trend Filtering

Authors

Oscar Hernan Madrid Padilla, Sabyasachi Chatterjee

Abstract

We study quantile trend filtering, a recently proposed method for nonparametric quantile regression, with the goal of generalizing existing risk bounds known for the usual trend filtering estimators, which perform mean regression. We study both the penalized and the constrained versions (of order $r \geq 1$) of univariate quantile trend filtering. Our results show that both the constrained and the penalized versions (of order $r \geq 1$) attain the minimax rate up to log factors when the $(r-1)$th discrete derivative of the true vector of quantiles belongs to the class of bounded variation signals. Moreover, we show that if the true vector of quantiles is a discrete spline with a few polynomial pieces, then both versions attain a near-parametric rate of convergence. Corresponding results for the usual trend filtering estimators are known to hold only when the errors are sub-Gaussian. In contrast, our risk bounds are shown to hold under minimal assumptions on the error variables. In particular, no moment assumptions are needed, and our results hold under heavy-tailed errors. Our proof techniques are general and thus can potentially be used to study other nonparametric quantile regression methods. To illustrate this generality, we also employ our proof techniques to obtain new results for multivariate quantile total variation denoising and high-dimensional quantile linear regression.
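To make the object of study concrete, the penalized quantile trend filtering estimator of order $r = 1$ minimizes the pinball (check) loss plus an $\ell_1$ penalty on first differences. The sketch below is a minimal illustration of that objective with a crude subgradient descent on heavy-tailed (Cauchy) data; it is not the authors' implementation, and the choices of `tau`, `lam`, step size, and iteration count are illustrative assumptions (practical solvers use convex optimization, e.g. linear programming or ADMM).

```python
import numpy as np

def pinball_loss(theta, y, tau):
    """Check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0}), summed over entries."""
    u = y - theta
    return float(np.sum(u * (tau - (u < 0))))

def qtf_objective(theta, y, tau, lam):
    """Order-1 penalized quantile trend filtering objective:
    pinball loss plus lam * ||D theta||_1, with D the first-difference operator."""
    return pinball_loss(theta, y, tau) + lam * float(np.sum(np.abs(np.diff(theta))))

def fit_qtf(y, tau=0.5, lam=2.0, steps=3000, lr=0.5):
    """Crude subgradient descent on the order-1 objective (illustration only)."""
    theta = np.zeros_like(y, dtype=float)
    best, best_obj = theta.copy(), qtf_objective(theta, y, tau, lam)
    for t in range(steps):
        u = y - theta
        g = -(tau - (u < 0).astype(float))   # subgradient of the pinball loss
        s = np.sign(np.diff(theta))          # subgradient of |D theta|
        g[:-1] -= lam * s                    # adjoint D^T s of the difference operator
        g[1:]  += lam * s
        theta = theta - lr / np.sqrt(t + 1.0) * g
        obj = qtf_objective(theta, y, tau, lam)
        if obj < best_obj:                   # keep the best iterate seen so far
            best, best_obj = theta.copy(), obj
    return best, best_obj

# Piecewise-constant true median with Cauchy errors: no moments exist,
# matching the heavy-tailed setting the risk bounds allow.
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 3.0], 50)
y = truth + rng.standard_cauchy(100)
theta_hat, obj = fit_qtf(y, tau=0.5, lam=2.0)
```

Setting `tau = 0.5` estimates the conditional median, which stays meaningful under Cauchy noise even though the mean does not exist; this is the practical motivation for quantile, rather than mean, trend filtering.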
