Paper Title

Nonlinear Sufficient Dimension Reduction with a Stochastic Neural Network

Paper Authors

Siqi Liang, Yan Sun, Faming Liang

Paper Abstract

Sufficient dimension reduction is a powerful tool to extract core information hidden in high-dimensional data and has potentially many important applications in machine learning tasks. However, the existing nonlinear sufficient dimension reduction methods often lack the scalability necessary for dealing with large-scale data. We propose a new type of stochastic neural network under a rigorous probabilistic framework and show that it can be used for sufficient dimension reduction for large-scale data. The proposed stochastic neural network is trained using an adaptive stochastic gradient Markov chain Monte Carlo algorithm, whose convergence is rigorously studied in the paper as well. Through extensive experiments on real-world classification and regression problems, we show that the proposed method compares favorably with the existing state-of-the-art sufficient dimension reduction methods and is computationally more efficient for large-scale data.
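The abstract does not specify the network architecture or the adaptive stochastic gradient MCMC algorithm, so the sketch below is only a minimal illustration of the general idea: a feed-forward network with a noise-injected (stochastic) hidden layer and a low-dimensional bottleneck, trained with plain stochastic gradient Langevin dynamics (SGLD), a basic stochastic gradient MCMC method. The class name `StochasticNet`, the noise scale, the layer sizes, and the step size are all illustrative assumptions, not the authors' method.

```python
# Minimal sketch (not the paper's implementation): a stochastic neural network
# with Gaussian noise injected into a hidden layer, a low-dimensional bottleneck
# playing the role of the sufficient reduction, and SGLD parameter updates.
import torch
import torch.nn as nn


class StochasticNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, reduced_dim, out_dim, noise_std=0.1):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hidden_dim)
        self.reduce = nn.Linear(hidden_dim, reduced_dim)  # low-dimensional layer
        self.head = nn.Linear(reduced_dim, out_dim)
        self.noise_std = noise_std

    def forward(self, x):
        h = torch.tanh(self.encoder(x))
        if self.training:  # noise injection makes the hidden layer stochastic
            h = h + self.noise_std * torch.randn_like(h)
        z = torch.tanh(self.reduce(h))  # candidate low-dimensional representation
        return self.head(z), z


def sgld_step(model, loss, lr=1e-4):
    """One SGLD update: gradient step plus Gaussian noise scaled by sqrt(2*lr)."""
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            p.add_(-lr * p.grad + (2 * lr) ** 0.5 * torch.randn_like(p))


# Toy usage on synthetic regression data where only the first 3 coordinates matter.
torch.manual_seed(0)
X = torch.randn(256, 20)
y = X[:, :3].sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)
model = StochasticNet(in_dim=20, hidden_dim=64, reduced_dim=3, out_dim=1)
for epoch in range(200):
    pred, _ = model(X)
    sgld_step(model, nn.functional.mse_loss(pred, y))
_, Z = model.eval()(X)  # Z: learned low-dimensional representation of X
```

In this toy setup the bottleneck dimension is fixed in advance; the paper's framework additionally provides the probabilistic justification for treating the learned representation as a sufficient reduction, which this sketch does not attempt to reproduce.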
