Paper Title

Multitask kernel-learning parameter prediction method for solving time-dependent linear systems

Authors

Kai Jiang, Juan Zhang, Qi Zhou

Abstract

Matrix splitting iteration methods play a vital role in solving large sparse linear systems. Their performance heavily depends on the splitting parameters; however, approaches for selecting optimal splitting parameters have not been well developed. In this paper, we present a multitask kernel-learning parameter prediction method to automatically obtain relatively optimal splitting parameters, which combines simultaneous prediction of multiple parameters with data-driven kernel learning. For solving time-dependent linear systems, including linear differential systems and linear matrix systems, we give a new matrix splitting Kronecker product method, together with its convergence analysis and preconditioning strategy. Numerical results illustrate that our method can save an enormous amount of time in selecting relatively optimal splitting parameters compared with existing methods. Moreover, our iteration method used as a preconditioner can effectively accelerate GMRES. As the dimension of the systems increases, all the advantages of our approach become more significant. In particular, for solving the differential Sylvester matrix equation, the speedup ratio can reach tens to hundreds of times when the scale of the system is larger than one hundred thousand.
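The abstract's central premise, that the performance of a matrix splitting iteration depends heavily on its splitting parameter, can be illustrated with a classical parameterized splitting (SOR). This is a standalone sketch of that general phenomenon, not the paper's proposed method; the test system and parameter values are chosen for illustration only.

```python
import numpy as np

def sor_solve(A, b, omega, tol=1e-10, max_iter=5000):
    """Solve A x = b with the SOR matrix splitting iteration.

    Splitting: A = M - N with M = D/omega + L, where D is the diagonal
    and L the strictly lower-triangular part of A.  The iteration
    x_{k+1} = M^{-1} (N x_k + b) converges at a rate that depends
    strongly on the splitting parameter omega.
    """
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    M = D / omega + L
    N = M - A
    x = np.zeros_like(b, dtype=float)
    norm_b = np.linalg.norm(b)
    for k in range(1, max_iter + 1):
        x = np.linalg.solve(M, N @ x + b)
        if np.linalg.norm(b - A @ x) < tol * norm_b:
            return x, k          # converged after k sweeps
    return x, max_iter

# 1D Laplacian test system: iteration counts vary sharply with omega.
n = 30
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x1, iters_gs = sor_solve(A, b, omega=1.0)   # omega = 1 is Gauss-Seidel
x2, iters_opt = sor_solve(A, b, omega=1.8)  # close to the optimal omega
```

For this model problem, an omega near the optimum converges in far fewer sweeps than the Gauss-Seidel choice omega = 1; selecting such parameters automatically, rather than by hand-tuning, is what the paper's kernel-learning prediction method targets.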
