Paper Title

Fisher information lower bounds for sampling

Authors

Sinho Chewi, Patrik Gerber, Holden Lee, Chen Lu

Abstract


We prove two lower bounds for the complexity of non-log-concave sampling within the framework of Balasubramanian et al. (2022), who introduced the use of Fisher information (FI) bounds as a notion of approximate first-order stationarity in sampling. Our first lower bound shows that averaged LMC is optimal for the regime of large FI by reducing the problem of finding stationary points in non-convex optimization to sampling. Our second lower bound shows that in the regime of small FI, obtaining an FI of at most $\varepsilon^2$ from the target distribution requires $\text{poly}(1/\varepsilon)$ queries, which is surprising as it rules out the existence of high-accuracy algorithms (e.g., algorithms using Metropolis-Hastings filters) in this context.
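For context, "averaged LMC" refers to running the Langevin Monte Carlo chain $x_{k+1} = x_k - h\,\nabla V(x_k) + \sqrt{2h}\,\xi_k$ (with $\xi_k \sim \mathcal{N}(0, I)$, targeting $\pi \propto e^{-V}$) and outputting a uniformly random iterate, so that the law of the output is the average of the iterate laws. The sketch below is a minimal one-dimensional illustration of that scheme on a standard Gaussian target ($V(x) = x^2/2$); it is not the paper's analysis, and the step size and iteration counts are arbitrary choices for demonstration.

```python
import math
import random


def averaged_lmc(grad_V, x0, step, n_iters, rng):
    """One run of averaged Langevin Monte Carlo (1-D sketch).

    Iterates x_{k+1} = x_k - step * grad_V(x_k) + sqrt(2*step) * xi_k
    with standard Gaussian noise xi_k, then returns a uniformly random
    iterate -- the "averaging" in averaged LMC.
    """
    x = x0
    iterates = [x]
    for _ in range(n_iters):
        x = x - step * grad_V(x) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)
        iterates.append(x)
    return rng.choice(iterates)  # uniform over all iterates


# Target: standard Gaussian, V(x) = x^2 / 2, so grad_V(x) = x.
rng = random.Random(0)
samples = [averaged_lmc(lambda x: x, 0.0, 0.1, 200, rng) for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # should be roughly 0 and roughly 1
```

With a reasonable step size the output distribution is close to the target, but it is exactly this kind of first-order scheme whose query complexity the paper's lower bounds address.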
