Paper Title
Far from Asymptopia
Paper Authors
Paper Abstract
Inference from limited data requires a notion of measure on parameter space, most explicit in the Bayesian framework as a prior. Here we demonstrate that Jeffreys prior, the best-known uninformative choice, introduces enormous bias when applied to typical scientific models. Such models have a relevant effective dimensionality much smaller than the number of microscopic parameters. Because Jeffreys prior treats all microscopic parameters equally, it is far from uniform when projected onto the subspace of relevant parameters, due to variations in the local co-volume of irrelevant directions. We present a principled choice of measure that avoids this issue, leading to unbiased inference in complex models. This optimal prior depends on the quantity of data to be gathered, and approaches Jeffreys prior in the asymptotic limit. However, this limit cannot be justified without an impossibly large amount of data, exponential in the number of microscopic parameters.
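The abstract's central objects can be made concrete. Jeffreys prior is the density proportional to the square root of the determinant of the Fisher information metric, and the "co-volume of irrelevant directions" shrinks wherever parameter combinations become indistinguishable from the data. The sketch below, assuming a toy two-exponential "sloppy" model y(t) = exp(-a t) + exp(-b t) with unit Gaussian noise (an illustrative choice, not a model from the paper), shows the Jeffreys density collapsing along the near-degenerate direction a ≈ b:

```python
import numpy as np

# Illustrative sketch (model and sample times are assumptions, not
# taken from the paper): Jeffreys prior density sqrt(det FIM) for the
# toy model y(t) = exp(-a t) + exp(-b t) with unit Gaussian noise.

t = np.linspace(0.5, 5.0, 10)  # assumed measurement times

def predictions(theta):
    a, b = theta
    return np.exp(-a * t) + np.exp(-b * t)

def fisher(theta, eps=1e-6):
    # For unit-variance Gaussian noise, FIM = J^T J, where J is the
    # Jacobian of the predictions (here via central differences).
    J = np.empty((t.size, 2))
    for i in range(2):
        dp = np.zeros(2)
        dp[i] = eps
        J[:, i] = (predictions(theta + dp) - predictions(theta - dp)) / (2 * eps)
    return J.T @ J

def jeffreys_density(theta):
    # Unnormalized Jeffreys prior: sqrt(det g), g = Fisher metric.
    return np.sqrt(np.linalg.det(fisher(theta)))

# The density is appreciable when the two decay rates are
# distinguishable, but collapses toward zero along a ~ b, where one
# parameter combination becomes irrelevant to the data:
print(jeffreys_density(np.array([0.5, 3.0])))    # well-separated rates
print(jeffreys_density(np.array([1.0, 1.001])))  # nearly degenerate
```

This vanishing determinant along irrelevant directions is exactly the nonuniform "co-volume" factor the abstract describes: after projecting onto the relevant (identifiable) parameter combinations, Jeffreys prior weights different regions very unequally.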