Paper Title
Learning Probabilistic Models from Generator Latent Spaces with Hat EBM
Paper Authors
Paper Abstract
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM). Our formulation posits that observed images are the sum of unobserved latent variables passed through the generator network and a residual random variable that spans the gap between the generator output and the image manifold. One can then define an EBM that includes the generator as part of its forward pass, which we call the Hat EBM. The model can be trained without inferring the latent variables of the observed data or calculating the generator Jacobian determinant. This enables explicit probabilistic modeling of the output distribution of any type of generator network. Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators. Code and pretrained models to reproduce our results are available at https://github.com/point0bar1/hat-ebm.
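The formulation in the abstract — an observed image modeled as a generator output plus a residual variable, with the generator inside the energy function's forward pass — can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the linear "generator" and "energy head" and all variable names here are hypothetical stand-ins for the deep networks used in the actual work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for deep networks (hypothetical, for illustration only).
W_g = rng.normal(size=(4, 2))   # "generator" weights: latent dim 2 -> image dim 4
W_f = rng.normal(size=(4,))     # "energy head" weights over image space

def generator(z):
    """Map a latent z to image space (toy linear generator)."""
    return np.tanh(W_g @ z)

def hat_energy(z, residual):
    """Hat EBM forward pass: the generator is part of the energy.

    The observed image is modeled as x = G(z) + residual, and the
    energy is evaluated on that sum. Because (z, residual) are the
    model's variables, training needs neither latent inference for
    observed data nor the generator's Jacobian determinant.
    """
    x = generator(z) + residual
    return float(W_f @ x)

z = rng.normal(size=2)          # unobserved latent variable
eps = 0.1 * rng.normal(size=4)  # residual spanning the gap to the image manifold
print(hat_energy(z, eps))
```

Because the energy is defined on the pair (latent, residual), any generator — including a non-probabilistic one — can be dropped into `generator` and still yield an explicit probabilistic model of its output distribution, which is the point the abstract makes.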