Paper Title


PREF: Phasorial Embedding Fields for Compact Neural Representations

Authors

Binbin Huang, Xinhao Yan, Anpei Chen, Shenghua Gao, Jingyi Yu

Abstract


We present an efficient frequency-based neural representation termed PREF: a shallow MLP augmented with a phasor volume that covers a significantly broader spectrum than previous Fourier feature mapping or positional encoding. At its core is our compact 3D phasor volume, in which frequencies are distributed uniformly along a 2D plane and dilated along a 1D axis. To this end, we develop a tailored and efficient Fourier transform that combines the Fast Fourier Transform with local interpolation to accelerate naïve Fourier mapping. We also introduce a Parseval regularizer that stabilizes frequency-based learning. In these ways, our PREF reduces the costly MLP in frequency-based representations, thereby significantly closing the efficiency gap between them and other hybrid representations while improving interpretability. Comprehensive experiments, including 2D image generalization, 3D signed distance function regression, and 5D neural radiance field reconstruction, demonstrate that PREF captures high-frequency details while remaining compact and robust.
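The two ideas the abstract names, amortizing the Fourier mapping with one inverse FFT plus local interpolation, and a Parseval-style frequency penalty, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the nearest-neighbor lookup standing in for the paper's local interpolation, and the specific frequency weighting in the penalty are all illustrative.

```python
import numpy as np

def decode_phasor_volume(phasor, query):
    """Sketch: evaluate a complex phasor (Fourier-coefficient) volume at
    continuous query points in [0, 1]^3."""
    # One inverse FFT converts all coefficients into a dense spatial grid,
    # amortizing the Fourier mapping over every query point at once.
    spatial = np.fft.ifftn(phasor).real
    hi = np.array(spatial.shape) - 1
    # Local interpolation (nearest-neighbor here, for brevity) replaces
    # evaluating a full Fourier series independently at each point.
    idx = np.clip(np.round(query * hi).astype(int), 0, hi)
    return spatial[idx[:, 0], idx[:, 1], idx[:, 2]]

def parseval_penalty(phasor, freq_magnitude):
    # Frequency-weighted L2 on the coefficients; by Parseval's theorem this
    # corresponds to a spatial smoothness penalty, one plausible reading of
    # how such a regularizer stabilizes frequency-based learning.
    return np.sum((np.abs(phasor) * freq_magnitude) ** 2)
```

Because the inverse FFT runs once per volume rather than once per sample, the per-query cost drops to an interpolation lookup, which is the efficiency gap the abstract refers to.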
