Paper Title
Neural Estimation of the Rate-Distortion Function With Applications to Operational Source Coding
Paper Authors
Paper Abstract
A fundamental question in designing lossy data compression schemes is how well one can do in comparison with the rate-distortion function, which describes the known theoretical limits of lossy compression. Motivated by the empirical success of deep neural network (DNN) compressors on large, real-world data, we investigate methods to estimate the rate-distortion function on such data, which would allow DNN compressors to be compared against optimality. While one could use the empirical distribution of the data and apply the Blahut-Arimoto algorithm, this approach presents several computational challenges and inaccuracies when the datasets are large and high-dimensional, as is the case for modern image datasets. Instead, we reformulate the rate-distortion objective and solve the resulting functional optimization problem using neural networks. We apply the resulting rate-distortion estimator, called NERD, to popular image datasets, and provide evidence that NERD can accurately estimate the rate-distortion function. Using our estimate, we show that the rate-distortion achievable by DNN compressors is within several bits of the rate-distortion function for real-world datasets. Additionally, NERD provides access to the rate-distortion-achieving channel, as well as samples from its output marginal. Therefore, using recent results in reverse channel coding, we describe how NERD can be used to construct an operational one-shot lossy compression scheme with guarantees on the achievable rate and distortion. Experimental results demonstrate performance competitive with DNN compressors.
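
For context, the rate-distortion function referred to throughout is the minimum mutual information over all channels meeting a distortion budget. A classical variational form, of the kind the abstract's "functional optimization" alludes to (the paper's exact objective is not reproduced here), trades the channel P_{Y|X} for an output distribution Q_Y:

\[
R(D) \;=\; \min_{P_{Y|X}\,:\;\mathbb{E}[d(X,Y)] \le D} I(X;Y)
\;=\; \max_{\beta \ge 0}\; \min_{Q_Y}\;\Big\{ \mathbb{E}_{X \sim P_X}\!\big[-\log \mathbb{E}_{Y \sim Q_Y}\big[e^{-\beta d(X,Y)}\big]\big] \;-\; \beta D \Big\}.
\]

The inner minimization over Q_Y is what a neural network (e.g., a generator whose samples approximate Q_Y) can parameterize, while sweeping the multiplier \beta traces out the rate-distortion curve.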
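
As a point of comparison for the Blahut-Arimoto baseline mentioned in the abstract, below is a minimal NumPy sketch of the classical iteration on a finite (e.g., empirical) distribution. The function name, interface, and stopping rule are illustrative and not taken from the paper.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iters=200, tol=1e-9):
    """Classical Blahut-Arimoto iteration for R(D) on finite alphabets.

    p_x  : (nx,) source probabilities (e.g., an empirical distribution)
    dist : (nx, ny) distortion matrix d(x, y)
    beta : Lagrange multiplier trading rate against distortion
    Returns (rate_in_bits, expected_distortion) for this beta.
    """
    nx, ny = dist.shape
    q_y = np.full(ny, 1.0 / ny)                 # output marginal, uniform init
    kernel = np.exp(-beta * dist)               # unnormalized channel kernel
    for _ in range(n_iters):
        # update conditional: p(y|x) proportional to q(y) * exp(-beta d(x,y))
        p_y_given_x = q_y[None, :] * kernel
        p_y_given_x /= p_y_given_x.sum(axis=1, keepdims=True)
        # update output marginal: q(y) = sum_x p(x) p(y|x)
        q_new = p_x @ p_y_given_x
        if np.max(np.abs(q_new - q_y)) < tol:
            q_y = q_new
            break
        q_y = q_new
    # rate I(X;Y) in bits and expected distortion under the final channel
    ratio = np.divide(p_y_given_x, q_y[None, :],
                      out=np.ones_like(p_y_given_x), where=p_y_given_x > 0)
    rate = np.sum(p_x[:, None] * p_y_given_x * np.log2(ratio))
    distortion = np.sum(p_x[:, None] * p_y_given_x * dist)
    return rate, distortion
```

For an empirical distribution over N high-dimensional images, the (nx, ny) arrays above grow quadratically with the support size (and a reproduction alphabet must be chosen explicitly), which is the kind of computational difficulty the abstract attributes to this approach.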
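
Finally, here is an illustrative PyTorch sketch of how the variational objective above can be estimated from samples when Q_Y is parameterized by a generator network, which is the general strategy the abstract describes. The architecture, distortion measure, batch sizes, and training loop are assumptions for illustration only, not the authors' NERD implementation.

```python
import math
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Pushes latent noise forward to reconstruction space; architecture is assumed."""
    def __init__(self, latent_dim=64, data_dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, data_dim), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

def variational_rate_objective(x, y, beta):
    """Monte-Carlo estimate of E_x[-log E_y[exp(-beta d(x,y))]] with squared-error d.

    x : (b, d) batch of data samples
    y : (m, d) generator samples serving as a proxy for the output marginal Q_Y
    """
    d = torch.cdist(x, y).pow(2) / x.shape[1]           # (b, m) per-dimension MSE
    loss = -(torch.logsumexp(-beta * d, dim=1) - math.log(y.shape[0])).mean()
    with torch.no_grad():
        w = torch.softmax(-beta * d, dim=1)             # induced channel P(y_j | x)
        distortion = (w * d).sum(dim=1).mean()
    return loss, distortion

latent_dim, data_dim, beta = 64, 784, 5.0
gen = Generator(latent_dim, data_dim)
opt = torch.optim.Adam(gen.parameters(), lr=1e-4)

for step in range(1000):                                # toy loop on random stand-in data
    x = torch.rand(128, data_dim)                       # replace with real image batches
    y = gen(torch.randn(512, latent_dim))
    loss, distortion = variational_rate_objective(x, y, beta)
    opt.zero_grad()
    loss.backward()
    opt.step()
# For a trained generator, (loss - beta * distortion) in nats is a point estimate
# associated with the rate-distortion curve at distortion level `distortion`.
```

The induced channel weights computed inside the objective also indicate why such an estimator gives access to the rate-distortion-achieving channel and to samples from its output marginal, which the abstract then connects to reverse channel coding for an operational one-shot scheme.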