Article; Proceedings Paper

Compressive Neural Representations of Volumetric Scalar Fields

Journal

COMPUTER GRAPHICS FORUM
Volume 40, Issue 3, Pages 135-146

Publisher

WILEY
DOI: 10.1111/cgf.14295

Keywords

-

Funding

  1. National Science Foundation (NSF) [IIS-2007444, IIS-2006710]
  2. U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research [DE-SC-0019039]

Abstract

The paper presents a method for compressing volumetric scalar fields using implicit neural representations, achieving highly compact representations by quantizing network weights and outperforming state-of-the-art compression methods. The conceptual simplicity of the approach enables benefits such as support for time-varying scalar fields and random-access field evaluation.
We present an approach for compressing volumetric scalar fields using implicit neural representations. Our approach represents a scalar field as a learned function, wherein a neural network maps a point in the domain to an output scalar value. By setting the number of weights of the neural network to be smaller than the input size, we achieve compressed representations of scalar fields, thus framing compression as a type of function approximation. Combined with carefully quantizing network weights, we show that this approach yields highly compact representations that outperform state-of-the-art volume compression approaches. The conceptual simplicity of our approach enables a number of benefits, such as support for time-varying scalar fields, optimizing to preserve spatial gradients, and random-access field evaluation. We study the impact of network design choices on compression performance, highlighting how simple network architectures are effective for a broad range of volumes.
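
The core idea described in the abstract, a network that maps a point in the domain to a scalar value and is fit so that it reproduces the volume, can be sketched in a few lines. The snippet below is a hypothetical PyTorch illustration, not the authors' implementation: the plain ReLU MLP, its size, and the training loop are assumptions made for clarity (the paper itself studies how network design choices affect compression).

```python
# Minimal sketch of an implicit neural representation for a scalar field.
# Hypothetical example: the network size, ReLU activation, and training loop
# are illustrative assumptions, not the architecture evaluated in the paper.
import torch
import torch.nn as nn


class CoordinateMLP(nn.Module):
    """Maps a normalized 3D coordinate to a scalar value."""

    def __init__(self, hidden: int = 64, layers: int = 4):
        super().__init__()
        blocks = [nn.Linear(3, hidden), nn.ReLU()]
        for _ in range(layers - 1):
            blocks += [nn.Linear(hidden, hidden), nn.ReLU()]
        blocks.append(nn.Linear(hidden, 1))
        self.net = nn.Sequential(*blocks)

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        # xyz: (N, 3) coordinates in [-1, 1]^3 -> (N, 1) predicted scalars.
        return self.net(xyz)


def fit(volume: torch.Tensor, steps: int = 2000, batch: int = 4096) -> CoordinateMLP:
    """Fit the MLP to a (D, H, W) float tensor by sampling random voxels."""
    model = CoordinateMLP()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    dims = torch.tensor(volume.shape, dtype=torch.float32)
    for _ in range(steps):
        # Random voxel indices, mapped to normalized coordinates in [-1, 1].
        idx = torch.stack([torch.randint(0, int(s), (batch,)) for s in volume.shape], dim=1)
        coords = idx.float() / (dims - 1) * 2.0 - 1.0
        target = volume[idx[:, 0], idx[:, 1], idx[:, 2]].unsqueeze(1)
        loss = ((model(coords) - target) ** 2).mean()  # reconstruction MSE
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

In this sketch, compression comes from the network's parameter count being far smaller than the voxel count (and it can be shrunk further by quantizing the weights), while decoding any point is a single forward pass, which is what makes random-access field evaluation possible.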

