Paper Title

Latent Discretization for Continuous-time Sequence Compression

Paper Authors

Chen, Ricky T. Q., Le, Matthew, Muckley, Matthew, Nickel, Maximilian, Ullrich, Karen

Paper Abstract

Neural compression offers a domain-agnostic approach to creating codecs for lossy or lossless compression via deep generative models. For sequence compression, however, most deep sequence models have costs that scale with the sequence length rather than the sequence complexity. In this work, we instead treat data sequences as observations from an underlying continuous-time process and learn how to efficiently discretize while retaining information about the full sequence. As a consequence of decoupling sequential information from its temporal discretization, our approach allows for greater compression rates and smaller computational complexity. Moreover, the continuous-time approach naturally allows us to decode at different time intervals. We empirically verify our approach on multiple domains involving compression of video and motion capture sequences, showing that our approaches can automatically achieve reductions in bit rates by learning how to discretize.
