Paper Title

Sliced Wasserstein Variational Inference

Paper Authors

Mingxuan Yi, Song Liu

Paper Abstract

Variational inference approximates an unnormalized distribution by minimizing the Kullback-Leibler (KL) divergence. Although this divergence is efficient to compute and has been widely used in applications, it has some undesirable properties. For example, it is not a proper metric: it is non-symmetric and does not satisfy the triangle inequality. On the other hand, optimal transport distances have recently shown some advantages over the KL divergence. Motivated by these advantages, we propose a new variational inference method that minimizes the sliced Wasserstein distance, a valid metric arising from optimal transport. The sliced Wasserstein distance can be approximated simply by running MCMC, without solving any optimization problem. Our approximation also does not require a tractable density function for the variational distribution, so approximating families can be amortized by generators such as neural networks. Furthermore, we provide an analysis of the theoretical properties of our method. Experiments on synthetic and real data illustrate the performance of the proposed method.
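
To make the sliced Wasserstein distance mentioned in the abstract concrete, below is a minimal sketch (not the authors' implementation) of the standard Monte Carlo estimator between two empirical distributions: draw random projection directions, project both sample sets to 1D, and use the sorted (monotone) matching, which is the optimal 1D coupling. It assumes NumPy, equal sample sizes, and the function name `sliced_wasserstein_distance` is purely illustrative.

```python
import numpy as np

def sliced_wasserstein_distance(x, y, n_projections=100, p=2, seed=None):
    """Monte Carlo estimate of the sliced Wasserstein-p distance between
    empirical distributions given by samples x, y of shape (n, d).
    Illustrative sketch; assumes x and y have the same number of samples."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    # Draw random directions uniformly on the unit sphere.
    theta = rng.standard_normal((n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both sample sets onto each direction.
    x_proj = x @ theta.T  # shape (n, n_projections)
    y_proj = y @ theta.T  # shape (n, n_projections)
    # In 1D the optimal coupling matches sorted samples in order.
    x_sorted = np.sort(x_proj, axis=0)
    y_sorted = np.sort(y_proj, axis=0)
    # Average the p-th power cost over samples and projections.
    sw_p = np.mean(np.abs(x_sorted - y_sorted) ** p)
    return sw_p ** (1.0 / p)
```

In a variational-inference setting along the lines of the abstract, `x` would be samples drawn from an amortized generator (e.g., a neural network pushing forward noise) and `y` would be MCMC samples from the unnormalized target; the generator is then trained by descending this sample-based distance.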
