Paper Title
Gradient-Free Kernel Stein Discrepancy
Paper Authors
Paper Abstract
Stein discrepancies have emerged as a powerful statistical tool, being applied to fundamental statistical problems including parameter inference, goodness-of-fit testing, and sampling. The canonical Stein discrepancies require the derivatives of a statistical model to be computed, and in return provide theoretical guarantees of convergence detection and control. However, for complex statistical models, the stable numerical computation of derivatives can require bespoke algorithmic development and render Stein discrepancies impractical. This paper focuses on posterior approximation using Stein discrepancies, and introduces a collection of non-canonical Stein discrepancies that are gradient free, meaning that derivatives of the statistical model are not required. Sufficient conditions for convergence detection and control are established, and applications to sampling and variational inference are presented.
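For context, the canonical construction and the gradient-free idea described in the abstract can be contrasted as follows. This is a sketch based on the standard Stein discrepancy literature, not quoted from the paper; the auxiliary density $q$ and the operator notation below are illustrative assumptions.

The canonical Langevin Stein operator requires the score $\nabla \log p$ of the model $p$:
$$(\mathcal{A}_p f)(x) = \nabla \log p(x) \cdot f(x) + \nabla \cdot f(x), \qquad \mathbb{E}_{X \sim p}\big[(\mathcal{A}_p f)(X)\big] = 0,$$
and the associated kernel Stein discrepancy of an approximation $\mu$ is
$$\mathrm{KSD}(\mu) = \sup_{\|f\|_{\mathcal{H}(k)^d} \le 1} \Big| \mathbb{E}_{X \sim \mu}\big[(\mathcal{A}_p f)(X)\big] \Big|.$$

A gradient-free operator can instead be built from an auxiliary density $q$ whose score is tractable:
$$(\mathcal{S}_{p,q} f)(x) = \frac{q(x)}{p(x)} \Big[ \nabla \log q(x) \cdot f(x) + \nabla \cdot f(x) \Big],$$
which satisfies $\mathbb{E}_{X \sim p}[(\mathcal{S}_{p,q} f)(X)] = \mathbb{E}_{X \sim q}[(\mathcal{A}_q f)(X)] = 0$ under mild boundary conditions, so only the density ratio $q/p$ needs to be evaluated and no derivatives of $p$ are required.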