Paper Title

A Statistical Framework to Investigate the Optimality of Signal-Reconstruction Methods

Paper Authors

Pakshal Bohra, Pol del Aguila Pla, Jean-François Giovannelli, Michael Unser

Paper Abstract

We present a statistical framework to benchmark the performance of reconstruction algorithms for linear inverse problems, in particular, neural-network-based methods that require large quantities of training data. We generate synthetic signals as realizations of sparse stochastic processes, which makes them ideally matched to variational sparsity-promoting techniques. We derive Gibbs sampling schemes to compute the minimum mean-square error estimators for processes with Laplace, Student's t, and Bernoulli-Laplace innovations. These allow our framework to provide quantitative measures of the degree of optimality (in the mean-square-error sense) for any given reconstruction method. We showcase our framework by benchmarking the performance of some well-known variational methods and convolutional neural network architectures that perform direct nonlinear reconstructions in the context of deconvolution and Fourier sampling. Our experimental results support the understanding that, while these neural networks outperform the variational methods and achieve near-optimal results in many settings, their performance deteriorates severely for signals associated with heavy-tailed distributions.
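
The abstract points to two ingredients that are straightforward to prototype: drawing synthetic signals as realizations of sparse stochastic processes with Laplace, Student's t, or Bernoulli-Laplace innovations, and reporting how far a reconstruction method's mean-square error sits above the MMSE baseline. The sketch below is only an illustration under simplifying assumptions (a discrete Lévy-process signal model in NumPy; the names sample_levy_process and optimality_gap_db and their parameters are hypothetical, and stand-in reconstructions replace the paper's Gibbs-sampled MMSE estimators); it is not the authors' implementation.

```python
import numpy as np


def sample_levy_process(n, innovation="laplace", scale=1.0, df=2.0, mass=0.7, rng=None):
    """Draw one realization of a discrete Levy process x[k] = sum_{i<=k} u[i].

    The i.i.d. increments u follow one of the three innovation families named in
    the abstract: Laplace, Student's t, or Bernoulli-Laplace (an increment is
    exactly zero with probability `mass`, otherwise Laplace-distributed).
    """
    rng = np.random.default_rng() if rng is None else rng
    if innovation == "laplace":
        u = rng.laplace(0.0, scale, size=n)
    elif innovation == "student":
        u = scale * rng.standard_t(df, size=n)   # heavy-tailed for small df
    elif innovation == "bernoulli-laplace":
        u = rng.laplace(0.0, scale, size=n)
        u[rng.random(n) < mass] = 0.0            # sparsify: most increments are zero
    else:
        raise ValueError(f"unknown innovation model: {innovation}")
    return np.cumsum(u)                           # integrate innovations into the signal


def optimality_gap_db(mse_method, mse_mmse):
    """Degree of suboptimality of a reconstruction method, in dB.

    0 dB means the method matches the MMSE estimator's error; larger values
    mean a wider gap to the statistically optimal reconstruction.
    """
    return 10.0 * np.log10(mse_method / mse_mmse)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = sample_levy_process(256, innovation="student", df=1.2, rng=rng)
    # Stand-in reconstructions (placeholders for, e.g., a CNN output and a
    # Gibbs-sampled MMSE estimate of the same signal).
    x_hat_method = x + 0.20 * rng.standard_normal(x.size)
    x_hat_mmse = x + 0.10 * rng.standard_normal(x.size)
    gap = optimality_gap_db(np.mean((x_hat_method - x) ** 2),
                            np.mean((x_hat_mmse - x) ** 2))
    print(f"optimality gap: {gap:.2f} dB")
```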
