Paper Title

Approximating smooth and sparse functions by deep neural networks without saturation

Authors

Liu, Xia

Abstract

Constructing neural networks for function approximation is a classical and longstanding topic in approximation theory. In this paper, we aim to construct deep neural networks (deep nets for short) with three hidden layers to approximate smooth and sparse functions. In particular, we prove that the constructed deep nets can reach the optimal approximation rates in approximating both smooth and sparse functions, with controllable magnitudes of the free parameters. Since saturation, which describes the bottleneck of approximation, is an insurmountable problem for constructive neural networks, we also prove that deepening the network by only one more hidden layer can avoid saturation. The obtained results underline the advantages of deep nets and provide theoretical explanations for deep learning.
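The architecture at the center of the abstract is a feedforward network with exactly three hidden layers. As a rough illustration of that depth only (not the paper's method, which constructs the weights analytically with controlled magnitudes rather than training them), the following hedged PyTorch sketch instantiates a three-hidden-layer net and fits an arbitrary smooth target by gradient descent; the layer widths, activation, target function, and training setup are all assumptions made for this example.

```python
# Illustrative sketch only: the paper builds a three-hidden-layer network
# constructively; here we merely instantiate that architecture and fit a
# smooth target numerically. All hyperparameters below are arbitrary.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Three hidden layers, matching the depth studied in the paper.
net = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

# A smooth target on [-1, 1], standing in for the smooth functions
# considered in the paper.
x = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
y = torch.sin(3.0 * x) * torch.exp(-x ** 2)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

print(f"final mean-squared error: {loss.item():.2e}")
```

Note that gradient-descent training says nothing about the optimal approximation rates proved in the paper; those follow from the explicit construction of the weights, which this sketch does not reproduce.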
