Title
Approximation of Smoothness Classes by Deep Rectifier Networks
Authors
Abstract
We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^α_{q}(L^p)$ in arbitrary dimension $d$, on general domains. We show that \alert{deep rectifier} networks with a fixed activation function attain optimal or near-optimal approximation rates for functions in the Besov space $B^α_τ(L^τ)$ on the critical embedding line $1/τ=α/d+1/p$, for \emph{arbitrary} smoothness order $α>0$. Using interpolation theory, this implies that the entire range of smoothness classes at or above the critical line is (near-)optimally approximated by deep ReLU/RePU networks.
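As an illustrative reading of the critical embedding line $1/τ=α/d+1/p$ (the specific values of $d$, $p$, and $α$ below are chosen for illustration and do not appear in the abstract):

```latex
% Critical embedding line: 1/\tau = \alpha/d + 1/p.
% Illustrative values (not from the paper): d = 2, p = 2, \alpha = 1.
\[
  \frac{1}{\tau} \;=\; \frac{\alpha}{d} + \frac{1}{p}
                 \;=\; \frac{1}{2} + \frac{1}{2} \;=\; 1
  \qquad\Longrightarrow\qquad \tau = 1 .
\]
% So, under these illustrative values, B^{1}_{1}(L^{1}) lies on the
% critical line for approximation in L^{2} when d = 2.
```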