Paper Title

Computing Anti-Derivatives using Deep Neural Networks

Paper Authors

Chakraborty, D., Gopalakrishnan, S.

Paper Abstract

This paper presents a novel algorithm to obtain the closed-form anti-derivative of a function using a deep neural network architecture. In the past, mathematicians have developed several numerical techniques to approximate the values of definite integrals, but primitives or indefinite integrals are often non-elementary. Anti-derivatives are required when an integrand contains several parameters and the resulting integral is a function of those parameters; no general theoretical method exists to obtain them for an arbitrary function. Existing workarounds are based primarily on either curve fitting or an infinite-series approximation of the integrand, which is then integrated analytically. Curve-fitting approximations are inaccurate for highly non-linear functions and require a different approach for every problem, while the infinite-series approach does not give a closed-form solution and its truncated forms are often inaccurate. We claim that, using a single method for all integrals, our algorithm can approximate anti-derivatives to any required accuracy. We have used this algorithm to obtain the anti-derivatives of several functions, including non-elementary and oscillatory integrals. This paper also shows applications of our method to obtaining closed-form expressions of elliptic integrals, Fermi-Dirac integrals, and cumulative distribution functions, and to reducing the computation time of the Galerkin method for differential equations.
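
The paper's own architecture and training details are not reproduced on this page, but the core idea the abstract describes, training a network F(x) whose automatic-differentiation derivative is fit to the integrand f(x) so that the trained network itself serves as a closed-form surrogate for the anti-derivative, can be sketched as below. This is a minimal illustration in PyTorch under assumed choices (network width, the sampling interval [-2, 2], the example integrand exp(-x^2) with its non-elementary primitive, and the normalization F(0) = 0); it is not the authors' implementation.

```python
import torch

# Minimal sketch (assumed setup, not the paper's code): train F_theta so that
# dF_theta/dx matches the integrand f; the trained network then acts as a
# closed-form surrogate for the anti-derivative of f.

def f(x):
    # Example integrand: exp(-x^2). Its primitive (the error function, up to
    # scaling) is non-elementary, the case the paper targets.
    return torch.exp(-x ** 2)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    # Sample collocation points on an assumed interval [-2, 2].
    x = torch.empty(256, 1).uniform_(-2.0, 2.0).requires_grad_(True)
    F = net(x)
    # dF/dx via autograd; create_graph=True keeps the loss differentiable
    # so we can backpropagate through the derivative.
    dFdx = torch.autograd.grad(F, x, grad_outputs=torch.ones_like(F),
                               create_graph=True)[0]
    # Fit the derivative to the integrand, and pin F(0) = 0 to fix the
    # constant of integration.
    loss = ((dFdx - f(x)) ** 2).mean() + net(torch.zeros(1, 1)).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, net(x) approximates the anti-derivative of f with F(0) = 0,
# i.e. roughly (sqrt(pi)/2) * erf(x) in this example.
```

Because the trained surrogate is just a composition of affine maps and tanh activations, it is an explicit expression that can be evaluated cheaply, differentiated exactly, or embedded in downstream computations, which is consistent with the Galerkin-method application mentioned in the abstract.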
