Paper Title

Convolutional neural network and long short-term memory based reduced order surrogate for minimal turbulent channel flow

Paper Authors

Taichi Nakamura, Kai Fukami, Kazuto Hasegawa, Yusuke Nabae, Koji Fukagata

Paper Abstract

We investigate the applicability of a machine-learning-based reduced-order model (ML-ROM) to three-dimensional complex flows. As an example, we consider a turbulent channel flow at the friction Reynolds number of $Re_\tau = 110$ in a minimal domain which can maintain coherent structures of turbulence. The training data set is prepared by direct numerical simulation (DNS). The present ML-ROM is constructed by combining a three-dimensional convolutional neural network autoencoder (CNN-AE) and a long short-term memory (LSTM) network. The CNN-AE works to map high-dimensional flow fields into a low-dimensional latent space. The LSTM is then utilized to predict the temporal evolution of the latent vectors obtained by the CNN-AE. The combination of the CNN-AE and the LSTM can represent the spatio-temporal high-dimensional dynamics of flow fields by integrating only the temporal evolution of the low-dimensional latent dynamics. The turbulent flow fields reproduced by the present ML-ROM show statistical agreement with the reference DNS data in a time-ensemble sense, which can also be found through an orbit-based analysis. The influences of the population of vortical structures contained in the domain and of the time interval used for temporal prediction on the ML-ROM performance are also investigated. The potential and limitations of the present ML-ROM for turbulence analysis are discussed at the end of our presentation.
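
To make the CNN-AE + LSTM pipeline described in the abstract concrete, the sketch below shows one possible implementation in PyTorch. It is only illustrative: the layer counts, channel widths, latent dimension, and the toy 32^3 grid are hypothetical placeholders, not the settings reported in the paper, and the real model is trained on 3D velocity fields from the DNS rather than random tensors.

```python
# Minimal sketch of a 3D CNN autoencoder (flow field -> latent vector -> flow field)
# and an LSTM that advances the latent vector in time. All sizes are placeholders.
import torch
import torch.nn as nn


class CNNAutoencoder3D(nn.Module):
    """Maps a 3D velocity field (3 components) to a low-dimensional latent vector and back."""

    def __init__(self, latent_dim: int = 64, grid: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, stride=2, padding=1),   # (3, g, g, g) -> (16, g/2, ...)
            nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1),  # -> (32, g/4, ...)
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * (grid // 4) ** 3, latent_dim),           # latent vector
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * (grid // 4) ** 3),
            nn.Unflatten(1, (32, grid // 4, grid // 4, grid // 4)),
            nn.ConvTranspose3d(32, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose3d(16, 3, kernel_size=4, stride=2, padding=1),  # back to (3, g, g, g)
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z


class LatentLSTM(nn.Module):
    """Predicts the next latent vector from a short history of latent vectors."""

    def __init__(self, latent_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, z_seq):                  # z_seq: (batch, time, latent_dim)
        out, _ = self.lstm(z_seq)
        return self.head(out[:, -1])           # next latent vector: (batch, latent_dim)


if __name__ == "__main__":
    ae = CNNAutoencoder3D()
    lstm = LatentLSTM()
    fields = torch.randn(4, 3, 32, 32, 32)     # toy batch of 3D velocity snapshots
    recon, z = ae(fields)                      # reconstruction and latent vectors
    z_next = lstm(z.unsqueeze(0))              # treat the 4 snapshots as a short time sequence
    print(recon.shape, z.shape, z_next.shape)
```

In this reduced-order setting, time integration happens entirely in the latent space: the LSTM is applied recursively to its own predictions, and the decoder is only invoked when a full flow field is needed.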
