Turbulence Modelling with Deep Neural Networks

Bao, Jie (2022) Turbulence Modelling with Deep Neural Networks. Masters thesis, Concordia University.

BAO_MASc_S2023.pdf - Accepted Version (PDF, 7MB)
Available under License Spectrum Terms of Access.

Abstract

High-fidelity computational fluid dynamics simulations, such as large eddy simulation and direct numerical simulation (DNS), are computationally expensive. For this reason, the Reynolds-averaged Navier-Stokes (RANS) equations are more popular for industrial applications. They are obtained by time-averaging the governing equations, which removes the need to resolve small-scale unsteady turbulent fluctuations. However, RANS requires a closure model to relate the unknown Reynolds stresses to the time-averaged velocities. Traditional turbulence closure models are prone to inaccuracies, particularly for separated and transitional flows, due to the chaotic and anisotropic nature of turbulence. In this study, a data-driven approach is used to model turbulence, leveraging high-fidelity DNS data. A canonical turbulent channel flow is considered at three different Reynolds numbers and compared with reference results. A deep neural network is trained on these three datasets and then used to make predictions at two intermediate Reynolds numbers. Comparison with corresponding DNS simulations demonstrates R² scores ranging from 70% to 98% for predictions of turbulent kinetic energy production and dissipation, depending on the chosen training and testing datasets. A second study is then performed on a NACA 0012 airfoil at a Reynolds number of 50,000 and angles of attack ranging from 4 to 12 degrees. A deep neural network is trained on a limited set of these angles of attack and used to predict turbulent kinetic energy production and dissipation. Comparison with the testing dataset shows good agreement at lower angles of attack, but limited agreement at high angles of attack beyond stall. Based on these results, we conclude that deep neural networks can be trained to accurately predict turbulent kinetic energy production and dissipation for wall-bounded turbulent flows, although limitations remain for massively separated flows. Future work will focus on directly incorporating these results into a RANS turbulence modelling framework.
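
As a rough illustration of the workflow described in the abstract, the sketch below shows how a fully connected network could be trained to regress turbulent kinetic energy production and dissipation from mean-flow features, with agreement quantified by the R² score. This is a minimal sketch assuming TensorFlow/Keras and scikit-learn; the feature count, layer sizes, and hyperparameters are illustrative assumptions rather than the configuration used in the thesis, and the random arrays merely stand in for DNS-derived training and testing data.

```python
# Minimal sketch (illustrative, not the thesis configuration): a fully connected
# regression network mapping mean-flow input features to turbulent kinetic energy
# production and dissipation, evaluated with the R^2 score against held-out data.
import numpy as np
import tensorflow as tf
from sklearn.metrics import r2_score

# Placeholder arrays standing in for DNS-derived data: X holds mean-flow
# features, y holds the [production, dissipation] targets at each sample point.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(10_000, 6)), rng.normal(size=(10_000, 2))
X_test, y_test = rng.normal(size=(2_000, 6)), rng.normal(size=(2_000, 2))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X_train.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.1),   # dropout regularization to limit overfitting
    tf.keras.layers.Dense(2),       # two outputs: production and dissipation
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(X_train, y_train, epochs=50, batch_size=256, verbose=0)

# R^2 score per output quantifies agreement with the reference (DNS) values.
y_pred = model.predict(X_test, verbose=0)
print(r2_score(y_test, y_pred, multioutput="raw_values"))
```

In the thesis setting, the inputs would be quantities extracted from the DNS fields and the trained network would be evaluated at the withheld Reynolds numbers or angles of attack rather than on random test data.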

Divisions: Concordia University > Gina Cody School of Engineering and Computer Science > Mechanical, Industrial and Aerospace Engineering
Item Type: Thesis (Masters)
Authors: Bao, Jie
Institution: Concordia University
Degree Name: M.A.Sc.
Program: Mechanical Engineering
Date: 14 December 2022
Thesis Supervisor(s): Vermeire, Brian
ID Code: 992202
Deposited By: Jie Bao
Deposited On: 21 Jun 2023 14:29
Last Modified: 21 Jun 2023 14:29
