
100 Years of CFD


•SPECIAL EDITION•

OCTOBER 2022
THE INTERNATIONAL MAGAZINE FOR ENGINEERING DESIGNERS & ANALYSTS FROM NAFEMS

Editor
David Quinn
david.quinn@nafems.org

Copy Editor
Sinothile Baloyi
sinothile.baloyi@nafems.org

Design/Production
d2 print
info@d2print.com

Advertising
Paul Steward
paul.steward@nafems.org

Subscriptions
Karen Kelly
karen.kelly@nafems.org

Subscription Pricing
Benchmark magazine is delivered free to all NAFEMS members. You can also subscribe to the magazine for either 1 or 3 years.
1 Year Subscription (4 issues): £40
3 Year Subscription (12 issues): £100
Visit nafe.ms/subs to subscribe

ISSN 0951 6859

Publisher
NAFEMS Ltd.
PO Box 20342
Hamilton
ML3 3BW
t +44(0)1355 225688
e info@nafems.org

Contents
Tang, the Space Geek, and CFD ... 6
Richardson’s Forecast: The Dream and the Fantasy ... 8
The Future of CFD: The Academic Opinion ... 17
CFD – A Timeline: From the 17th Century to the Present Day ... 24
NAFEMS World Congress 2023 ... 32
CFD in Undergraduate Curricula ... 34
Vendor Viewpoint ... 39

Regulars
What’s On ... 4
E-Learning ... 15

[Contents-page graphic: excerpt of the CFD timeline (see page 24), with strands for Governing Equations/Turbulence, Finite Difference Numerics, Finite Element and Hardware, marking key dates from 1822 to 1943.]

Errors & Omissions: While every care is taken in compiling benchmark, neither the Editors, nor NAFEMS, can be held responsible for the consequences of any errors in, or omissions from, the contents. The views expressed by contributors are their own and all information is accepted in good faith as being correct at the time of going to press. The Editors will not accept any advertisement considered by them to be misleading or otherwise unsuitable for inclusion in benchmark, however, the presence of any advertisement should not be considered to convey or imply any form of commendation by NAFEMS.

Cover image © Stephen Conlin, 1986. All Rights Reserved. Based on advice from Prof. John Byrne, TCD. Reproduced under license.


BENCHMARK
A View from your Editorial Team
benchmark@nafems.org

In this special edition of Benchmark, we cast a reflective gaze on the evolution of Computational
Fluid Dynamics (CFD) over the last 100 years and try to imagine what the future might hold for
the discipline. In truth, the foundations are much older than a hundred years. They stretch back
to Isaac Newton in the 17th century and George Gabriel Stokes in the 19th century, and that’s
mentioning only a few of the many eminent mathematicians, engineers, and scientists who have
contributed to the making of what we today call CFD.

Of course, we cannot talk about CFD in the modern era without talking about Lewis Fry Richardson
(LFR) and his seminal work ‘Weather Prediction by Numerical Process’. Published 100 years ago, in
1922, LFR’s work laid the foundations of the modern weather forecast.

LFR was a fascinating character: a Quaker and a pacifist who, in 1916, during the First World War, joined
an ambulance unit on the Western Front. Undeterred by the fighting around him, he undertook his
calculations from muddy trenches and cold, wet billets in his spare time. At one point he decided to
attempt to numerically ‘hindcast’ the weather for an area of western Europe using the published
weather data for that particular date. His now famous work soon demonstrated the enormity of the
task, and he realised that he would need thousands of computers (people!) to calculate the one-day
weather forecast.

In this issue, Peter Lynch’s article reviews LFR’s work in much more detail and shows (with the benefit
of a certain amount of hindsight) how good LFR’s predictions were.

To illustrate the evolution of CFD during the last century, we have constructed a timeline using forward
and backward referencing from various reviews to develop a chronology for the publication dates of key
papers. In fact, there are several strands to the timeline representing the contributions from
physical/fluid-dynamic, mathematical, numerical/computational, algorithmic, and technological (computer)
developments.

The development of CFD has been a truly evolutionary process with many, many contributors and
developments on different fronts; it is virtually impossible to recognise all of them in a single magazine
issue. Necessarily, ours is a somewhat personalised view of the history of CFD and its continued
evolution.

To help develop a more rounded picture of CFD, we have engaged with several of NAFEMS’ commercial
and academic partners and collaborators to find out their thoughts on the current CFD challenges and
how CFD might evolve in the coming decades. Their collected views form the basis of two further
articles. Our commercial contributors include, amongst others, Siemens, Simulia, and Flow Science.
And we have Professors Uwe Janoske (Bergische Universität Wuppertal), Nikos Markatos (National
Technical University of Athens), Koulis Pericleous (University of Greenwich), and Spencer Sherwin
(Imperial College) to give us the academic angle.

And finally, we have illustrated the magazine with several panels of real CFD cases that demonstrate
its importance in the modern world. We hope you enjoy this special edition of Benchmark. n

David Kelsall, Steve Howell, Uwe Janoske & David Quinn


EVENTS nafems.org/events

Verification and Validation in Engineering Simulation
November 15th 2022 | Online Training Course

Praktische Anwendung der FEM und Ergebnisinterpretation
November 16th 2022 | Online Training Course

Festigkeitsnachweis mit der FKM-Richtlinie
November 23rd 2022 | Online Training Course

NAFEMS France Conference 2022
November 23rd 2022 | In-person Conference | Senlis, France

Modelling & Simulation to Support the Hydrogen Economy
December 6th 2022 | In-person Seminar | London, UK

Simulation in the Automotive Industry
December 7th 2022 | In-person Seminar | Troy, MI, USA

NAFEMS ASSESS Congress
March 26th 2023 | In-person Conference | Atlanta, GA, USA

V&V: Vérification & Validation des Simulations pour l'ingénierie
April 4th 2023 | Online Training Course

NAFEMS World Congress 2023
May 15th 2023 | In-person Congress | Tampa, Florida, USA


A Designers’ Guide to CFD


These days, not every CFD user is a qualified specialist. The ever-increasing speed and
affordability of computer processing technology has led to an influx of new CFD users, who
are probably more familiar with working with CAD geometry than the details of numerical
simulation methods.

They need a guide – and this is it.


‘A Designers’ Guide to CFD’ was commissioned by NAFEMS to ensure non-expert users have
somewhere to turn. Whether you’re a designer with no simulation experience, or a structural
FEA expert who wants to know more about CFD, this is your starting point.

Topics include:
• Where is upfront CFD applicable?
• The CFD process
• Project reporting and data management
• Is my simulation fit for purpose?
• Using results to improve designs
• Where to go for help

The book will help inexperienced CFD users feel more comfortable and in control of their
process, as well as give them a resource to turn to for further guidance and skills development.

Get 50% off the RRP


at nafems.org/cfdguide
with the code CFD22.

Offer only available online.


• Applies to printed books, not downloads.
• Offer available until November 30th 2022.
• Terms and conditions apply.

C-F1-D – Moving up
a Gear in Formula 1
2010 saw the first ever race-worthy F1 car aerodynamically developed
using CFD rather than a wind tunnel. When the Virgin F1 team was
assembled, there was not enough time or budget to undertake a wind
tunnel programme, so with Wirth's history of developing IndyCar and
Le Mans prototype cars in CFD, they decided to lead the aero
development with CFD. The final car was then mapped and verified in
full scale wind tunnel sessions.

nafe.ms/CFD100-1


Tang, the Space Geek, and CFD
I am a space geek. Have been ever since I was little and the Apollo program was on the nightly news
here in the US. I pestered my parents
into buying us Tang (the orange drink
powder that the astronauts supposedly
drank every day), something called
space food sticks (vaguely chocolatey?
vaguely food-like?), and even space ice
cream that wasn’t ice cream at all.
So, you can imagine my delight and
surprise when today (August 29, 2022),
the NASA commentators are talking
about CFD simulations of the fuel tank
bleeding that engineers are doing as
launch preparation for the Artemis I
mission. They aren’t characterizing the
techniques as something unique or
unusual, just standard tools to assess
what is happening and find solutions or
workarounds. What a great reflection of
how far these tools have come since the
days of the Apollo missions.


The Backstory
Artemis is the US National Aeronautics and Space Administration’s (NASA) program to land humans on Mars. Artemis’s step-wise approach will help us relearn what we knew during the Apollo days and explore the technological advances since then. The last time humans left Earth's orbit was nearly 50 years ago when Apollo 17 went to the moon. That mission lasted less than two weeks; a Mars mission will take months to years, so almost everything must be re-engineered for this new objective.

Artemis I is an un-crewed mission that will circle the moon and return to Earth. That might sound simple but it’s been so long since we’ve done this, and so much has changed that NASA wants to be sure the rocket (aka Space Launch System) meets spec and that the heat shield on the Orion crew capsule can survive a re-entry through the Earth’s atmosphere. Artemis II, scheduled for 2024, will also orbit the moon, but with a vast difference: astronauts will be onboard. Artemis III, in late 2025, will land on the moon and discover whether the moon’s south pole has ice or other resources that might be useful for long-term exploration. NASA says that the Artemis program will create a foundation for reaching further afield.

The Launch
The first launch window for Artemis I was August 29 2022. After significant testing on everything imaginable, the giant rocket had already been rolled out to the launch pad a few weeks ago. Using modern verification and validation processes, systems had been designed, broken down to the component level, and then simulated and tested as components, subsystems, and full-up systems. Even so, things go wrong.

Poor weather in Florida delayed things by about an hour. The crew then began loading the rocket’s core stage with super-cold liquid oxygen and liquid hydrogen. That didn’t go as expected, and the team noticed a leak and pressure spike. I’m not clear on how both can happen simultaneously, but Artemis I has several tanks, so perhaps this isn’t all about one tank. Those resolved, the team resumed filling the core stage and started filling the upper stage.

You know that steam-like vapor that comes off the rocket before launch? That is called engine bleed; it occurs when hydrogen is cycled through the engine to prep it for launch. Three of Artemis I’s four engines were fine; engine #3 wasn’t. If the engine isn’t adequately prepared and the super-cold liquids hit the ambient-temperature engine, very bad things happen. The temperature shock can crack metal engine parts and cause sudden shrinkage. In either case, the engine fails.

And, to top off the problems in this first launch window, the team discovered that frost had built up on a flange in the inner stage tank. Fortunately, engineers found that this wasn’t due to a crack in the tank. Investigation showed this was due to damage in the external foam insulation. This wasn’t a show-stopper since it meant there wasn’t a leak — but there were tense moments when a leak was a possibility.

As we go to print, we wait to see how the story ends. The Artemis I launch is delayed. Hopefully, it will have successfully launched by the time you read this. Are we ready to return to the moon? I certainly hope so!

Watching the live coverage of the launch attempt, I was struck by how the commentating around science and engineering had changed since I was a kid. The NASA announcers threw around terms like ‘simulation’ and ‘CFD’ as if their audience knew what those were. Further, they assumed understanding of why these tools could be used in a live launch timeframe to analyze the situation and help controllers recommend action plans. Of course, as it turns out, the engineering team now has a few days to troubleshoot the bleed and cooling operation, so we don’t need the instant simulation the announcers were talking about, but still: CFD has gone mainstream.

The Future is Bright
NASA has used CFD and other types of simulation from the earliest days of computing — and helped develop many of those codes. This issue of BENCHMARK celebrates the history of CFD, which continues to play a significant role in aeronautics. But it’s CFD’s future that is even more exciting. CFD has long been seen as one of the gnarliest simulation disciplines, but, if today’s broadcast is anything to go by, this is changing: when a technology makes it into mainstream media and chat, it’s recognized as usable by many more people. Perhaps not to model super-cooled engine bleed, but for routine, everyday problems seen by designers and engineers across industries.

Monica Schnitger is passionate about engineering IT: CAD/CAM, CAE, PLM, AEC, IoT and the other technologies used to create the
world around us. She tries to explain what these are, how they affect product or asset creation and operations, and how businesses
can best implement these tools, to technology buyers, investors and developers. She holds a B.S. in Naval Architecture and Marine
Engineering from MIT and an honors MBA from the F.W. Olin School of Management at Babson College.


Perhaps, some day in the dim future it will be possible to advance the computations faster than the weather advances …. But that is a dream
L.F. Richardson


Richardson’s Forecast:
The Dream and the Fantasy
Peter Lynch | University College Dublin

A remarkable book on weather forecasting was published just one hundred years ago. Written by the brilliant and prescient applied mathematician, Lewis Fry Richardson, Weather Prediction by Numerical Process [1] was published by Cambridge University Press and went on sale in 1922 at a cost of 30 shillings (£1.50). With a print run of just 750 copies, it was not a commercial success and was still in print thirty years after publication. It was re-issued in 1965 as a Dover paperback. Cambridge University Press reprinted the book in 2007, with a foreword by Peter Lynch. Described as a second edition, it differs in no essential way from the 1922 edition.

Weather Prediction by Numerical Process (WPNP) is a strikingly original scientific work, one
of the most remarkable books on meteorology ever written. In it, Richardson described a
systematic mathematical method for predicting the weather and demonstrated its application
by carrying out a trial forecast. Richardson’s innovative approach was fundamentally sound,
but the method devised by him was utterly impractical at the time of its publication and the
results of his trial forecast appeared to be little short of outlandish. As a result, his ideas
were eclipsed for decades. For a brief biographical sketch of Richardson, see the Benchmark
article by Lea [2] and for a full biography, see Ashford [3].


Background to Scientific Forecasting
The basic ideas of numerical forecasting and climate modelling were developed more than a century ago, long before the first electronic computer was constructed. American meteorologist Cleveland Abbe and Norwegian physicist Vilhelm Bjerknes recognized that predictions of meteorological phenomena could be based on the application of hydrodynamics and thermodynamics to the atmosphere [4,5]. They each specified the required process: analysis of the initial state and time integration of the governing equations from that state. They listed the system of mathematical equations needed and outlined the steps required to produce a forecast. Realizing the numerous practical difficulties, neither Abbe nor Bjerknes attempted to implement their techniques.

A first tentative trial to produce a forecast using the laws of physics was made by Felix Exner in 1908, working in Vienna. His efforts yielded a realistic forecast for the particular case that he chose. Despite the restricted applicability of his technique, his work was a first attempt at systematic, scientific weather forecasting. However, it involved some drastic simplifications and was far from providing anything of use for practical forecasting.

Weather Prediction by Numerical Process
Lewis Fry Richardson first learned of Bjerknes’ plan for rational forecasting in 1913, when he took up employment with the Meteorological Office. Richardson’s forecasting scheme amounts to a precise and detailed implementation of the prognostic component of Bjerknes’ programme. It is a highly intricate procedure: as Richardson observed, “the scheme is complicated because the atmosphere is complicated.” It also involved an enormous volume of numerical computation and was quite impractical in the pre-computer era. But Richardson was undaunted, expressing his dream that “some day in the dim future it will be possible to advance the computations faster than the weather advances.”

Earlier, Richardson had applied an approximate numerical method to the solution of partial differential equations to investigate the stresses in masonry dams. He realized that this method had potential for use in a wide range of problems. The idea of numerical weather prediction appears to have germinated in his mind for several years. Around 1911, Richardson had begun to think about the application of his approach to the problem of forecasting the weather. He stated in the Preface of WPNP that the idea “grew out of a study of finite differences and first took shape in 1911 as a fantasy.” The fantasy was that of a forecast factory, which we will discuss below.

Upon joining the Met Office, Richardson was appointed Superintendent of Eskdalemuir Observatory in the Southern Uplands of Scotland and began serious work on numerical forecasting. In May 1916 he resigned from the Met Office in order to work with the Friends Ambulance Unit in France. By this time, he had completed the formulation of his scheme and had set down the details in the first draft of his book. But he was not concerned merely with theoretical rigour and wished to include a fully worked example to demonstrate how the method could be put to use.

Richardson assumed that the state of the atmosphere at any point could be specified by seven numbers: pressure, temperature, density, water content and velocity components eastward, northward and upward. He formulated a description of atmospheric phenomena in terms of seven partial differential equations. To solve them, he divided the atmosphere into discrete columns of extent 3 degrees east-west and 200 km north-south, giving 120 x 100 = 12,000 columns to cover the globe. Each of these columns was divided vertically into five cells. The values of the variables were given at the centre of each cell, and the differential equations were approximated by expressing them in finite difference form. The rates of change of the variables could then be calculated by arithmetical means. These rates enabled Richardson to calculate the variables at a later time.
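To make the idea concrete, here is a minimal sketch in Python of a single finite-difference step. It is our own illustration, not Richardson's equations or grid: the one-dimensional advection of a quantity stands in for his seven coupled equations, and the grid spacing, time step and wind speed are assumed values chosen only to show the mechanics of "rates of change computed arithmetically, then used to advance the variables".

import numpy as np

# Illustrative 1-D advection of a quantity q by a uniform wind u (assumed values).
nx, dx, dt, u = 100, 200.0e3, 600.0, 10.0
x = np.arange(nx) * dx
q = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)  # initial field (a smooth bump)

# Rate of change from a centred finite difference (periodic domain assumed)
dq_dx = (np.roll(q, -1) - np.roll(q, 1)) / (2 * dx)
tendency = -u * dq_dx

# Advance to "a later time" with a single forward step
q_new = q + dt * tendency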


Richardson calculated the initial changes over a six-hour period in two columns over central Europe, one for mass variables and one for winds. This was the extent of his forecast. In this trial forecast, he calculated a change of atmospheric pressure, at a point near Munich, of 145 hPa in 6 hours. This was a totally unrealistic value, two orders of magnitude too large. The failure may be explained in terms of atmospheric dynamics. We return to the cause of this after first considering the reaction of other researchers to Richardson’s work.

Initial Reactions to WPNP
The initial response to Richardson’s book was unremarkable and must have been disappointing to him. WPNP was widely reviewed with generally favourable comments, but the impracticality of the method and the abysmal failure of the solitary sample forecast inevitably attracted adverse criticism. Napier Shaw, reviewing the book for Nature, wrote that Richardson “presents to us a magnum opus on weather prediction”. However, in reference to the trial forecast, he observed that “the wildest guess [at the pressure change] would not have been wider of the mark.” He also questioned Richardson’s conclusion that wind observations were the real cause of the error, and his dismissal of the geostrophic wind.

Edgar W. Woolard, later an editor of Monthly Weather Review, wrote a positive review of the book for that journal, expressing the hope that other investigators would be inspired by Richardson’s work to continue its development. However, nobody else continued working along his lines, perhaps because the forecast failure acted as a deterrent, perhaps because the book was so difficult to read, with its encyclopaedic but distracting range of topics. Alexander McAdie, Professor of Meteorology at Harvard, wrote “It can have but a limited number of readers and will probably be quickly placed on a library shelf and allowed to rest undisturbed by most of those who purchase a copy”. Indeed, this is essentially what happened to the book.

A most perceptive review by F. J. W. Whipple of the Met Office came closest to understanding Richardson’s unrealistic forecast, postulating that rapidly-travelling waves contributed to its failure: “The trouble that he meets is that quite small discrepancies in the estimate of the strengths of the winds may lead to comparatively large errors in the computed changes of pressure.” Whipple appears to have had a far clearer understanding than Richardson himself of the causes of the forecast catastrophe.

Richardson’s work was not taken seriously by most meteorologists of the day and his book failed to have any significant impact on the practice of weather forecasting during the decades following its publication. Nevertheless, his work is the foundation upon which modern forecasting is built. Several key developments in the ensuing decades set the scene for later progress. Timely observations of the atmosphere in three dimensions were becoming available following the invention of the radiosonde. Developments in the theory of meteorology provided crucial understanding of atmospheric dynamics and the filtered equations necessary to calculate the synoptic-scale tendencies. Advances in numerical analysis led to the design of stable algorithms. Finally, the development of digital computers provided a way of attacking the enormous computational task involved in weather forecasting, all leading to the first weather prediction by computer [6]. The history leading to the emergence of modern operational numerical weather prediction is described in Lynch, 2006 [7].

Solution of the “Noise” Problem
In the atmosphere there are long-period variations dominated by the effects of the Earth’s rotation — these are the meteorologically significant rotational modes — and short-period oscillations called gravity waves, having speeds comparable to that of sound. For many purposes, the gravity waves, which are normally of small amplitude, may be disregarded as noise. However, they are solutions of the governing equations and, if present with spuriously large amplitudes in the initial data, can completely spoil the forecast.

The most obvious approach to circumventing the noise problem is to construct a forecast by combining many time steps which are short enough to enable accurate simulation of the detailed high frequency variations. If Richardson had extended his calculations, taking a large number of very small time steps, his results would have been noisy, but the mean values would have been meteorologically reasonable. Of course, the attendant computational burden made this utterly impossible for Richardson.

A more practical approach to solving the problem is to modify the governing equations so that the gravity waves no longer occur as solutions. This process is known as filtering the equations. The first successful computer forecasts were made with the barotropic vorticity equation [6], which has low frequency but no high frequency solutions. Another approach is to adjust the initial data so as to reduce or eliminate the gravity wave components. The adjustments can be small in amplitude but large in effect. This process is called initialization, and it may be regarded as a special form of smoothing.

Lynch [7] showed that the digital filtering initialization method yields realistic tendencies when applied to Richardson’s data. He found that when appropriate smoothing was applied to the initial data, using a simple digital filter, the initial tendency of surface pressure was reduced from the unrealistic 145 hPa in 6 hours to a reasonable value of less than 1 hPa in 6 hours. The forecast was in good agreement with the observed pressure change. The rates of change of temperature and wind were also realistic. This confirmed the root cause of the failure of Richardson’s trial forecast: unavoidable errors in the analysed winds resulted in spuriously large values of divergence, which caused the anomalous pressure tendencies. In effect, the analysis contained gravity wave components of unrealistically large amplitudes.
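As a rough illustration of the principle in Python (not Lynch's actual filter or data; the time step, cutoff period and signals below are all assumed), a windowed-sinc low-pass filter applied to a time series removes a fast, spurious "gravity-wave" oscillation while leaving the slow, meteorologically significant signal almost untouched:

import numpy as np

dt = 300.0                                  # time step in seconds (assumed)
t = np.arange(0, 6 * 3600, dt)              # a 6-hour window of data
slow = np.sin(2 * np.pi * t / (24 * 3600))  # slow rotational signal
fast = 0.5 * np.sin(2 * np.pi * t / 600.0)  # spurious high-frequency oscillation
series = slow + fast

# Lanczos-windowed sinc weights with an assumed 2-hour cutoff period
cutoff = 2 * 3600.0
n = np.arange(-len(t) // 2, len(t) // 2)
weights = np.sinc(2 * dt * n / cutoff) * np.sinc(n / (len(n) // 2))
weights /= weights.sum()

filtered = np.convolve(series, weights, mode="same")  # fast component largely removed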


The absence of gravity waves from the initial data results in reasonable initial rates of change, but it does not automatically allow the use of large time steps. The existence of high frequency solutions of the governing equations imposes a severe restriction on the size of the time step allowable if reasonable results are to be obtained. The restriction, known as the CFL criterion, can be circumvented by treating those terms of the equations that govern gravity waves in a numerically implicit manner; this distorts the structure of the gravity waves but not of the low frequency modes. In effect, implicit schemes slow down the faster waves, thus removing the cause of numerical instability. Most modern forecasting models avoid the pitfall that trapped Richardson by means of initialization followed by semi-implicit integration.

Richardson’s Later Work
After the First World War, Richardson’s research focussed primarily on atmospheric turbulence. He had encapsulated the essence of the cascade of turbulent energy in a simple and oft-quoted rhyme embedded in the text of WPNP: Big whirls have little whirls that feed on their velocity, and little whirls have lesser whirls and so on to viscosity. Richardson’s dense writing style is occasionally lightened in this way by a whimsical touch as, when discussing the tendency of turbulence to increase diversity, he writes “This one can believe without the aid of mathematics, after watching the process of stirring together water and lime-juice” (WPNP, page 101). Several of his publications during this period are still cited by scientists. In one of the most important — The supply of energy from and to atmospheric eddies — he derived a criterion for the onset of turbulence, introducing what is now known as the Richardson Number.

Around 1926, Richardson made a deliberate break with meteorological research. He was distressed that his turbulence research was being exploited for military purposes. Indeed, this knowledge impelled him to destroy a large volume of his research papers. In a much later study, Richardson investigated the separation of initially proximate tracers in a turbulent flow and arrived empirically at his “four-thirds law”: the rate of diffusion is proportional to the separation raised to the power 4/3. This was later established more rigorously by Andrey Kolmogorov using dimensional analysis.

Advances in Computing: From ENIAC to PHONIAC
The first weather forecast (technically, a hindcast) made with a digital computer was performed on the ENIAC (Electronic Numerical Integrator and Computer) by a team of scientists at Princeton. The Princeton team were aware that Richardson’s initial tendency field was completely wrong because he was not able to evaluate the divergence. They realised that a filtered system of equations would have dramatic implications for numerical integration. It would obviate the problem of gravity-wave noise and would permit a much larger time step to be used. They integrated the barotropic vorticity equation from real initial conditions and produced four realistic, if far from perfect, forecasts. For a full account, see Chapter 10 of Lynch, 2006 [7].

It is gratifying that Richardson was made aware of the success in Princeton; Jule Charney sent him a copy of the Tellus paper [6]. In his response, Richardson congratulated Charney “on the remarkable progress which has been made in Princeton; and on the prospects for further improvement which you indicate”. He concluded by saying that the ENIAC results were “an enormous scientific advance” on the single, and quite wrong, forecast in which his own work had ended.
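For readers who want the formulas behind three of the ideas mentioned above, they can be stated compactly in modern notation (our summary, not reproduced from WPNP or [6]): the CFL restriction on an explicit time step for advection at the fastest wave speed c on a grid of spacing Δx; Richardson's four-thirds law relating the turbulent diffusivity K to the tracer separation ℓ; and the barotropic vorticity equation integrated on the ENIAC, which advects absolute vorticity with the non-divergent wind derived from a streamfunction ψ:

$$ \frac{|c|\,\Delta t}{\Delta x} \le 1, \qquad K(\ell) \propto \ell^{4/3}, \qquad \frac{\partial \zeta}{\partial t} + \mathbf{V}_{\psi}\cdot\nabla\,(\zeta + f) = 0, \quad \zeta = \nabla^{2}\psi, $$

where f is the Coriolis parameter. Because gravity waves are absent as solutions of the last equation, the fastest speed entering the CFL condition is the much slower advecting wind, which is what allowed the Princeton team to use a much larger time step.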

Figure 1: ENIAC v ‘PHONIAC’ – The 'PHONIAC' (a standard Nokia 6300 from 2008) could compute a
24-hour forecast in less than one second, compared to the 24 hours it took the ENIAC, circa 1946.


To illustrate the dramatic growth in computer power since the days of the ENIAC, one of the forecasts was re-run on a small mobile phone, a Nokia 6300, which had raw computational power comparable to a CRAY-1, the first super-computer acquired by the European Centre for Medium-Range Weather Forecasts (ECMWF). The computation time for a 24-hour forecast on ENIAC was about 24 hours. The time on the Nokia, christened Portable Hand Operated Numerical Integrator and Computer (PHONIAC), was less than one second [8].

Richardson’s Fantastic Forecast Factory
The computation of his forecast was prodigious, taking Richardson some two years to complete. How could the enormous number of calculations necessary for a practical forecast ever be done? Richardson estimated that it would require 64,000 people just to keep up with the weather. In WPNP, he described his fantasy: a “Forecast Factory” like a large theatre-in-the-round – think of the Royal Albert Hall – a circular building with a great central chamber, the walls painted to form a map of the globe. A large team of (human) computers are busy within the building calculating the future weather. The work is coordinated by a Director of Operations. Standing on a central dais, he ‘conducts the computations’ by signalling with a spotlight to those who are racing ahead or behindhand.

In 1986, an Irish artist, Stephen Conlin, created an illustration of the forecast factory (Figure 2). This painting, in ink and water colours, is a remarkable work, replete with narrative details. The painting depicts a large spherical building with a vast central chamber. Four banners identify major pioneers of computing: John Napier, Charles Babbage, George Boole and the first computer programmer, Ada Lovelace. The painting is described in detail in an article in Weather [9].

There are surprising similarities between Richardson’s forecast factory and a modern massively parallel processor (MPP). Richardson envisaged a large number of (human) computers working in synchrony on different sub-tasks. In the fantasy, the forecasting job is sub-divided using domain decomposition, a technique often used in parallel computers today. Richardson’s scheme involved nearest-neighbour communication, analogous to message-passing techniques used in MPPs. The man in the pulpit functioned like a synchronization and control unit. Thus, the logical structures of the forecast factory and an MPP have much in common.

Figure 2: A visualisation of Richardson’s Weather Forecast Factory,


based on advice from Prof. John Byrne, TCD.
© Stephen Conlin, 1986. All Rights Reserved.


Summary
Richardson’s dream was that scientific weather forecasting would one day become a practical reality. Modern weather forecasts are made by calculating solutions of the mathematical equations governing the atmosphere. The solutions are generated by complex simulation models implemented on powerful computer equipment. His dream has indeed come true.

The development of comprehensive models of the atmosphere is undoubtedly one of the finest achievements of meteorology in the twentieth century. Numerical models continue to evolve, with substantial developments in data assimilation to produce improved initial conditions, new numerical algorithms for more precise and faster computations, and a probabilistic approach with ensemble forecasts that quantify uncertainties in an operational environment. These developments have made the dreams of Abbe, Bjerknes and Richardson an everyday reality. Meteorology is now firmly established as a quantitative science, and its value and validity are demonstrated daily by the acid test of any science, its ability to predict the future.

References
The small selection of references here may be supplemented by the extensive bibliography in Lynch, 2006 [7].

[1] L.F. Richardson, Weather Prediction by Numerical Process. Cambridge University Press, 1922. Reprinted by Dover Publications, with a new Introduction by Sydney Chapman, 1965. Second ed., Cambridge University Press, with Foreword by Peter Lynch, 2007.

[2] C. Lea, “Lewis Fry Richardson: The Father of Weather Forecasting,” Benchmark, pp. 36-38, January 2012. Available: https://www.nafems.org/downloads/edocs/2012_01_lewis_fry-richardson.pdf

[3] O.M. Ashford, Prophet—or Professor? The Life and Work of Lewis Fry Richardson. Adam Hilger, 1985.

[4] C. Abbe, “The physical basis of long-range weather forecasts,” Mon. Wea. Rev., vol. 29, pp. 551–561, 1901.

[5] V. Bjerknes, “Das Problem der Wettervorhersage, betrachtet vom Standpunkte der Mechanik und der Physik,” (transl. “The problem of weather prediction, considered from the viewpoints of mechanics and physics.”) Meteor. Z., vol. 21, pp. 1–7, 1904. [Translated and edited by E. Volken and S. Brönnimann: Meteor. Z., vol. 18, pp. 663–667, 2009.]

[6] J.G. Charney, R. Fjørtoft, and J. von Neumann, “Numerical integration of the barotropic vorticity equation,” Tellus, vol. 2, pp. 237–254, 1950.

[7] P. Lynch, The Emergence of Numerical Weather Prediction: Richardson’s Dream. Cambridge University Press, 2006.

[8] P. Lynch and O. Lynch, “Forecasts by PHONIAC,” Weather, vol. 63, pp. 324–326, 2008.

[9] P. Lynch, “An artist’s impression of Richardson’s fantastic forecast factory,” Weather, vol. 71, pp. 14–18, 2016.

Prof. Peter Lynch is passionate about all things mathematical. He graduated from University College Dublin (UCD) in
1968 with a first class honours in mathematical science. The following year he was awarded an M.Sc. by UCD.
Much of Peter's career was spent with the Irish Meteorological Service, where he worked developing models for weather
prediction. In 1982 he was awarded a PhD from Trinity College Dublin, for research in dynamical meteorology. He later
became Head of Research and then Deputy Director of Met Eireann.
Peter carried out extensive research on the development of computer weather forecasting. In 2006 he completed a
monograph, The Emergence of Numerical Weather Prediction: Richardson's Dream, which was published by Cambridge
University Press.
Peter moved to UCD in 2004 as Professor of Meteorology in the School of Mathematics. He is now an emeritus professor
in the School. Since retiring he has written extensively about mathematics. His first mathematical collection, That's
Maths: The Mathematical Magic in Everyday Life, was published by Gill Books in 2016. He writes a regular mathematical
column in The Irish Times and maintains a mathematical blog, thatsmaths.com. His professional website is at
https://maths.ucd.ie/~plynch/
Peter is a keen walker. Over a thirteen-year period, he completed a walk around the coastal counties of Ireland. This is
described in his book Rambling Round Ireland: A Commodius Vicus of Recirculation, published in 2010 by The Liffey
Press.
Peter is a Member of the Royal Irish Academy.


learn from anywhere.... always
online training that suits how you work

1st Metals Material Modelling: Welding Simulation and Residual Stresses
3rd 10 Steps to Successful Explicit Dynamic Analysis
4th Introduction to Practical CFD
16th Basic Electromagnetic FEA
17th Practical Understanding of Systems Modelling and Simulation
28th Next Steps with Multibody Dynamics Simulation
2nd Elements of Turbulence Modelling

work | home | anywhere
nafems.org/e-learning

London’s Streets
are Paved with CFD
Developers and architects need to consider the complex nature of the wind when designing
tall buildings in environments like the City of London. With so many of these buildings in a
tight space, all with unique shapes and properties, how the buildings interact with each
other to affect wind flow is important. Wirth Research helps many organisations better
understand the complex nature of wind around and behind tall buildings, with aerodynamic
expertise and high resolution CFD modelling. Rarefied air indeed!

nafe.ms/CFD100-2

Cooling your Lights


LED lighting uses 80% less energy than standard
incandescent bulbs. Despite this, LED light bulbs are
facing a new set of thermal challenges because their
durability is very much dependent on their operating
temperature. Minimising the maximum temperature of
the diodes can vastly improve their expected lifespan.
CFD consultants at 80/20 Engineering assisted engineers
at Microlights in their adoption of LED technology by
rapidly assessing a series of proposed designs for heat
sinks. After gaining a series of insights by closely
examining the predicted air flow patterns and
temperature distributions, the final design reduced the
maximum temperature by 29°C while also reducing the
load on the cooling fan.


The Future of CFD


The Academic Opinion
As well as teaching the next generation of engineers and analysts, the academic community can also help shape the very industry and technology they are preparing students to enter. It seemed wise, therefore, to ask questions and listen to the voices of experience when looking at what the future holds for CFD. Whilst industry is often, necessarily, driven by profit and the bottom line, the academic community can give some perspective on what is likely to happen in the future, without having to stick to a company line or answer to shareholders.

We spoke to some of the heavyweights of the academic CFD community and got their
thoughts on where CFD is heading in the future.

Koulis Pericleous
Professor of CFD
University of Greenwich

Nikos Markatos
Professor Emeritus, Department of Chemical Engineering
National Technical University of Athens

Spencer Sherwin
Head of Aerodynamics and Professor of Computational
Fluid Mechanics | Department of Aeronautics
Director of Research Computing Service
Imperial College London

Uwe Janoske
Chair of Fluid Mechanics
University of Wuppertal


What will CFD look like in the future?

Koulis Pericleous
Mesh-based methods will continue to evolve with automation ensuring they become an integral part of CAD/CAM technology, ideally with a smooth transition between fluid flow and stress analysis domains. Meshless methods such as LBM/SPH still have their place and will continue evolving, especially where moving boundaries exist – such as in a case I am involved in which features evolving microstructure dendrites surrounded by solute flow – or in the handling of dense particle flows.

Regarding turbulence: depending on the application, traditional two-equation models will have their place where fast results are required. DNS will continue as a fundamental research tool with simulations moving up the Reynolds number scale as computing technology advances (quantum computing perhaps?). Large Eddy Simulation (LES) will remain the workhorse in research-based simulations, mostly carried out by academics.

Accessibility (of CFD) is already very advanced, with myriad commercial codes available to industry, capable of running on modest PC-type computers. Ease of use is excellent; the danger is that results can be obtained by inexperienced users without much understanding of the background physics, numerical limitations, etc.

Anyone can run a CFD simulation, using freely available open-source software from website depositories such as GitHub. Education is the only limitation for potential users.

“Digital twins” seem to be the order of the day. The machine operator accesses background CFD simulations, perhaps without being aware of it. Rather optimistic in use at present but becoming more realistic in the future.

As for digital twins, Machine Learning/AI seems to be the way technology transfer works between CFD and the operator. Much of current CFD research is devoted to such methods.

General purpose codes and application specific codes both have their uses, in the same way that a specialised tool for a particular job is preferable to a generic tool. Differences are due to accessibility, cost, and user capability.

Nikos Markatos
I believe CFD will evolve in three main directions: stresses in solids, multi-phase flow, and the fluid population dimension, especially for reacting flows.

Stresses in Solids
The Finite Volume Method (FVM) has been used for solving solid-stress problems by many authors [Beale, Elias 1991; Spalding, 1993; Demirdzic, Muzaferija 1994; Bailey, Cross, Lai 1995 and more recently Artemov], whether or not they interact with fluid- or heat-flow ones. Therefore, in my opinion, the widely-held belief that the Finite Element Method (FEM) must be used for solid-stress problems is demonstrably false. This belief has dissuaded the majority of stress-analysis researchers from paying any attention at all to FVM. It is this academic’s opinion that FVM is inherently superior, requiring only one function (that of the variable-distribution shape) to be guessed, not two (i.e., the weighting function in addition).

I think that the use of two functions by FEMists has needlessly complicated the language and literature of FEM. It represents needless baggage carried in from pre-computer years, with no advantage whatsoever. To me, the enormous and expensive effort devoted to creating the finite-element literature represents a profligate and still-continuing waste of resources.

In my opinion, FEM will disappear in the next ten years. Whatever weighting function policy one adopts in any particular problem, the same solution should be arrived at, just as Athens is the same city whether reached by train or plane.

Multi-phase flow
Much remains to be understood for multi-phase flows. Research opportunities in respect of free-surface flows are plenty:

• Fitting the grid to the surface is rarely practical; surface shapes are too convoluted. The motion must be defined by reference to a predetermined grid.
• A two-phase model may be used, but numerical diffusion makes the surface fuzzy.
• Particle tracking is useful, but algorithms vary greatly in efficiency.
• The Volume Of Fluid (VOF) scalar-equation method has many advocates, and variants (the transported equation is written out below). Improvements are still needed, e.g., for multiple layers.

Another scalar-equation method, called ‘Level-set’, can produce spectacular results. Computer simulation of dispersed-flow phenomena is always based on the neglect of some of the features of the real situation. For example:

• although in fact bubbles of many different sizes exist at a particular location in a boiler, they are usually supposed all to have the same size there; and
• although some coal particles have greater velocities than others at a particular place in a furnace, the differences are disregarded.
• These presumptions make it possible to regard the true multi-phase mixture as being a two-phase one.
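As a point of reference for the VOF method mentioned in the list above (a textbook statement in our notation, not Prof. Markatos’s own formulation): a volume fraction α of one phase is transported as a scalar with the flow, and the free surface is wherever α jumps between 0 and 1; keeping that jump sharp rather than smeared is the main numerical challenge:

$$ \frac{\partial \alpha}{\partial t} + \nabla \cdot (\alpha\,\mathbf{u}) = 0, \qquad 0 \le \alpha \le 1. $$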



The fluid population dimension


The Eddy-Break-Up (EBU) model of 1971 explained Scurlock’s unaccountable turbulent-flame
findings (1948) by presuming the burning gases comprised a two-component population,
consisting of wholly un-reacted gas fragments, too cold to burn, and hot fully-reacted gas
fragments, which also could not burn. These collided at a rate proportional to their volume-
fraction product and to the turbulence intensity, producing intermediate gas which could burn
instantly.
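One way to write the collision rate described above (our notation, an illustrative form consistent with the description rather than a quotation of the original 1971 model) is as a mean reaction rate proportional to the product of the unreacted and fully-reacted volume fractions, r_u and r_b, multiplied by a turbulent mixing frequency, for which the ratio ε/k of dissipation rate to turbulence energy is a common choice:

$$ \bar{R} \;\propto\; \bar{\rho}\,\frac{\varepsilon}{k}\,r_{u}\,r_{b}. $$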

The EBU became popular and is still widely used. Calculations also allow determination of how
many fluids are needed for accuracy. The analogy with spatial-grid-refinement tests is very
close.

To date, fixed, uniform, and structured population grids have been used. In the future, I believe
people will extend to them our knowledge of moving, non-uniform, unstructured, problem
adaptive, and other sophisticated geometric grids.

Concerning turbulence modelling


“All turbulence modellers must follow the Kolmogorov pattern, viz. by solving equations for
statistical averages such as: k, ε, vorticity fluctuations, Reynolds stresses, etc.”

In my opinion, this is untrue, although most modellers believe it. Modish variants such as
Large Eddy Simulation (of which there are many) may create the illusion of novelty.

Such models perform badly when body forces act differently on, say, hotter and colder
elements in the turbulent mixture, as in forest fires, for example. Fluid Population theory is
probably the best way forward.

General-purpose codes
I think that general-purpose codes will survive in the future, but mainly out of sight. Instead,
CFD apps will occupy attention. CFD apps apply CFD to classes of equipment, i.e., Simulation
Scenarios, via application-specific menus. App users need to know much less about CFD than
general-purpose code users.

Why? General-purpose CFD codes simulate many classes of scenario; users need just one. To
particularize a general-purpose code requires specialist skills that users can ill afford to learn.

What? A CFD app is a one-scenario-class user interface. Its creators provide the
particularization. CFD-apps ask only for inputs that users know about in application-specific
language e.g., ‘air-change/hour’. CFD-apps create grids without user intervention and set
numerical parameters likewise. CFD-apps supply results-displaying macros and automatically
write results-interpreting reports.

Currently, users choose one service provider, pay significant money, and get more than they
need, which they may think is the whole of CFD, but which is very often less than they need.
They must themselves create the grids and make other numerical settings (optimal or not), run
the code, and display and interpret the results.

In the future, users will choose the app they need, pay less, and for no more than they need.
They will have access to the whole of CFD, rely on settings made by the creator of that
application, but may use different creators for other applications.

Use of AI in CFD Calculations


I expect that AI will play an increasingly significant part in the future, not so much in the
scientific part of CFD but predominantly in the classification of results obtained and in deriving
conclusions by training on the thousands of data sets obtained in the past. One example of this could be in
the case of forest-fire spread predictions dependent on weather conditions. Currently, such
predictions cannot be made in real-time as they require days of CPU time. In the future, one
might be able to run many cases under different weather conditions and train on the data obtained,
so that when a real case happens one can derive the appropriate scenario quickly. Artificial
neural networks have already been used for several years now.


Spencer Sherwin
Over the next 20 years, I think there will still be a demand for mesh-based methods, at least for certain problems. Where boundary layer flows are influenced by the geometry and need very near-wall resolution, a mesh-based method is needed. Meshless methods (LBM, Immersed Boundary Method) face their biggest challenges in this region.

I like to think of how turbulence will be handled as levels of fidelity. As we can run higher fidelity, more complex models at higher Reynolds numbers, they can be used to train lower fidelity reduced models of turbulence, helping to address the issue of when and where to apply turbulence models. Our discussions with industry are not that higher fidelity/more advanced methods will replace the existing tools (i.e., RANS) for design, but rather that they need a set of tools to understand the flow physics in new or more complex problems. So again, it is context driven as to what you need from the model and how important getting the localised physics might be.

If you can use RANS then I think we will see CFD being run in real-time. If you need to resolve a complex physics problem, we need to get it to a point where we can run it overnight.

In terms of technologies, I think FV will prevail for RANS; URANS and LBM overlap in different industries, and it’s not clear to me why one will prevail over the other. SPH is more for multiphase flow or complex physics of this nature.

I think there is a role for ML/AI. If we interpret its use as embedding expert knowledge into a Neural Network, there is a big demand for this in meshing practices, turbulence modelling, and even the selection of the best preconditioners. I believe we will see ML/AI used in this context. The adoption of Physics-Informed Neural Networks is claiming a lot of visibility. I am sure it will replace challenging forward problems, but it could well open up inverse problems or searching of parametric space problems, or re-using the wealth of previous simulations that are generally not currently leveraged.

Uwe Janoske
The complexity of the problems will increase dramatically over the next years. CFD engineers will face more and more coupled multi-physics problems arising from new technologies in areas such as the transformation of energy supply, the automotive industry, and the response to climate change. For example, the reduction of combustion engines will not reduce the number of CFD engineers; it will in fact increase the number of new applications and challenges in heat management of electrical drives, air conditioning in cars, water management in fuel cells, etc.

The determination of flow fields will be as important in 100 years as it is today. But the question will be whether or not we stick to the methods proposed 100 years ago. Will we still use Finite-Volume Methods and use only larger HPC systems to improve simulation times? Will we still spend hours of our time creating high-quality computational meshes? In my opinion, no. There will be a shift from the “classical” methods we know in CFD today to approaches motivated by data science and machine learning. In industrial applications, a lot of knowledge is present which we have to use in data science and machine learning technologies.

We will use machine learning incorporating the Navier-Stokes equations as a “loss” function to improve training without the knowledge of any test data (Physics-Informed Neural Networks (PINN)). We will get rid of generating computational meshes or, if we need them, machine learning will improve the meshes for us. Perhaps the vision of integrating CFD in the design processes will become true. If the designer modifies the geometry, probably the pressure drop will be shown in real-time on the screen.
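To illustrate the idea behind the Physics-Informed Neural Networks that Prof. Janoske mentions, here is a minimal, hypothetical sketch in Python using PyTorch. The 1-D viscous Burgers equation stands in for the Navier-Stokes equations, and the network size, viscosity, sampling and training settings are all assumed; the point is simply that the PDE residual itself is the training loss, so no simulation or test data are needed.

import math
import torch

torch.manual_seed(0)

# A small fully connected network approximating u(x, t)
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
nu = 0.01  # assumed viscosity for this toy problem

def pde_residual(x, t):
    # Residual of u_t + u*u_x - nu*u_xx, evaluated by automatic differentiation;
    # driving this towards zero is the "physics" part of the loss.
    x.requires_grad_(True)
    t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    # Random collocation points in x in [-1, 1] and t in [0, 1]
    x = torch.rand(256, 1) * 2 - 1
    t = torch.rand(256, 1)
    # Initial condition u(x, 0) = -sin(pi x), enforced as a penalty
    # (boundary conditions are omitted here for brevity)
    x0 = torch.rand(64, 1) * 2 - 1
    u0 = net(torch.cat([x0, torch.zeros_like(x0)], dim=1))
    loss = (pde_residual(x, t) ** 2).mean() + ((u0 + torch.sin(math.pi * x0)) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()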



What will be the limitations of CFD in 20 years?


What will be the barriers to CFD and its development?

Koulis Pericleous
Spectral collocation methods offer the best promise for accuracy, especially for time-dependent variable geometry simulations. Currently too costly in computer resources, especially in 3D, they are likely to find more applications with hardware advances.

Multiphysics are likely to be dominant in new developments, e.g., for aeroelasticity, MHD, materials processing, aeroacoustics, and ultrasonics. Handling the various complex interactions will be the challenge. Massive parallelisation is the current route to billion-cell simulations. Quantum computing (if and when) presumably will cause its own problems in programming and code development.

Spencer Sherwin
For the standard problems, a lot of the formulation mathematics is well understood. In combustion and hypersonics, perhaps some of the physics modelling is incomplete. I think however the mathematics of new methods (preconditioning, time stepping) to capitalise on emerging architectures might need more consideration, as well as ideas such as parallel-in-time if we find we have too much compute for a given problem.

We currently have plenty of methods, and if history is anything to go by, we will likely keep, as a community, most of these methods since it is not clear one method is best at all problems and fidelities.

The use of multi-physics coupling, fluid-structure interaction, transport of different species (hydrogen) with the flow problems, and the interfaces of those species at physical boundaries are all more challenging problems.

This is also a clear challenge in hardware and software, as most top-end (Tier 0) supercomputers are GPU-based accelerated architectures and if this trickles down in the future, we have to evolve our software to utilise these resources.

How will we ensure that simulation predictions are fit for purpose?
How will this change in the next 20 years?

Koulis Pericleous

Verification and validation have always been important, but they will become vital as CFD users become more and more removed from CFD developers.

In terms of code verification, you have to trust the dominant CFD vendors to get it right at the development stage. A trustworthy library of test cases should be available, be compulsory, and be controlled by an international body acting as arbiter, with the authority to issue a quality certificate (as we do for wine!).

Spencer Sherwin

Now that we can simulate at the lower end of industrially relevant flow regimes, verification and validation are increasingly important. CFD can genuinely replace experiments in a range of conditions, but how do we learn to trust the results? I have cases where the simulation is too, or artificially, quiet and this is only apparent through validation. Since the baseline is defined by the experiments, we have to address how to input an appropriate amount of noise to recapture the experimental conditions and grow confidence in these tools. The first step is, obviously, verification, but validation will also be important. The reproducibility of codes is also likely to be important.


What teaching and training will the CFD sector need in 20 years?
What should the university syllabus include for CFD?

Koulis Pericleous
CFD courses should be compulsory in Engineering and Environmental degree courses. When I say CFD, I don't mean just the use of CFD codes as black boxes, but the engine behind them: the maths, physics, discretisation techniques, understanding of accuracy, and the iteration process.

Spencer Sherwin
Although it seems most students are computer literate, this is masked by an iPad/button-clicking interaction with
computing. To keep developing the tools of the future we have to do more to cover the basic computing skills that were
perhaps picked up organically previously. I do not think we need to be doing a lot of teaching on how to run tools, there
are plenty of online tutorials, blogs, and data that can be found on the web. However, we do need to teach students about
what the tools are solving so they can understand how best to apply CFD tools. We also need to teach good coding and
software engineering practice to keep evolving the CFD tools of the future.

Uwe Janoske
In academic teaching, there is a shift from the teaching of fundamental subjects like mechanics, thermodynamics, and
fluid mechanics to a larger portion of soft skills. On the other hand, the applications are becoming more and more
complex, which requires a deep understanding of the physics behind them. Therefore, it is essential that new
technologies assist the engineer in evaluating the quality of the results. Otherwise, CFD will again be referred to as
“Colours For Directors”, like it was in its early days. n

Keeping COVID out of Classrooms


In the wake of the COVID-19 pandemic, effective ventilation in schools has become an essential part of creating and maintaining a healthy environment for teaching and learning. This case shows the value of installing an ultraviolet (UV) unit in every classroom as a low-cost, effective solution to remove the virus from the air. A video narrated by the students at a South London primary school can be found here: nafe.ms/CFD100-3. The work was sponsored by multiple stakeholders including IMechE, ESI-OpenCFD, OpenFOAM, St Teresa's School Morden, the Department for Education, and Innovate UK, and was presented to UK Parliamentarians at Portcullis House in July 2022.


Modelling & Simulation to Support the Hydrogen Economy
Seminar | 06 December 2022 | London, UK

• Hydrogen will play a critical role in achieving net zero emission targets.
• Demand for hydrogen-powered vehicles and airplanes is driving technological breakthroughs and advancements.
• Massive infrastructure across many industries will be required for hydrogen production and transportation, both offshore and onshore.

Modelling & Simulation has a key role to play!

This NAFEMS seminar aims to bring together modelling & simulation experts from academia, industry, and research & technology organisations to share know-how and best practices on modelling the effects of hydrogen on the structural properties and behaviour of materials, and to demonstrate the value of numerical modelling on practical use cases.

Among other topics, there will be a particular focus on the areas below:
• R&D trends in modelling hydrogen transport/permeation through metals and composites.
• Advances in modelling hydrogen-material interactions and related degradation mechanisms.
• Recent developments in manufacturing process simulation of parts for hydrogen service, including additive manufacturing.
• Trends in digital certification and multiscale modelling-based qualification of products for hydrogen applications.
• Developments in predictive models for lifetime prediction and fitness-for-service of assets in hydrogen service.

The shared determination by key stakeholders in the energy value chain for a clean and sustainable energy transition has never been stronger, nor has the need for cross-industry working.

Register Today: nafems.org/events



CFD - A Timeline
From the 17th Century
to the Present Day
David Kelsall, Steve Howell & Uwe Janoske
NAFEMS CFD Working Group

Computational Fluid Dynamics (CFD) is an important tool for product and process development in the modern computer-aided engineering (CAE) armoury. By encapsulating the fundamental flow equations, CFD computer applications can simulate the free flow of fluids within and around all manner of objects including vehicles, machinery, products, and processes. Modified simulations facilitate improvements to the design and operating effectiveness of the machine or systems – typically by mitigating adverse flow and heat interactions.


CFD is now big business. Around the multinational corporate key players, a rich ecosystem of customers, smaller companies, and specialist consultants exists to service end-user needs in industry sectors ranging from automotive, aerospace and defence, and electrical and electronics to industrial machinery, energy, material and chemical processing, and many more.

The emergence of a global market that serves virtually all engineering sectors and
geographic regions illustrates the importance of CFD to the engineering value chain. What
drives the CFD market is the need for competitive and superior products across all sectors
coupled with a demand for knowledge of how processes and designs will perform before
they are built.

Some of the social, technological, economic, environmental, and political (STEEP) factors
that will likely influence CFD developments in coming years include:
• Social: Post pandemic turmoil, transport preferences, increase in remote working/
working from home (WFH), green agenda (energy saving), food and fuel poverty.
• Technological: Remote working, even more fuel-efficient travel, EVs, Cloud computing.
• Economic: Inflation, green agenda (decarbonisation, renewable energy).
• Environmental: Climate change – urgent need for mitigation.
• Political: Increased international tensions polarising nation states.

The worldwide CFD market seems to be in a strong position following the COVID-19
pandemic. Recent market trend reports suggest it is currently worth about USD 2 billion
and is forecast to have a compound annual growth rate (CAGR) of 8-12% in the coming 5
years [1,2,3] .

Although the foundations for the market were laid by Newton's Laws of Motion over 300 years ago (Philosophiæ Naturalis Principia Mathematica, 1687), it was the invention of the programmable computer in the 1940s that began to unleash the powerful tool that we call CFD today.

In a sense, the development of CFD has been an evolutionary process with many, many
contributors. Here, we have tried to set out a timeline for this evolution. We realise that it
is nigh on impossible to be fair to all people that contributed to the development of CFD –
so it is a somewhat personalised view of the authors. We have tried to pick out the key
stages/phases. While the work of Lewis Fry Richardson (LFR) is seminal to the foundation
of CFD, LFR stands on the shoulders of giants.

Our story begins with The Enlightenment in the 17th century and the discovery of calculus by Isaac Newton (1642-1727), who applied it to formulate the Laws of Motion, and, independently, by Gottfried Leibniz (1646-1716). The 18th century saw new methods in calculus developed by some of the greatest mathematicians in history, such as the brothers Jakob Bernoulli (1654-1705) and Johann Bernoulli (1667-1748), as well as Leonhard Euler (1707-1783), whose equations describe the conservation of mass and momentum for an inviscid fluid, Joseph Louis Lagrange (1736-1813), and Pierre Simon Laplace (1749-1827). Daniel Bernoulli (1700-1782), son of Johann, derived the famous flow equation.

And this is where we start our timeline. If you have any comments, additions, notice any
glaring omissions, or would like to add any other thoughts, please get in touch at
benchmark@nafems.org

David Kelsall & Steve Howell.


[Timeline chart, 1822–1932: tracks for Governing Equations/Turbulence and Finite Difference Numerics]

1822/1845: Navier-Stokes equation describing fluid flow is first published.

1877: Boussinesq proposes relating turbulence stresses to the mean flow to close the Navier-Stokes equations, which leads to the concept of eddy viscosity.

1895: Reynolds proposes ‘Reynolds decomposition’ to partition an instantaneous quantity into its time-averaged and fluctuating components. This leads to the Reynolds-averaged Navier-Stokes (RANS) equation, which describes the time-averaged behaviour of turbulent flows.

1908: Prandtl publishes his boundary layer theory.

1908: Richardson publishes a method to solve differential equations using a freehand graphical approximation.

1910: Richardson demonstrates an approximate arithmetical solution to differential equations by the finite difference approach, applying the method to the stresses in a masonry dam.

1922: Richardson publishes Weather Prediction by Numerical Process (WPNP). It includes an example weather forecast which is the first ever published CFD simulation.

1925: Prandtl proposes the mixing length hypothesis to describe the local length scale of turbulence for several common free shear flows.

1925: Richardson publishes a paper on how to solve differential equations approximately by arithmetic.

1928: Courant, Friedrichs and Lewy identify the CFL criterion, which stipulates that information cannot flow past a single compute cell in a single timestep.
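As a reading aid for the entries above, the two ideas are usually written as follows; the symbols are standard textbook notation rather than quotations from the original papers:

u_i = \bar{u}_i + u_i'   (Reynolds decomposition: instantaneous value = time average + fluctuation)

\mathrm{CFL} = \dfrac{u \, \Delta t}{\Delta x} \le 1   (CFL criterion for a simple explicit scheme: information must not cross more than one cell per timestep)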



[Timeline chart, 1932–1952: tracks for Governing Equations/Turbulence, Finite Difference Numerics, Finite Element, and Hardware]
1933: Thom publishes the first numerical solution for flow past a cylinder using a stream function-vorticity formulation.

1940: Southwell publishes his relaxation methods for engineering science.

1941: Hrennikoff publishes a paper on the solution of problems of elasticity by the framework method, which is essentially the finite element method.

1941: Kolmogorov publishes his work on the local structure of turbulence in incompressible viscous fluid for very large Reynolds numbers, which introduces the idea of the turbulence energy spectrum and a transport equation for ω, the specific rate of dissipation of turbulence kinetic energy.

1943: Courant publishes a paper on variational methods for the solution of problems of equilibrium and vibrations. Like the 1941 paper of Hrennikoff, this uses finite elements, and the two papers constitute the birth of the finite element method.

1943: Colossus is a set of computers developed by British codebreakers during the second world war. Colossus is widely regarded as the world's first programmable, electronic, digital computer.

1944: Von Neumann develops a simple finite difference method.

1944: Bethe and Feynman report the first ‘hydro’ calculation in Los Alamos Report (LA-94). It is possibly the earliest numerical solution involving shockwaves.

1945: Chou publishes a paper on velocity correlations and the solutions of the equations of turbulent fluctuation, which presents the transport equation for the Reynolds stress tensor.

1945: ENIAC is completed and commissioned. Designed and primarily used to calculate artillery firing tables for the US Army's Ballistic Research Laboratory, ENIAC is considered the world's first programmable, electronic, general-purpose digital computer. (Although Colossus was completed two years earlier, it was programmed by switches and plugs rather than by a stored program.)

1950: Charney, Wexler, Neumann, Frankel et al undertake the first numerical weather prediction (a 16x16x3 mesh with dx = 300 km and 48 time steps, dt = 30 mins).

1950: Von Neumann and Richtmyer publish their concept of artificial viscosity.


[Timeline chart, 1952–1962: tracks for Governing Equations/Turbulence, Finite Difference Numerics, Finite Element, and Hardware]
1952: The T3 lab at Los Alamos National Laboratory (LANL) receives its first electronic computer, the IBM 700.

1953: Kawaguti publishes a numerical solution of the Navier-Stokes equations for the flow around a circular cylinder at a Reynolds number of 40. A monumental personal effort, Kawaguti calculated this flow with a mechanical desk calculator, working 20 hours per week for eighteen months.

1953: Harlow develops the Particle-in-Cell (PIC) method for calculating fluid flows. It tracks Lagrangian marker particles of constant mass through a fixed Eulerian mesh.

1956: Smagorinsky introduces the first LES subgrid-scale turbulence model. It emerged following Charney's suggestion that numerical instabilities in the earlier weather model predictions of Phillips could be controlled using something like Von Neumann's artificial viscosity.

1963: Fromm and Harlow publish what is thought to be the first time-dependent incompressible Navier-Stokes solution for the vortex street in the wake of a square cylinder, complete with a realistic prediction of the shedding frequency.

1963: Smagorinsky proposes large eddy simulation (LES) as a modelling approach for turbulence in atmospheric air flows.

1965: Harlow and Welch publish the Marker-and-Cell (MAC) method for the calculation of time-dependent viscous incompressible flows with a free surface.

1965: CFD is brought to the attention of the wider public by Fromm and Harlow in their article Computer Experiments in Fluid Dynamics, published in Scientific American.

1967: Hess and Smith publish their panel method for the calculation of potential flow about arbitrary bodies.

1967: Cebeci and Smith publish their two-layer algebraic turbulence model, nowadays dubbed a zero-equation model because it does not require the solution of any transport equations.

1968: Chorin presents a method for the solution of the Navier-Stokes equations based on the artificial compressibility approach.
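For reference, the Smagorinsky subgrid-scale model mentioned above is normally written as an eddy viscosity built from the filter width Δ and the resolved strain rate; the form below uses the standard textbook symbols and is quoted for illustration rather than from the original paper:

\nu_t = (C_s \Delta)^2 \, |\bar{S}|, \qquad |\bar{S}| = \sqrt{2 \, \bar{S}_{ij} \bar{S}_{ij}}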



[Timeline chart, 1962–1982: tracks for Governing Equations/Turbulence, Numerics (Finite Volume; Smoothed Particle Hydrodynamics), Software, and Books]
1968: Rubbert and Saaris extend the panel method for lifting flows.

1969: Lundgren introduces the concept of probability density function (PDF) methods for describing turbulence.

1970: Orszag coins the phrase direct numerical simulation (DNS) in his paper on analytical theories of turbulence.

~1970: The upwind differencing concept re-emerges at Imperial College to improve numerical stability for high Reynolds number flows, which also leads to the IC formulation of the finite volume method.

1972: Patankar and Spalding publish the SIMPLE algorithm, which uses the concept of pressure correction to enforce mass continuity.

1972: Jones and Launder publish their k-ε model for turbulence, nowadays dubbed the standard k-ε model.

1973: Strang and Fix publish their mathematical basis for the finite element method.

1973: The patent for ENIAC is invalidated, thus allowing the electronic digital computer to reach the public domain.

1974: Spalding founds CHAM in London, widely recognised as the first commercial CFD company in the world.

1976: Roache publishes perhaps the first textbook for CFD.

1977: Diaz and Wheeler separately publish a hybrid collocation Galerkin method (HCGM). Subsequently, these works will be identified as precursors to the spectral method.

1977: Gingold and Monaghan, and Lucy (all from Cambridge, UK) introduce the SPH approach, initially for astrophysical problems.

1978: Young publishes a Lobatto-Galerkin method and uses it for simulating a reservoir.

1980: While at LANL, Hirt founds Flow Science, the first commercial CFD company in the US and developer of the FLOW-3D code.

1980: Patankar publishes his book Numerical Heat Transfer and Fluid Flow, which presents much of the work developed at Imperial College through the preceding decade.

1981: Hirt and Nichols publish a paper on the volume of fluid (VOF) method for the dynamics of free boundaries. This builds on earlier work at LANL by Noh and Woodward.

1981: CHAM releases PHOENICS, the first commercial general-purpose CFD code.
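To illustrate the upwind-differencing idea noted above, here is a minimal sketch in Python/NumPy for the 1D linear advection equation u_t + a u_x = 0 with a > 0; the wave speed, grid size, and Courant number are assumed values chosen purely for demonstration, not taken from the timeline sources.

import numpy as np

a = 1.0                    # advection speed (assumed)
nx = 200
dx = 1.0 / nx
cfl = 0.8                  # Courant number kept below 1 for stability
dt = cfl * dx / a

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.3) ** 2)   # initial Gaussian pulse

for _ in range(100):
    # First-order upwind: with a > 0, information arrives from the left,
    # so a backward difference is used (periodic domain via np.roll).
    u = u - a * dt / dx * (u - np.roll(u, 1))

The same backward-biased stencil is what stabilises high Reynolds number convection in the finite volume formulation referred to above, at the price of added numerical diffusion.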


[Timeline chart, 1982–1992: tracks for Governing Equations/Turbulence, Numerics (Finite Volume; Finite Element/Spectral Methods; Lattice Boltzmann; SPH), Hardware, Software, and Books; also marking the founding of NAFEMS, ERCOFTAC, and the NAFEMS CFD Working Group]
1983: Swithenbank, Boysan and Ayers from the University of Sheffield secure funding from Creare to develop their Tempest CFD code. This would become FLUENT.

1983: Rhie and Chow publish their interpolation technique. Until this point, many CFD codes had been formulated with a staggered grid arrangement, but Rhie-Chow interpolation would subsequently allow codes to shift to a collocated arrangement.

1984: Patera publishes a spectral element method for fluid dynamics. Widely credited as the starting point for spectral methods, this is a rediscovery of the methods used by Diaz and Wheeler in 1977, and Young in 1978.

1984: Anderson, Tannehill and Pletcher publish the first edition of their classic CFD book: Computational Fluid Mechanics and Heat Transfer.

1985: AEA Technology, based at Harwell in the UK, start development of the CFD code FLOW3D. It would later be renamed CFX to avoid confusion with the FLOW-3D code developed by Hirt in the US.

1986: While at Imperial College, Gosman and Issa found Computational Dynamics to develop and launch a new commercial CFD software, STAR-CD.

1986: Yakhot and Orszag publish their version of the k-ε model based upon Renormalisation Group (RNG) theory.

1986: Frisch, Hasslacher and Pomeau publish a paper on the lattice gas method for solving the Navier-Stokes equation on simple, massively parallel computing machines.

1988: Wilcox self-publishes his book on turbulence modelling for CFD, which leads to a reawakening of interest in the k-ω model.

1990: Hess presents an overview of panel methods for CFD in the Annual Review of Fluid Mechanics.

1992: Spalart and Allmaras publish their one-equation turbulence model, which has become popular for wall-bounded flow applications.

1992: Alexander, Chen, Chen and Doolen publish their Lattice Boltzmann model for compressible fluids.

1992: Chen, Wang, Shan and Doolen publish their paper on Lattice Boltzmann CFD in 3D.

1995: Shih, Liou, Shabbir, Yang and Zhu publish their realizable formulation of the k-ε turbulence model.

1997: Spalart proposes the detached eddy simulation (DES) method, a hybrid approach which uses LES in free flow regions and reverts to RANS in near-wall regions, where the mesh required for LES may be prohibitive.

1998: Chen and Doolen present an overview of the Lattice Boltzmann method for fluid flows in the Annual Review of Fluid Mechanics.
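For orientation, the eddy-viscosity closure shared by the k-ε family of models referenced above is commonly written as below; the constant C_μ ≈ 0.09 is the usual textbook value, quoted here for illustration only:

\nu_t = C_\mu \dfrac{k^2}{\varepsilon}, \qquad -\overline{u_i' u_j'} = \nu_t \left( \dfrac{\partial \bar{u}_i}{\partial x_j} + \dfrac{\partial \bar{u}_j}{\partial x_i} \right) - \dfrac{2}{3} k \, \delta_{ij}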



2002…

1989: Numerical Computation of Internal and External Flows, by Hirsch.

1993: An Introduction to CFD, by Versteeg and Malalasekera.

1995: Computational Fluid Dynamics, by J Anderson Jr.

1996: CFX 5 is released. This is also an unstructured CFD solver but it represents a departure from earlier versions of CFX, moving to an FE-FV framework and utilising a coupled solver.

1998: Ferziger and Peric launch their CFD book, Computational Methods for Fluid Dynamics.

1998: FLUENT 5, a new general-purpose unstructured CFD solver, is launched. Like earlier versions of FLUENT, it is based on the finite volume method. It is shipped with the GAMBIT pre-processor (Geometry and Meshing Built-In Toolbox).

2004: The FOAM code, developed at Imperial College by Henry Weller, is released into the public domain as an open source CFD code, OpenFOAM.

2004: CD-adapco launches its new unstructured CFD code, STAR-CCM+.

2010: Oberkampf and Roy publish their book on verification and validation for scientific computing.

Other codes noted on the timeline include: PowerFlow, XFlow, Palabos, OpenLB, FlowVision, OpenFVM, Code Saturne, ADINA CFD, CAD-based CFD codes, Altair Acusolve (finite-element based CFD solver), 6sigma, ReFRESCO (CFD for maritime applications), COMSOL (finite-element based CFD solver), FloEFD, PyFR (higher-order CFD, in collaboration with Z-CFD), Abaqus (finite-element based CFD solver), Dolfyn, OrthoFlo, SmartFire, FDS, FLACS, and KFX.

This timeline is not intended to be an exhaustive database of the CFD industry to the present day; rather, it is intended to highlight the major contributions over the last 100 years. If there is something missing from the timeline that you think should be identified as a major contribution, please email benchmark@nafems.org. This timeline is intended as a starting point and may be revised in future.

The final page of this timeline is intended to highlight the explosion in the use of CFD, specifically relating to CFD software and books.
We acknowledge that the above is not an exhaustive list of software codes and books – apologies if your book or CFD code is not included.



nwc23.org
CALL FOR PAPERS


Welcome to a World
of Engineering Simulation.
The NAFEMS World Congress 2023 in Tampa, Florida, USA, will bring the global engineering simulation community back together again in person, from 15-18 May. We'll be pushing the technology forward, creating a cross-industry, cross-technology exchange of ideas, best practice, and information like never before, whilst getting down to the business of what we all strive for: making simulation ever better and ever more accessible.

NAFEMS WORLD CONGRESS 2023


Bigger. Better. Bolder.
Simulation is now right at the forefront of the product design process - and we're bringing it right to the top of your
agenda with three days of outstanding simulation content, delivered by experts and thought-leaders from the
community, ushering in the next generation of simulation.

We Want You!
The community wants to hear your story, your experience, and your message. Every major software vendor,
industrial user, industry guru, and technology expert will be part of it - and you belong with them.

Be part of something truly special -


submit your abstract today!

Submission Deadline November 25th 2022

nwc23.org

CFD in
Undergraduate
Curricula
Kamran Fouladi | InfoMec Consulting

For over two hundred years, the solutions to fluid dynamics challenges were restricted to analytical solutions of certain simplified problems. These solutions solved equations that were first introduced by fluid dynamics giants such as Bernoulli, Euler, and Prandtl. These theoretical endeavors were later augmented with purely experimental efforts. The advent of CFD provided fluid dynamicists with a third dimension that readily complements those analytical and experimental efforts. In recent years, what we have been witnessing is a new wave of users and applications due to significant improvements in the available CFD software. And these improvements are spurred by the robustness, usability, efficiency, and (most importantly) accuracy of the software available to users.


The increased access to CFD has been felt in three distinct categories: research, engineering design, and education. As the use of CFD in research and engineering design has grown, so has the incorporation of CFD in educational settings. A major reason for this growth has been the constant necessity for a pool of engineers with CFD capabilities and good knowledge of software to respond to the needs of research and engineering design fields. Years ago, CFD-capable engineers were trained in graduate schools. However, most of the new CFD engineers are now trained at the undergraduate level. There are several reasons for this. First is the availability of academic versions and licenses of CFD software. Not only are there now several open-source CFD software packages available to students, but many commercial vendors now offer free or low-cost licenses to colleges and universities.

A second reason CFD and other simulation tools have become widespread in undergraduate curricula is the increased access to powerful computers and simulation tools. And finally, the third – and most important – reason is that current CFD software packages are significantly more user-friendly than past versions, with features such as easy-to-set-up graphical user interfaces, warnings for improper setup, an automated initial mesh suitable for CFD simulation, and many more. Simultaneously, fluid dynamics and heat transfer instructors now reach for CFD simulations to describe and discuss complex phenomena in their lectures. Concepts such as vortex shedding, flow separations, shocks, and expansions are less cumbersome to explain and much easier to visualize using CFD post-processing.

Whilst this all represents progress, there are challenges looming on the horizon. For example, at most higher education institutions in the United States, undergraduate CFD courses are, if offered, technical electives. Therefore, undergraduate students are generally self-taught and often rely on material on the internet. Student forums, tutorial videos, and wiki websites are where students turn in order to gather information or learn more. Software vendors must therefore strengthen and expand their free educational offerings on the web. These resources will also benefit users with commercial licenses, which has the added benefit of reducing user support demands.

The role of university programs in increasing the use of CFD in education cannot be overstated. Many undergraduate students who learn CFD at these institutions use it in undergraduate research or senior capstone projects. However, formal structured courses tailored to undergraduate students are beneficial in building a sound foundation for future practice in the field. It is important that these courses provide elements of practicality. For example, there should be an emphasis on concepts such as establishing the flow domain, specifying boundary conditions, creating suitable meshes, and defining convergence criteria. These courses should also raise awareness of errors and uncertainties and of the best practices for eliminating or reducing them; validation and verification should be a major theme.

CFD-based textbooks are another necessary part of responding to curriculum design challenges. Many past CFD books have focused on algorithm and scheme development. These textbooks serve as great resources for developing a solid understanding of the fundamentals of the science of CFD. They are excellent references for graduate-level courses and for those who endeavor to develop new, or enhance existing, CFD software. However, these traditional CFD textbooks do not meet the needs of students in undergraduate courses. A textbook for undergraduate courses and novice users should cover how CFD tackles complex flow problems, including the intricacies of each step in the CFD process. Importantly, it must provide examples of how different strategies must be devised for different flow regimes or complex applications. It would be best, however, if the authors of any such work refrained from focusing on any specific software, commercial or open-source. This is because software packages continually evolve, and being too specific would soon render the textbook or the information provided obsolete.

Takeaways
With advances in simulation over the past 10-20 years, CFD use by companies and organizations is rising. This increase in use requires a supply of engineers capable of flow simulation for research, design, and development. These engineers must graduate from universities with a good understanding of the CFD process and practical knowledge of applying CFD to complex flow phenomena. Readily available and openly disseminated resources from CFD software makers, practical undergraduate CFD courses, and updated textbooks are necessary to continually develop the required pool of engineers. n

Kamran Fouladi Ph.D., PE. is an Assistant Professor of Mechanical Engineering at Widener University teaching undergraduate
and graduate thermal fluid courses. He is an educator, researcher, and specialist in CFD and thermal management with more
than 25 years of engineering and teaching experience. Kamran is a licensed Professional Engineer in Pennsylvania. He has been
a NAFEMS tutor since 2011, delivering several e-learning courses on CFD each year, as well as providing in-person training.


This year we will be hosted by CETIM at their premises in Senlis (60), north of Paris, a leading centre for mechanical engineering in France, on 23 and 24 November.

As with our previous events, our programme committee, led by Jean-Marc Crepel, has finalised an attractive programme with more than 70 presentations in 13 parallel sessions covering the traditional disciplines addressed by NAFEMS, including:
• mechanical structures,
• CFD,
• simulation methodology and SPDM,
• digital twins,
• test-simulation correlation, with ASTE,
• materials and manufacturing processes, with CETIM,
• interoperability and standards, with AFNeT,
• the convergence of complex systems engineering, with AFIS,
• additive manufacturing, with MICADO, etc.

This year, particular attention will be paid to the evolution of computing architectures and the associated ways of working, green IT, and the cloud, with Teratec and the Systématics cluster. AI, data analytics, and quantum applications will also be addressed. As in 2020, biomechanical simulation will be strongly represented, in partnership with MICADO and the Avicenna Alliance.

Reserve your place: nafems.org/nrc22

Deep Breaths with CFD


Doctors and Aeronautical Engineers at Imperial College are working with Siemens
CFD Software to develop diagnostic methods and improve treatment options for
patients with complex breathing problems. Data from medical CT (computed
tomography) scans enables the engineers to develop patient-specific CFD models for
the airflow in the trachea (windpipe). One group of patients benefitting from this is
older patients with a goitre – a large growth on the neck caused by an enlarged
thyroid gland. Through periodic monitoring, the CFD models can quantify how
restricted air flow has become due to any narrowing of the trachea from the growing
goitre. The models allow doctors to differentiate between breathing difficulties
caused by the goitre and other loss of lung function due to diseases such as asthma
or Chronic Obstructive Pulmonary Disease (COPD). Better diagnosis allows doctors to
choose the correct treatment path if lung disease is the main cause of breathing
difficulties and may even obviate the need for patients to undergo a risky or
unnecessary goitre resection.


Simulation in the Automotive Industry:
Driving Convergence to Electrification, Autonomous and Connectivity

07 December 2022 Troy MI USA

This event aims to deliver information and insights on critical topic areas in a manner that maximizes the “take-away” value for attendees. An event agenda and concept championed by several leading figures in the automotive industry will provide the opportunity to learn about the latest technologies and practices, which attendees can later share and apply within their own organizations.

Register Today!
Our Sponsors

Platinum Standard

nafems.org/events

Katherine Johnson and


the Human Computers
In the middle of this series of shorts where we’re talking about CFD in the real world, we want to celebrate a group of
women whose contributions went unrecognised for so many years because of prejudice. The story of how Katherine
Johnson and a team of African-American women did the calculations of the necessary trajectory from the earth to the
moon for the US Apollo space program has only recently become public knowledge thanks to the book Hidden Figures
and the Hollywood movie of the same title. For many years, ingrained racism and sexism kept the story of their crucial
contributions to the space race in the shadows.

Katherine used Euler's method to calculate that trajectory, recalling that she “computed the path that would get you there. We told them how fast they would be going, and the moon would be there when you got there”.

Bear in mind that whilst Katherine and her team were working to send the human race safely into space, they were
doing so whilst being treated as second-class citizens in a society where the horrific Jim Crow laws mandated racial
segregation. And as if racial discrimination and prejudice weren’t enough, they were also dealing with endemic
sexism in the workplace, paid less than their male counterparts, and referred to as ‘subprofessionals’.

Her daughter, Moore Johnson, described Katherine as “an exciting, quiet thunder who managed to open up the world
of space”. Even before the Apollo mission, Katherine and her team had calculated Alan Shepard’s trajectory for his
brief foray into space in 1961, then calculated and plotted John Glenn’s path in 1962 when he was sent safely into orbit
and back. It is reported that Glenn refused to even start that journey without Katherine verifying the calculations
provided by the machines – perhaps the ultimate in Verification and Validation.

nafe.ms/CFD100-4

Learning from Disaster


to Improve Safety
Dr Andrei Horvat has reproduced the simulation analysis for the 1987 King's Cross Fire
Disaster in London when a flashover fire was responsible for 31 deaths in the escalator shaft
and ticket hall of the London Underground Piccadilly Line.

In a landmark use of CFD, simulations undertaken by Simcox et al. of AERE Harwell were
instrumental in the public inquiry investigation’s finding that a previously unknown ‘trench
effect’ was a significant contributor to the flashover. The trench effect confounded expert
opinion at the time, but was subsequently demonstrated by 1/3 scale fire tests undertaken by
HSE Buxton as part of the same public inquiry thereby validating the CFD simulation results.

For Andrei’s take on this, see: nafe.ms/CFD100-5

nafe.ms/CFD100-6


Vendor Viewpoint
If CFD is to continue its evolution, the people behind the software need to have a clear vision of how the technology will develop. NAFEMS prides itself on being fiercely code-independent – that's a strength rather than a weakness. As opposed to keeping the software vendor community at arm's length, we welcome each and every company involved in developing the software, on an equal footing, so that the discussion around the technology features every voice, not just one or two.

So, we asked our vendor network to give us their thoughts on where CFD is heading;
they did not disappoint. Here, we have a fascinating insight into how the CFD software
community sees the next 10, 20, and even 50-100 years playing out, from their own
unique perspectives.

Flow Science
Michael Barkhudarov & C.W. Hirt

Cadence
John Chawner

ANSYS
Dipankar Choudhury

Siemens Digital Industries Software


Simon Fischer

Particleworks
Massimo Galbiati

ESTECO
Enrico Nobile

SIMULIA
Dean Palfreyman

EnginSoft UK LTD
Bipin Patel


NAFEMS
...
How do you see CFD evolving over the next 20 years?

John Chawner | Cadence


The future of CFD is something that’s been on my mind a lot since 2015 because of my involvement with
the CFD Vision 2030. This Vision was originally published by NASA in 2014 and describes how CFD will
have to evolve in order to achieve broad and ambitious goals for aerospace systems. Since the Vision’s
publication, a community of advocacy, monitoring, and practice has been established, of which I am a
part. Although 2030 is only 8 years in the future, some of the aerospace community's goals relative to the
Vision extended to 2040.

Certainly, we all recognize today that CFD and other physics-based simulation techniques have
fundamentally and positively changed the design process. The future of CFD is much less about the
individual technology winners and losers and more about its impact on how systems are designed,
manufactured, and maintained. Even today, it’s recognized that physics-based simulation can address
innovation and quality but also reduce time to market, cost, and risk. Based on its current trajectory, CFD will expand its range of applicability, improving its ability to handle turbulent separated flows around complex geometries while doing so in a timely manner relative to the design environment.

I don’t think there’s any magic here. Production CFD software will become more robust, faster, and more
accurate. Advanced CFD methods will start coming online to expand the boundaries of what we can do.

Like • Reply

Massimo Galbiati | Particleworks


For the last 20 years, I have seen an impressive growth of CFD in terms of what we can simulate, process
simplification, accuracy, and speed. At the same time, I have seen two important aspects, the first and
most important is the growth in industry awareness of the benefits of CFD. The second is the birth of
new methods, above all, meshless methods. These methods were basically unknown up until 5-6 years
ago, more or less like Finite Volume CFD 25 years ago. The interesting part of this is that the adoption of
new meshless methods, like the Moving Particle Simulation, will be much faster, thanks to the higher
awareness across all the industrial sectors of the value of CFD. For the next 20 years, I see a fast growth
of meshless CFD. For the same reason, Finite Volume CFD will have to keep up the pace by further
simplifying and automating the meshing process. Life will be easier for CFD engineers with the capability
to go from CAD to results in a very short time. Not only will simulations run much faster but, most
importantly, setting up models will be immediate.

Like • Reply



Enrico Nobile | ESTECO


Mesh-based methods will continue to play a relevant, or major, role in the near future. Although many popular open-source or commercial platforms are, at this time, still based on methodologies and numerical schemes which date back more than 25 years, they are, and will be, undergoing constant improvement and updates, e.g., Adaptive Mesh Refinement (AMR), high-order methods, and curved cells, to name a few.

I do expect, however, that meshless methods will start to gain more and more popularity, in particular in
those areas of application where they have already proved their capabilities and, even more importantly,
the reduced effort/time required of the user (e.g., no meshing required).

Turbulence
Due to the continuous increase in computational performance and capabilities of hardware– from
workstations to departmental servers up to the most powerful HPCs– Scale-Resolving Simulations
(SRS), e.g., DES, DDES, and SAS, will become more and more popular, in particular for those applications
where standard RANS (Reynolds Averaged Navier Stokes) simulations have shown their intrinsic
limitations, e.g., highly separated/unsteady flows at moderate to not-too-high Reynolds number.

Additionally, Machine Learning /Artificial Intelligence (ML/AI) will be used in order to enhance modelling
accuracy using data-assimilation techniques. This will lead to application- or user-specific models that
combine canonical turbulence modelling with ML/AI trained on specific user data.

Real-time CFD
There are already some lower-fidelity 3D CFD solvers that are able to (almost) run in real-time, thanks to
the choice of proper algorithms and data structure and dedicated (GPUs) hardware. This trend will
continue, with more solvers/platforms capable of running in real-time expected to become available.
Many high-fidelity, accurate CFD simulations will still require overnight (or even more) runs.

Which technology will prevail? FV vs FE vs SPH vs LBM…


I do expect that FE, and in particular DG-FE (Discontinuous Galerkin Finite Element), will gain popularity,
in particular for SRS simulations, although FV will be improved by including a high-order option for the
same type of application areas. Multi-fidelity, cloud-based, solvers/platforms will also become more
popular.

Democratisation
Application-specific wrappers will be generated to make it easier for end-users, through either public or private clouds, to launch and interpret CFD calculations.

Machine Learning/Artificial Intelligence


Machine Learning will be the dominant driver to recover accuracy in many CFD workflows. In other
words, it will be the enabling technology to broaden the adoption of multi-fidelity techniques.

Like • Reply


Dean Palfreyman | SIMULIA


The rapid development of computer hardware and new digital technologies have led to enormous
advancements in Computational Fluid Dynamics (CFD) over the past two decades. As we look to the next
20 years, several issues will underpin the continued proliferation of CFD in the simulation market.

The first is with well-established industries like Transportation & Mobility, Aerospace & Defense, and
High-Tech. Under the framework of the digital twin, the continued drive toward virtual certification of
products will lead to the need for more advanced technologies and higher computing resources to reach
the level of accuracy necessary to certify products virtually. This will help reduce expensive physical
tests and provide greater insight into product behavior.

We have seen this with our PowerFLOW suite, where customers have dramatically reduced wind tunnel
testing on vehicles due to the high accuracy achieved with PowerFLOW. Continued development of
numerical methods for high-fidelity simulations will remain a core focus of every CFD development team.
In the near term, emerging computing resources like GPUs will enable faster, “real-time” simulations; in the longer term, computing methods like quantum computing may accelerate simulation speeds dramatically.
Continued automation of simulation model preparation and meshing techniques will enable more
complex geometric assemblies to be simulated at the same time and provide greater insight into the
interplay between different parts of the assembly and its subsystems.

The second is the proliferation of CFD into other industries and users who have not been classically
trained in CFD numerics and methods. It is incumbent upon every software development organization to
“internalize” the complexity of all aspects of the simulation processes to reach new users and set up
simulations specific to their intended scenario. We don’t believe a strategy of simplifying the physics is
the right approach since all types of users will want to perform sophisticated simulations at a level of
accuracy that impacts the design process. Turnaround time is vital, and new hardware like GPUs to
massively accelerate the compute time is critical. Providing near real-time simulation results will enable
designers to quickly assess their product and make informed changes accordingly, but the results must
be sufficiently accurate and robust to be trusted by the user and effective in the design process.

The third is the linking of different physics and scales. This applies both to the basic exchange of
physics data to other simulation disciplines and to the orchestration of sophisticated multiphysics
simulations to capture more realistic behavior. We don’t believe users in 20 years will think along
traditional lines defined by different physics; instead, they will want to focus on the scenario to be
simulated. All CFD software providers need to provide the ability to seamlessly and robustly link physics
models together.

One final issue is data. With the expanded use of CFD, particularly linking to optimization methods where
thousands of scenarios can be simulated, the ability to data-mine results and look for insights and trends
will be vital to maximizing the impact of the simulations on the design. Emerging technology like
Machine Learning will help users provide this deep insight.

Like • Reply

Hunting Down COVID


As we now know, COVID-19 is spread by respiratory aerosols and
droplets which can result in airborne transmission. As users of CFD tools,
many of us have considered or been asked what simulations we can do to
support the fight against the virus and protect people against infection. One of the
benefits of CFD is that it makes visible what we would otherwise not see, but it is also
vital to know how well the simulations capture what is important. Validation is challenging,
boundary conditions may not be clear, and what about factors like the transient effects of breathing and the
motion of people? Further important questions exist around how to treat turbulence, which simplifying
assumptions to use, and how all this relates to viral load.

All of this was discussed at length in the NAFEMS CFD Community panel event in December 2020. You can replay
the event, view the slides, read the related summary article, and indeed join the CFD community, at the link below.
nafe.ms/CFD100-7



Michael Barkhudarov & C.W. Hirt | Flow Science


Turbulence
Turbulence will remain the single most challenging part of CFD. Focus will be on extending it to a wider
range of time and length scales, accounting for anisotropy in flow and material properties, and on multi-
phase flows.

Real-Time CFD
Large, long, detailed simulations will remain important, at least as a niche, for cases where simulation
time is less important than accuracy and insight. Experience shows that the complexity of simulations
consistently outruns the advances in performance. At the same time, meteorological and climate
simulations, which use some of the most complex numerical and mathematical models, have been
running faster than real time for decades already. Nevertheless, the ‘overnight’ CFD tools will proliferate
in the future, tailored to specific applications and processes, where simplifications are possible.

An extreme case of such simplified models is that of ‘digital twins’ that can produce fast and reliable
solutions for small components of a larger process, to provide real-time control and adjustments to the
process. By necessity, these tools must capture only a small, mathematically well-defined part of the
whole.

Which technology will prevail? FV vs FE vs SPH vs LBM…


Only if they build hardware designed for a specific method, will that method prevail over the others.
Otherwise, it will remain the same – some good here, some good there. The unifying factors in all these
methods are the equations being solved and the requirement for accuracy. One could also envision
hybrid approaches, where multiple representations of the physical objects coexist and interact within the
same domain. There are already methods that combine Eulerian and Lagrangian approaches for fluid-
particle interaction, or adjacent FV and FE domains for fluid-structure interaction.

Machine Learning/Artificial Intelligence


• Artificial Intelligence has been suggested as an advance that will greatly improve CFD modeling.
This is not clear. AI rests on the evaluation of many simulations and what might be learned from
them. However, the choice of examples to include in an evaluation is critical and how can one be
assured that all possible physical features are included in the sampling? A real difficulty with AI is
that one cannot know what has and has not been included. The outcome cannot be properly
evaluated.

• Machine Learning sounds promising, but it is hard to debug, trust, and question the results in a meaningful way. It can only be used in a very narrow, well-tested, and well-trained area, which by definition limits its usefulness as a research tool.

Like • Reply


Simon Fischer | Siemens


Meshfree CFD codes offer an appealing alternative approach to mesh-based methods for selected
applications. When rapidly getting results is a priority over the highest accuracy, Smoothed Particle
Hydrodynamics (SPH) is and will remain an efficient tool. However, there will not be a silver bullet CFD
method. Even more than today, the future will consist of a blended, hybrid coexistence of mesh-based
and meshless approaches. Automation of meshing and the generation of high-quality meshes with
minimum user effort will continue to evolve. Technologies like model-driven adaptive mesh refinement will develop further and allow required accuracy levels to be met with minimum user intervention. The
same is valid for particle-based SPH methods, where automated and intelligent resolution refinement will
become an established methodology. AI-driven mesh generation may be another future topic. Clever
approaches to change between mesh-based and mesh-free treatment might become a reality as CFD
codes absorb all technologies into one piece of code.

And yet there will not be a one-size-fits-all CFD meshing solution but an automated, intelligent, and
adaptive choice of the most suitable method for a given problem. In all cases, any suitable approach and
meshing technology will be faster than today’s solutions. Ensuring high fidelity while reducing
engineering time for simulation set-up and hardware configuration and access is an important driver for
the continuous investment in CAE solutions at Siemens. This will enable CFD engineers to model the
complexity of today’s products with adequate meshing technology while staying integrated into a single
CFD simulation environment.

Turbulence
As with meshing and meshfree approaches, it will be no different from today: there will not be a single solution in turbulence modeling. Turbulence modeling will still be based mainly on a combination of statistical Reynolds-Averaged Navier-Stokes (RANS) modeling and scale-resolving simulation (SRS), such as large-eddy simulation (LES) or detached-eddy simulation (DES). Some improvements in both methods will likely come from a combination of physical assumptions and machine learning (ML). The major challenge will remain the same: accurately predicting unsteady turbulent flows and the onset of separation. As RANS alone is unlikely to cope with this challenge in a fully predictive way, while LES remains computationally expensive, the difference compared to today will lie in how the two approaches are combined: either in a segregated manner, each used where it is most appropriate, or in conjunction, where the local solution of the SRS model informs the statistical model locally (e.g., via machine learning).
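To make the combination described above concrete, here is a minimal sketch in standard textbook notation (added for illustration; it is not part of the original comment). The velocity is split into mean and fluctuating parts, and averaging the incompressible momentum equation leaves an unclosed Reynolds-stress term that the statistical (RANS) model must supply, while a scale-resolving approach computes the larger fluctuations directly.

```latex
% Reynolds decomposition and the averaged (RANS) momentum equation, incompressible flow
u_i = \bar{u}_i + u_i', \qquad
\frac{\partial \bar{u}_i}{\partial t}
  + \bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
  = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
  + \frac{\partial}{\partial x_j}\left(
      \nu \frac{\partial \bar{u}_i}{\partial x_j} - \overline{u_i' u_j'}
    \right)
```

A RANS model closes the Reynolds stresses statistically; hybrid RANS-SRS formulations switch or blend between the statistical and the resolved description locally, which is exactly where ML-informed coupling could enter.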

As engineers increasingly leverage such AI-based model choices in CFD simulation, they must have a
sound understanding of those models and clear visibility of which model a simulation uses, when, and
where. Despite the many benefits AI may bring, the nonlinear nature of turbulence means that this field will continue to require critical judgment, grounded in engineering and research experience, and strict validation.

With continuously increasing computational power, high-fidelity models such as LES or Direct Numerical Simulation (DNS) will make further inroads into industrial practice where appropriate and will dominate most of the research in turbulence. However, engineers will continue using the most efficient solution that answers the engineering question. Hence RANS will remain the state-of-the-art industrial CFD approach for going faster, unless results prove it has hit its limits for a given turbulent flow problem.

Real-time CFD
Going faster is the precondition for exploring innovative flow solutions and for modeling the complexity of products with the high fidelity required for predictive engineering.

While real-time CFD is the ultimate ideal, overnight runs will still play their role in this high-fidelity design-space exploration for some time. Compared to today, the sophistication of such simulations
will increase as engineers create even more comprehensive multidisciplinary digital twins in response to
more complex product requirements.

At the same time, as the industrial metaverse evolves, the need for rapid, real-time fluid dynamics
predictions will become a new application class. AI and low-fidelity physics approaches will play a
significant role in this ambitious goal, as will new hardware: quantum computing would be the ultimate disruptor in that respect. Until then, the continued exploration of intelligent speed-up technologies, leveraging next-generation CPU (Central Processing Unit) based HPC (High-Performance Computing), GPUs (Graphics Processing Units), ARM, and low-latency cache technology, will be of paramount importance in getting closer to real-time fluid dynamics simulation.



To summarise, neither for meshing/modeling nor for the general technology of fluid mechanics predictions
will there be a single methodology to serve all needs. As today, but across a more diversified range, there will be a spectrum of methods from which to choose the most appropriate compromise between required fidelity and
response time. Methodologies may range from “traditional” high-fidelity CFD methods, through rapid
meshfree simulations like SPH, to AI-supported fluid dynamics prediction. For simulations, AI may be a
supportive force in modeling choices, but engineering expertise and judgment of CFD results will remain
essential. However, with the continuous embedding of this breadth of CFD technologies into unified
platforms/code environments, engineers can stay integrated while making the best possible simulation
choices and going faster.

Which technology will prevail? FV vs FE vs SPH vs LBM…


They will all survive and coexist in a more integrated fashion, each offering a valuable solution for a specific
engineering challenge.

Accessibility
The offering and usage of cloud-based CFD solutions will massively increase. With the flexibility and
scalability of SaaS business models, companies of all sizes and individuals already today have instant
access to the exact amount of hardware and software needed at a given time for their specific CFD project.
Pre- and post-processing will move away from workstations and into a web browser. The recently released
Simcenter Cloud HPC by Siemens allows you to run your CFD simulations on optimized hardware
configured and managed by Siemens, using the underlying Amazon Web Services (AWS) infrastructure.
Moving a simulation to the cloud takes as little as two clicks of the mouse, without leaving the Simcenter
STAR-CCM+ user interface, and jobs can be monitored through a web browser on any device. Pre-paid
credits cover the cost of hardware, software, data transfer, and storage. This results in a single pay-as-you-
go charge for each simulation run, which greatly simplifies budget management for businesses of all sizes.
This trend will continue, cloud technology will further evolve, and such offerings will become an increasingly
important strategy for companies needing flexible access to CFD.

Democratisation
The barriers to high-fidelity CFD will further decrease on all levels. Cloud-based offerings (see above) will grant anyone with a device and a browser access to CFD software and the required hardware. Modern CFD solutions will leverage client-server technology, with a decoupled back end (solver, physics) and front end (GUI, pre- and post-processing, automation). Such architectures enable the independent creation of dedicated, tailor-made, app-like front ends for specific applications using a low-code approach, opening up CFD to new end-user bases. CFD experts will be able to deploy these app-like front ends, incorporating and maintaining best practices, with minimum effort. To further boost democratization, AI embedded into graphical user interfaces is already starting to simplify setup procedures and mitigate errors. AI-assisted
model preparation, like CAD/geometry preparation through intelligent part recognition, is becoming the
norm because it significantly speeds up and simplifies the CFD setup procedure. Novel input and output
device technology (touch pads, Virtual Reality (VR), Augmented Reality (AR)) will make it easier to both set
up and explore simulation results.

All these User Experience (UX) enhancements will further open up the usage of CFD to non-CFD analysts on
a much broader scale and more regular basis. An increasing number of designers and application-focused
(non-CFD expert) engineers will leverage predefined simulation methodologies. The industrial metaverse
may add another whole new user class for fluid mechanics simulation, further extending the audience and user base for CFD to an even younger generation and to broader educational backgrounds than just engineering or research.

Digital twins
Digital twins offer the ability to replicate the real world realistically. They are a vital technology for the
metaverse, where the real and digital worlds will merge almost seamlessly. We can expect CFD to play a significant role in the creation of holistic digital twins, and not only in the industrial metaverse. With the
increasing breadth of simulation (not just CFD) methodologies, the digital twin of a product will consist of a
range of models to represent the product in various stages of fidelity, from real-time-ready representations
for the metaverse to very accurate representations for close-to-production and validation purposes. As
such, managing digital twins with their instances of different levels of realism through Product Lifecycle
Management (PLM) solutions becomes increasingly important. Engineers can leverage adequate model
fidelity for a given engineering challenge – this may even be a process supported by AI.

Like • Reply


The overall fidelity of a digital twin will further increase by modeling the complexity of today’s
products across engineering disciplines in an integrated manner. CFD will be further embedded into
a seamless ecosystem of multidisciplinary and multiscale simulations. Thanks to increasing
computing power and novel simulation methodologies, real-time digital twins will become a
significant game changer in unforeseen ways – even for comparatively complex fluid dynamics
problems.

Machine Learning/Artificial Intelligence


AI has transformed, and will continue to transform, technologies in every industry. Applying AI to CFD
will become an increasingly strategic asset to companies, as it can help reduce costs and create
new differentiated values. AI-driven innovative solutions offer substantial benefits to CFD engineers,
designers, and analysts.

It can reduce computational, design program, and operational costs by offering the ability to assess
more designs at a faster turnaround time. Furthermore, AI allows process and program development turnaround times to be reduced through ML-based surrogate models and intelligent AI-driven workflows. It can enhance the accuracy of simulations by flagging anomalies and providing knowledge-based workflow assistance throughout the CFD process, including CAD, physics modeling, mesh settings, and post-processing. In conjunction with CFD, AI will improve product performance and efficiency by creating an ecosystem to simulate, predict, and seamlessly optimize products. By providing knowledge-based workflow assistance, AI will further lower the barrier to setting up meaningful CFD simulations quickly.
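As a hedged illustration of one of those building blocks, the ML-based surrogate model, the sketch below trains a cheap regression on a handful of completed CFD runs and then queries it during design exploration instead of the full solver. The design variables, drag values, and the Gaussian-process choice are illustrative assumptions, not a description of any specific vendor tool.

```python
# Minimal sketch: an ML-based surrogate trained on a few CFD samples.
# All numbers and variable choices below are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Each row: (inlet velocity [m/s], ramp angle [deg]); each target: a drag
# coefficient taken from a completed CFD run.
X_train = np.array([[10.0, 5.0], [10.0, 15.0], [20.0, 5.0],
                    [20.0, 15.0], [15.0, 10.0], [25.0, 10.0]])
cd_train = np.array([0.32, 0.41, 0.30, 0.44, 0.35, 0.33])

# Gaussian-process regression returns a prediction and an uncertainty estimate,
# which helps decide where the next expensive CFD run is worth placing.
kernel = ConstantKernel(1.0) * RBF(length_scale=[5.0, 5.0])
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(X_train, cd_train)

# Query the surrogate across the design space instead of re-running the solver.
X_query = np.array([[18.0, 8.0], [22.0, 12.0]])
cd_pred, cd_std = surrogate.predict(X_query, return_std=True)
for x, mu, sigma in zip(X_query, cd_pred, cd_std):
    print(f"design {x}: predicted Cd = {mu:.3f} +/- {sigma:.3f}")
```

The uncertainty output is one way such a surrogate supports intelligent workflows: designs where the predicted uncertainty is large are natural candidates for the next full-fidelity simulation.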

But despite those benefits potentially offered by ML/AI, well-educated engineers will remain a
critical factor in method development and assessment. In that sense, considering AI as the solution
to every fluid dynamics problem is naïve. The nonlinear nature of fluid mechanics, well captured by
the Navier-Stokes equations and established turbulence models, will pose a challenge to AI. On top of
assessing data quality, appropriate data selection and validation to train AI and machine learning
algorithms require engineering and AI knowledge. Delivering AI capabilities in CFD designs and
simulations requires talent in machine learning, deep learning techniques, and CFD skills.

Hence, for the successful adoption of AI in CFD, it will be of paramount importance that software
vendors deliver integrated solutions and consulting services for CFD and AI, something that
Siemens is already doing and continues to invest heavily in.

General purpose codes vs application specific codes?


There is no one-size-fits-all solution, but the trend of functional integration to cope with specific
applications in general-purpose tools will continue. For that, general-purpose solvers will continue to
absorb simulation technology into one CFD framework or offer seamless open interfaces for co-
simulation with specific standalone solvers. Ultimately, application-specific technology in general
purpose tools will become more and more directly accessible thanks to the increasing ability to
create low-code apps for tailored application-specific front-ends.

Like • Reply



Dipankar Choudhury | ANSYS


Over the past two decades, CFD solution methods, physical models, usability, and HPC scalability have
advanced dramatically. With those advances, the value CFD brings to users has increased many-fold –
also thanks to the simultaneous advances in computing hardware power and speed. Most recently, GP-
GPU hardware has matured and is now the springboard for CFD to take the next leap forward.

We are already seeing this today, in applications like Ansys Discovery, which provides users with near-
instantaneous insight into their evolving fluid flow simulations thanks to the pervasive use of GPU
hardware in all steps of the simulation. Extending such a fully GPU-native CFD solver to large clusters
with distributed memory is next, and will open doors to types of simulations that were previously
impractical in most industrial applications – in particular, inherently transient scale-resolved simulations.
We can expect such scale-resolved simulations to become the norm rather than the exception in the
coming years and decades, bringing common industrial CFD applications to the next level of fidelity. And
with the continued evolution of hardware, even such highly computationally-intensive scale-resolved CFD
simulations can become sufficiently fast to allow users to see their flow simulations evolve and adapt to
changes ‘live’ and even interactively.

While second-order finite volume methods have been repeatedly challenged, they can be expected to
remain the workhorse for general industrial CFD across a wide range of applications, thanks to their
geometric flexibility, combined with high speed and parallel scalability. These methods also offer a highly
intuitive framework for physics modeling across all applications, Mach numbers, and Reynolds numbers, compared to other methods like the lattice Boltzmann (LB) method, which can be expected to remain a niche technology due to its limitation to Cartesian lattices. Higher-order methods can, however, play an increasing role in the development of efficient and robust implicit formulations suited to specific applications like highly
resolved LES calculations.
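For readers less familiar with the terminology, a minimal sketch of the finite-volume idea referred to above (a generic statement, not specific to any vendor's solver): the conservation law is integrated over each mesh cell, so only fluxes through the cell faces are needed, which is what gives the method its geometric flexibility on arbitrary unstructured meshes.

```latex
% Finite-volume form of a generic conservation law for a quantity \phi on cell V_i:
% the cell average evolves according to the fluxes through the cell faces f.
\frac{\mathrm{d}}{\mathrm{d}t}\int_{V_i} \phi \,\mathrm{d}V
  + \sum_{f \in \partial V_i} \left( \mathbf{F}_f \cdot \mathbf{n}_f \right) A_f
  = \int_{V_i} S \,\mathrm{d}V
% F_f: face flux, n_f: outward face normal, A_f: face area, S: source term.
% "Second order" refers to the accuracy of the reconstruction used for F_f.
```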

Democratization of CFD has taken big steps in recent years with the latest generation of highly
accessible, highly usable, easy-to-learn, and easy-to-remember user experience in software like Ansys
Discovery that has focused on a larger swathe of the potential customer base such as design engineers
and users new to CFD. The latest advances in immersive user interfaces have been combined with a high
degree of automation of workflows and automated mesh generation methods. These advances have
also benefited flagship general purpose products such as Ansys Fluent. Application-specific CFD tools
such as Ansys Icepak (for electronics cooling) also expand the accessibility of CFD; however, it is
important to note that best-in-class application-specific tools such as Icepak are special-purpose uses of general-purpose CFD engines such as Fluent. The advantage of this approach is that the full power,
performance, and accuracy of the general-purpose engines are available and used in the context of the
application without compromise. The availability of cloud computing and advanced HPC computers on
the cloud will allow far more customers to access CFD computing on demand.

Finally, very significant advances in reduced-order methods, hybrid analytics, machine learning, and component and platform technology, together with the coupling of these with industry-standard IoT platforms, have
allowed very advanced, accurate, and broadly usable digital twins to become practical in CFD. Examples
of mainstream use of digital twins can be found in the industry with Ansys Digital Twin software. Aside
from being an innovative method in the creation and use of Digital Twins, machine learning is now
starting to find novel uses in CFD including performance and resource predictions, solver steering and
tuning, and refinement of mathematical models such as turbulence modeling and even new solver
classes. The use of machine learning in CFD simulation is expected to grow very significantly in the
coming years.

Like • Reply


NAFEMS
...
What will the limitations of CFD be in 20 years?
What will be the barriers to CFD and its development?

Massimo Galbiati | Particleworks


I do not see big technological barriers. The need for simulation, and the development and affordability of
hardware resources will facilitate the development and growth of CFD codes of different types. I only see
cultural barriers to the adoption of new CFD methods. Nowadays CFD experts are sometimes prone to
thinking that Finite Volume CFD is “the only CFD” and are reluctant to switch to meshless methods. In the
coming years, CFD experts will understand that it is normal and beneficial to adopt different CFD methods
depending on the specific application. For example, transmission lubrication or e-axle cooling can be easily and accurately simulated with Moving Particle Simulation (MPS), whereas it is very complex,
time-consuming, and even numerically unstable when done using Finite Volume CFD.

Like • Reply

Michael Barkhudarov & C.W. Hirt | Flow Science


There seems to be no way, in general, to avoid advancing the solution in small time steps for every element in order to obtain a time-dependent solution of the dynamics. Some simplifications can be made for special cases, but
the basis of incremental time advancement cannot be avoided.

There are two views, or choices of reference frame, used to advance the equations of motion. One is the
Lagrangian method in which the fluid elements move with the fluid, while the other is the Eulerian method in
which the grid, for example, remains fixed in space and the fluid is moved through it. Historically, only the
Lagrangian method was used in the early days of computing to study converging and expanding spherical
fluids associated with explosions being developed by the Manhattan Project during WWII. Simple, one-
dimensional, Lagrangian models were also used to investigate shock interactions passing through layers of
different materials. Because of the limited memory and speeds of the earliest computers, these models were
typically confined to a small number of fluid elements.
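As a hedged illustration of the two reference frames described above, the toy one-dimensional advection problem below (my own construction, not code from Flow Science) advances the same transport in small time steps either on a fixed Eulerian grid or with Lagrangian marker particles:

```python
# Toy sketch: constant-velocity 1D advection in an Eulerian frame (fixed grid,
# first-order upwind, periodic) and a Lagrangian frame (particles move with the
# flow and carry their values). Purely illustrative; not from any vendor's code.
import numpy as np

nx, length, u, dt, nsteps = 100, 1.0, 0.5, 0.005, 100
x = np.linspace(0.0, length, nx, endpoint=False)
dx = length / nx
phi0 = np.exp(-200.0 * (x - 0.25) ** 2)   # initial Gaussian pulse

# Eulerian: the grid stays fixed and fluxes move the quantity through it.
phi = phi0.copy()
for _ in range(nsteps):
    phi = phi - u * dt / dx * (phi - np.roll(phi, 1))   # upwind update, CFL = 0.25

# Lagrangian: the fluid elements themselves move; their values are untouched.
xp = x.copy()
phip = phi0.copy()
for _ in range(nsteps):
    xp = (xp + u * dt) % length   # particles simply translate

print("Eulerian peak after transport  :", phi.max())   # reduced by numerical smoothing
print("Lagrangian peak after transport:", phip.max())  # preserved exactly
```

The Eulerian update smears the pulse slightly, while the Lagrangian particles keep it intact; the difficulty the authors go on to describe is that in more than one dimension the moving elements shear and distort, which eventually forces a rezoning step and the associated averaging.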

The extension of the Lagrangian models to more dimensions raises some difficult problems. Foremost is the
fact that grid elements do not maintain their shape as they move; for example, they typically undergo shears that distort them so much that they can no longer be used for accurate numerical
approximations. Overcoming this requires resorting to some sort of rezoning of the grid, and that means
introducing some sort of averaging process to convert between old and new grid shapes.

Averaging always introduces some smoothing and hence some loss of fine-scale detail, something that is difficult to avoid. Many researchers have proposed a wide variety of averaging methods attempting to reduce the smoothing and, in some cases, have obtained improved results. However, there is no perfect answer, because the distribution of material within a grid element is unknown. The amount of material may be known, but how it is distributed is not. Thus, subdividing the material for a new, rezoned distribution cannot be perfect.

Incompressible flows have been a great success, but they are not without their limitations. A simple example
will illustrate one of the difficulties of CFD that needs more attention: the collapse of a steam bubble in a pool of water, associated with steam suppression in light-water nuclear reactors. The injection of the steam bubble is slow enough that the water can be treated as incompressible, but as the steam condenses, the bubble collapses. At the instant of collapse, all the water rushing in to fill the bubble space must instantly be stopped. In the incompressible model this requires a very large pressure pulse to terminate the flow, one that is much larger than experimentally observed. The problem is that the final collapse happens over a small time interval and the assumption of incompressibility in the fluid is not satisfied. For this case, some compressibility must be allowed, so that the pressure propagates out over a finite distance and only stops the fluid that the pressure wave has actually reached. This complicates the numerical solution but is necessary for physical accuracy.
Importantly, it illustrates the need for considerable caution in developing numerical models. Effort must be
made to prepare for exceptions and the possible need for the addition of more physical processes to general
model development.
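A rough back-of-the-envelope estimate (my own illustration, not from the authors) shows why admitting compressibility bounds the pressure pulse: the classical water-hammer relation limits the pressure rise needed to arrest a moving column of fluid to the acoustic impedance times the velocity change.

```latex
% Water-hammer (Joukowsky) estimate of the pressure pulse needed to stop fluid
% moving at speed \Delta u once a finite sound speed c is admitted:
\Delta p \approx \rho \, c \, \Delta u
% e.g. water with \rho = 1000\ \mathrm{kg/m^3}, c = 1480\ \mathrm{m/s}, \Delta u = 10\ \mathrm{m/s}
% gives \Delta p \approx 1.5 \times 10^{7}\ \mathrm{Pa} (about 150 bar).
% In the strictly incompressible limit c \to \infty the implied pulse is unbounded,
% which is why the purely incompressible model over-predicts the collapse pressure.
```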

Like • Reply



Simon Fischer | Siemens


Methods
From a development perspective, ensuring a sustainable CFD framework is necessary to keep a code alive for as long as possible. A modern architecture is a precondition, as is object-oriented
programming. The demands of handling code complexity rise as more multidisciplinary simulation
technologies and solver capabilities get absorbed into unified code platforms offered on the cloud. All
this requires proper development tools and strategies, product management skills, and continuous
intelligent investments into software development.

A new challenge arises from setups performed with the help of AI/ML, as automated modeling choices risk becoming opaque to the end user. Hence, on-the-fly changes to the modeling approach need to be reported, allowing engineers to assess the AI's choices and, ultimately, the CFD results. This re-emphasizes the fundamental importance of engineering and CFD know-how, despite, or indeed because of, AI's progress.

Another challenge comes with the amount of data that is being created as more and more simulations
can be run in the same amount of time. The capability to handle and investigate these vast datasets, stemming from a wide range of sources, is of paramount importance if the value of CFD in making the best possible engineering decisions is not to be limited. Hence both PLM and data postprocessing systems must be prepared for big-data management and to support engineers in their analyses. Parallelization of pre- and postprocessing is therefore just as important as the parallelization of the
calculations themselves.

Physics/multiphysics
The demand for higher fidelity in the digital twin results in multiscale and multidisciplinary modeling
requirements. Models need to become more accurate, more detailed, and interconnected. Quantum,
molecular, micro, meso, and macro scales need to be considered, and yet at the same time, those
complex models should run faster. This will be a continuous challenge calling for clever techniques such as hybrid multiphase modeling.

Computer hardware/resource
Generally, the introduction of new hardware architectures and technologies may require the adjustment
of CFD algorithms. A light version of this transition is the current porting of solvers from CPU to GPU. As
supercomputers may consist of increasingly heterogeneous architectures, combining multicore GPUs and CPUs, shared memory, hierarchical networks, etc., modern CFD codes will need to be prepared for a multitude of architectures, with algorithms and implementations that leverage the best of each world.

But this is minor compared to what development teams will be facing when the disruption from von
Neumann to Quantum Computing (QC) occurs. In principle, quantum computers are tailor-made for
simulation in many respects. The potential is enormous. However, introducing QC will mark a true
disruption for any CFD code as it will require a completely new algorithm implementation. In QC, nothing
will be portable from traditional von Neumann architectures. In other words, QC requires a complete code
rewrite of every current CFD software. To add to the challenge, while powerful and applicable in many
CFD simulation areas, QC is not useful at all for other CFD-related tasks, while traditional von Neumann
architecture is. So, the only efficient solution is a hybrid one. This, in turn, will require developers who can cope with either architecture. To date, the classic path into CFD code development is mostly that someone with a sound physics understanding comes in and learns the programming part. And while object-oriented programming is comparatively easy to learn, this will no longer be true for QC algorithm implementation. As there is no compiler assistance in QC, it requires developers who can write QC-ready code, which demands deep knowledge of such algorithms.

Like • Reply


Another aspect that will become more and more important is the power consumption of computing
hardware. As the trend to leverage massive compute power for CFD continues, costs associated with
energy consumption become a significant factor. Hence, energy-efficient technologies like reduced instruction set computer (RISC) architectures will play an increasing role in balancing time-to-solution against energy-to-solution to achieve the optimum cost-to-solution.
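One simple way to make that trade-off concrete is the hedged cost model below; the split into an energy term and a time-proportional term, and the prices used, are my own assumptions for illustration, not Siemens figures.

```latex
% Illustrative cost-to-solution model: energy cost plus time-proportional cost
% (hardware amortisation, licences, waiting engineers), with assumed prices.
C_{\mathrm{sol}} = p_E \, E_{\mathrm{sol}} + p_T \, T_{\mathrm{sol}}
% Example: architecture A finishes in 2 h using 20 kWh, architecture B in 3 h using 9 kWh.
% With p_E = 0.30\ \$/\mathrm{kWh} and p_T = 5\ \$/\mathrm{h}:
% C_A = 0.30 \cdot 20 + 5 \cdot 2 = 16\ \$, \qquad C_B = 0.30 \cdot 9 + 5 \cdot 3 = 17.7\ \$,
% so the faster machine wins here, but the ranking flips as energy prices rise.
```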

Engineers / human resource


Educating future engineers is a crucial challenge. All other challenges can be overcome or minimized if
the educational system provides industry and research with the next generation of CFD engineers. But
those engineers can no longer be purely CFD analysts in the traditional sense. Novel manufacturing
technologies (like additive manufacturing (AM) and 3D printing) and closer connectivity between CAE and CAD departments are
necessary for companies to stay competitive. Hence the profile of CFD engineers needs to include some
understanding of manufacturing and design processes, just like designers need to start embracing
simulation technologies. As AI penetrates the fluid dynamics engineering domain, the future CFD
engineer needs a thorough understanding to make the right decisions on the most efficient technology
for a given problem. The progress of CFD simulation heavily relies on this factor.

Generally, it is a challenge for society and industry to ensure that CFD engineering and computer science remain an (even more) attractive career path for the next generations. Talented and well-educated engineers are
crucial for humanity to cope with the challenges of climate change and its consequences.

Like • Reply

Dipankar Choudhury | ANSYS


We are far from DNS (Direct Numerical Simulation) resolution in today’s simulations. For every application,
there will be a level of resolution/modeling that is considered sufficient for engineering purposes. Some
will be carried out with RANS (Reynolds Averaged Navier-Stokes) models in a few minutes, others will
require ever-increasing compute power to get satisfactory results. The wide spectrum of run times will
therefore remain even with increasing compute power. A true game changer could be Quantum
Computing, but no reliable projections as to its impact on CFD can yet be made.

Complex multiphysics modeling, from the point of view of both mathematical-modeling precision and resource usage, will remain a challenge, particularly in the context of multi-disciplinary optimization (MDO).

The evolution of energy costs could also play a more significant role going forward: while compute
power will become cheaper, dramatically rising power costs would put a lid on the balance between cost
and accuracy. That balancing act might force more simulations away from scale-resolved simulations
using high-performance computing, and towards lower fidelity methods like RANS and even Reduced
Order Modeling. This would then reinforce the continued strong need for low-fidelity modeling. The
introduction of GP-GPU computing in HPC has led to very significant power and cost savings; however, it
is a well-known mantra in CFD that when more cost and power-efficient resources are available, problem
sizes grow to fully stretch the available resources!

Like • Reply



John Chawner | Cadence


There are the things we know and the things we don’t. We know that the physics will need to be improved, which means better modeling of turbulence, combustion, reactions, icing, etc. That means we are able to
move our research in that direction today. We don’t know exactly what the future computing platforms
will look like, meaning that we have to start implementing our software in a flexible way that allows us to
move to new architectures more efficiently than we do now.

Like • Reply

Enrico Nobile | ESTECO


Software development and maintenance will be a major bottleneck: developers are hard to find.
Furthermore, the heterogeneity of upcoming hardware introduces a new level of code complexity, especially
for solvers with a multitude of models and numerical methods.

Like • Reply

Dean Palfreyman | SIMULIA


Several significant hurdles must be overcome as we look to where CFD will be in 20 years.

Simulation model preparation/meshing is a major bottleneck in the development of a CFD simulation, particularly since the trend is for the inclusion of more complex and expansive geometry … think large
assemblies like full aircraft, full vehicles, highly complex electronic assemblies, etc. Today’s process is
highly manual and repetitive.

Bringing together best-in-class simulation technologies like RANS and LBM, along with other emerging numerical methods, into one workflow, and allowing users to select the most appropriate technology for a given simulation problem, is a nearer-term challenge but essential to exploiting the benefits for a broader class
of problems. The same is true for new compute hardware like GPUs in the near term and more exotic
hardware like quantum computers in the longer term. We foresee a massive escalation of computing
resources applied to CFD.

A primary barrier will continue to be how to accurately and efficiently handle turbulence modeling. This
has been an area of active research for more than 50 years. The trend is to rely less on modeling and
more on resolving turbulent scales because improved turbulence models have not yet materialized.

Harnessing data analytic technologies like machine learning to help automate processes and guide
users accordingly will be a challenge in ensuring the algorithm(s) “learn” sufficiently to be effectively and
robustly deployed for various workflows. The ability to handle large datasets and synthesize data will be another challenge; imagine running 20,000 simulations on 50 million elements overnight: how do we make effective use of all that data?

A key challenge today is sharing physics results from other disciplines – displacement, thermal loads,
etc. Engineers spend unnecessary time finding data and then using scripts to import and apply the data
to their simulation. Furthermore, the ability to link multiphysics across multiple scales hasn’t been fully
addressed from a broader workflow perspective.

Like • Reply


NAFEMS
...
Which single technology will have the biggest impact on
CFD in the next 20 years?

Simon Fischer | Siemens


It is difficult to name just one. It will be a mix, and the synergistic interaction, of various technologies. Cloud technology and the integration of CFD into the industrial metaverse will change how CFD is used and leveraged. Relatedly, further democratization and increased accessibility will significantly impact how CFD software evolves and is used to solve engineering problems. Along with this, we will see a growing infrastructure for real-time co-creation and collaboration in the engineering design space. This will further spark innovation through as-yet-unseen cross-pollination between international and interdisciplinary teams.

AI and ML will be another game-changer.

Additive Manufacturing (AM) has just opened up new ways of bringing product designs to life that are no
longer constrained by traditional manufacturing methods. Leveraging CFD-based topology optimization,
engineers can explore the possibilities for designs of uncompromised high-performance flow solutions.
While the production of such designs had been entirely unthinkable in the past, AM and 3D printing are opening the door to this new engineering era. AM and topology optimization make the flow itself the actual product designer, turning the traditional design-simulate-refine loop upside down.

The ultimate disruption will, however, result from the successful introduction of Quantum Computing into
the CFD simulation space. Going faster by orders of magnitude with CFD simulation will change the pace
of innovation in yet unseen ways.

Like • Reply

Dipankar Choudhury | ANSYS


In addition to the game-changing nature of hardware evolution, such as with GPU-based computing, a
much greater degree of automation in the entire simulation process will have an enormous impact on
CFD (and CAE in general). Expert knowledge will be freed of ‘manual labor’ and can be invested in creative thinking and design. Full automation will be achieved not only by further advancing geometry and meshing tasks, including advanced adaptation algorithms, but also by Machine Learning-based
methods for decision-making during the simulation set-up, execution, and post-processing. Automatic
optimization, including topology optimization, will increase the design space and define new geometries
and result in increased efficiency of designs.

Like • Reply

Massimo Galbiati | Particleworks


I think mesh-less approaches like Moving Particle Simulation (MPS) will have a big impact on what you
can simulate, in terms of how simple and fast you can do it. This will accelerate the development
process in different industrial sectors, starting with the development of electric powertrains. Software like Particleworks can reduce setup and simulation time by an order of magnitude, for example, in applications related to e-motor cooling. Industrial players will greatly benefit from
implementing this kind of method in their design process.

Like • Reply


The Piper
Alpha Disaster
The catastrophic loss in the North Sea of the Piper Alpha oil platform,
120 miles off the coast of Aberdeen, Scotland, claimed the lives of 167
people. On 6 July 1988, a small explosion caused secondary damage that
resulted in a second larger explosion and then a sustained major fire.
Lord Cullen led the subsequent public inquiry into the tragedy, and his
report outlined 106 recommendations for changes to safety procedures
in the UK sector of the North Sea. In the aftermath of Piper Alpha and
the Cullen Inquiry, a new focus was placed on modelling approaches for
predicting fire and explosion events, including the use of computational
fluid dynamics for mitigating the risk of fire and explosion damage. The
lessons learned following the disaster, through validation of numerical
tools via experimental programmes, have greatly improved our
understanding of explosions and mitigation methods, which has helped
to make our offshore facilities safer places to live and work.

nafe.ms/CFD100-8

Enrico Nobile | ESTECO


In my opinion, four technologies will have a very strong impact in the future:

Multidisciplinary Design Optimization and automation. Performing parametric modelling and simulation analyses such as CFD separately can become a time-consuming practice, leading to
delays in product development. By relying on simulation process integration and automation
technology included in software such as ESTECO modeFRONTIER, it is possible to seamlessly
integrate the most popular third-party engineering solvers into a unique, automated simulation
workflow. This allows an engineering analyst to automatically run repetitive simulations and
avoid the painstaking process of manually combining the output from multiple applications.

Simulation Process and Data Management (SPDM). When companies use digital technologies
to innovate engineering design processes, there is a need for a reliable platform that helps run
multidisciplinary simulations and manage huge amounts of data efficiently. With VOLTA, the ESTECO Enterprise platform for SPDM and design optimization, it is possible to scale up the
usage of simulation models and design exploration and optimization techniques across teams
and different organisations to deliver better products, faster.

AI and Deep ML. When approaching the design of a new product, knowing the performance potential and the development time upfront is key to staying competitive. At the same time, an in-depth understanding of the design space is reached by developing a set of optimized physical or virtual prototyping experiments that validate realistic product performance. The ESTECO Autonomous Optimization approach, based on Artificial Intelligence, guides the user to the optimal solution, freeing time and resources to focus on value-added tasks. The proprietary pilOPT algorithm combines multiple numerical investigation strategies to offer a smart exploration of the design space within CFD. Designers benefit from search capabilities in multiple scenarios in the explorative concept phase, when little knowledge about variable behaviour or problem characteristics is available and when resources, in terms of both computational capability and time, are scarce.

Multi-physics, meshless methods, either particle-based or not. Although at this time most
commercial or open-source meshless solvers are particle-based, other approaches will probably
gain popularity, for instance, Radial Basis Function-Finite Differences (RBF-FD).
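For readers unfamiliar with the idea, a brief generic sketch of RBF-FD (a textbook statement added for illustration, not an account of any particular solver): derivatives at a node are approximated by a weighted sum over a local cloud of scattered neighbours, with the weights obtained by requiring exactness for radial basis functions centred at those neighbours, so no mesh is needed.

```latex
% RBF-FD: approximate a differential operator L at node x_0 from n scattered
% neighbours x_1,...,x_n,
(L u)(x_0) \approx \sum_{j=1}^{n} w_j \, u(x_j),
% with weights w_j obtained from the small local linear system
\sum_{j=1}^{n} w_j \, \phi\!\left(\lVert x_k - x_j \rVert\right)
  = \left. L\,\phi\!\left(\lVert x - x_k \rVert\right) \right|_{x = x_0},
  \qquad k = 1, \dots, n,
% where \phi is the chosen radial basis function (polynomial augmentation omitted).
```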

Like • Reply


Michael Barkhudarov & C.W. Hirt | Flow Science


Quantum computing has advanced somewhat in recent years, and for some special problems has shown great
promise. For the vast majority of CFD problems, however, there will have to be many more quantum
particles introduced and controlled to represent all the complexity of real fluid dynamics. This area will
require considerably more work.

Like • Reply

John Chawner | Cadence


We probably shouldn’t center the discussion around one presumed key technology because we know
today that progress in several areas is needed to achieve all our goals. Advancements are needed in the
exploitation of HPC resources, physical modeling, numerical algorithms for accuracy and robustness,
geometry modeling and mesh generation for process automation, knowledge extraction and
management, and multidisciplinary frameworks and applications.

Like • Reply

Dean Palfreyman | SIMULIA


Looking forward 20 years, I don’t think any single technology will have the biggest impact; rather,
addressing a few core areas will be more meaningful:

• Highly automating the process of preparing a model for simulation will be critical to expanding the
use of CFD to most large geometric assemblies and accelerating the total turnaround time for a
simulation to affect the design process.

• Harnessing existing numerical technologies (RANS, LBM, etc.) into one workflow will be critical to
exploiting the best-in-class method for the problem. Additionally, building robust machine learning
algorithms to guide a user for a particular workflow and providing much deeper insights into large
amounts of data will be essential to democratizing simulation to new users as well as accelerating
the design process through ‘learning’ of past simulation results and converging faster on the
optimal design.

Like • Reply



NAFEMS
...
How will we ensure simulation predictions are fit for
purpose? How will this change in the next 20 years?

John Chawner | Cadence


Verification and validation are the responsibility of every CFD user today. We all have to do a better job of
ensuring that we are getting the right solutions to the right equations for our particular application. This
will involve partnerships between practitioners and software providers but also professional
organizations like NAFEMS who can help coordinate across all stakeholders.

Like • Reply

Massimo Galbiati | Particleworks


Validation of new CFD methods and models will have to go through comparison with real prototypes and with data from real products; this will not change. However, the definition of prototypes for software validation and the development of new CFD models will dynamically benefit from each other. The
software developers/vendors will work more closely with the users and both will benefit from data
exchange and open collaboration between the people in the field, in the lab, and the engineers using and
developing the software.

Like • Reply

Enrico Nobile | ESTECO


Verification and Validation (V&V), and code verification, will become more and more important if CFD becomes – as it should – a common technology in many different fields of our society. However, V&V is a costly activity that is sometimes overlooked, and this is a major problem: validation still requires experimental tests, and their cost, as is well known, is continuously increasing.

Like • Reply

Michael Barkhudarov & C.W. Hirt | Flow Science


Publicly available databases of Verification & Validation results in a standardized format would be very
helpful. They would allow engineers to improve their codes and users to compare and evaluate different
codes. AI could help with code verification.

Like • Reply


Simon Fischer | Siemens


Certification and verification processes for CFD simulation tools are well established. They will remain a
critical ingredient to the progress of CFD, its reliability and trust in digital twins, and its establishment in
novel areas. While predictive simulation will continuously reduce the need for expensive measurements
and prototyping, it will continue to require rigorous CFD methods and best practice validation through
experiment.

Compared to “traditional” CFD simulation, AI/ML-based fluid dynamics prediction will require significant validation. Due to the lack of an underlying abstraction, i.e., a governing equation or analytical description, AI and ML need engineers’ judgment to probe for critical areas outside the training range in certification processes.

Like • Reply

Dipankar Choudhury | ANSYS


Verification and validation of CFD results will remain critical in ensuring the reliability of CFD solutions.
We see promising new methods in uncertainty quantification (linked to deep solver methods such as
adjoints and sensitivities) that could potentially provide users with powerful tools for gauging confidence
levels in CFD simulations. Organizations like NAFEMS can play a key role in the process of improving fit-
for-purpose CFD simulation predictions by providing forums and mechanisms for discussing and
codifying industry best practices for CFD simulations, dissemination of information, and supporting
education in CFD.

Like • Reply

Dean Palfreyman | SIMULIA


Fitness for purpose must go beyond the functional use of the model to include its management, the
problem, and the project and industry context. The modeling must be useful and meet the end user's
needs to achieve an acceptable level of confidence.

From a process perspective, embedding industrial best practices – mesh criteria, numerics, turbulence
models, etc. – for a given simulation within the CFD software, coupled with understanding a user’s intent
as they build a simulation, will allow the CFD software to provide guidance to ensure the simulation
attributes are defined correctly.

Continued validation and verification of CFD, particularly as users move toward more complex multiphysics problems, will be essential to ensure the accuracy needed to meet customer needs. This will be particularly critical as customers move toward full digital certification of products and systems, and hence need to meet stringent accuracy requirements in order to replace physical tests.

Like • Reply



NAFEMS
...
What will be the application areas where CFD might
contribute most in the next 20 years?

Dipankar Choudhury | ANSYS


The global push for much greater environmental friendliness of products and processes and
sustainability should lead to CFD becoming even more critical in many of its classic application areas,
such as aerodynamics, propulsion, and power generation of all sorts, as new technologies are developed,
refined, and ready for the market in the shortest possible timescales. Also, the application of CFD to
environmental modeling will continue to increase in applications such as impact studies (of hazardous
releases, thermal plumes, etc.) and the design of mitigation equipment. Other high-growth areas include
applications in healthcare and medical devices and equipment design, as well as health and safety
analysis. Finally, high-tech areas such as electrification, autonomous vehicles, and 6G will see large
growth in CFD modeling and simulation.

Like • Reply

John Chawner | Cadence


It would be easy to answer with “planes, trains, and automobiles” because those application areas are
where CFD has traditionally been strong. There’s no reason to assume those strengths will diminish, especially as we are at the beginning of a paradigm shift in both planes and automobiles toward
electrification and energy-efficient platforms in general.

Biomedical applications including both devices and patient-specific scenarios are a potential area for the
growth of CFD, especially for pulmonary and hemodynamic applications. CFD can potentially deliver a lot
of benefits as long as some of the challenges discussed above are overcome such as complex geometry,
complex flow physics, and unsteady flows.

Like • Reply

Dean Palfreyman | SIMULIA


CFD is contributing heavily to industries such as Transportation & Mobility, Aerospace & Defense, and
High-Tech. The main drive will be toward much higher-fidelity methods – higher fidelity meaning spatial and temporal fidelity but also the inclusion of more physics and geometric complexity. We also expect a
proliferation of CFD simulation in other industries such as construction/building, cities, life sciences,
safety, and health care, to name a few.

Like • Reply


Massimo Galbiati | Particleworks


CFD will contribute most to the development of greener products, and for this reason the greatest impact
will be in the Energy and Transportation areas, two of the industrial sectors that need a revolution. CFD
will contribute to that revolution.

Further democratising the capability by developing UIs that allow many stakeholders to ask ‘what-if’
questions will also be key, i.e., the model is an asset that belongs to the company and not the modelling
team; thought needs to be given to how we provide sensible access to different stakeholders in the
company (strategists, finance, etc.).

Like • Reply

Enrico Nobile | ESTECO


One of the most promising areas where CFD will probably play a major role in the future will be medicine:
precision/personalized medicine (cardiovascular engineering and medicine, image-based CFD for
respiratory medicine, image-based CFD and FSI analysis in vascular diseases to name a few), medical
devices (design and in-silico testing of medical devices), and pharmaceutical CFD (e.g. drug delivery
devices).

Like • Reply

Michael Barkhudarov & C.W. Hirt | Flow Science


Climate/ocean modeling, environmental systems, including human and other biological factors.

Like • Reply

Simon Fischer | Siemens


A sustainable world and avoiding a climatic disaster cannot solely be achieved through engineering and
innovation. It also requires significant social and economic efforts. However, if humankind manages to
cope with climate change, CFD will have played an essential role in this achievement. Whatever we
humans do in our daily lives, from a technology standpoint, the (non)emission of CO2 and related climate
gases is at its heart governed by the fluid mechanics and thermodynamics of reacting flows. In almost
any industry, it will come down to solving a thermodynamics and flow problem, be it the reduction of emissions, their complete avoidance, or even the proactive removal of CO2 that has already been emitted. And that implies the
extensive usage of CFD to do this on relevant time scales and within economic constraints. Whether it is
energy production, transport, infrastructure, or production, fluid dynamics simulations at unthinkable
scales will be required to find sustainable solutions as fast as possible. And hence there is no single
application or industry where CFD will have the most significant impact; it will shape a more resource-
efficient way of living across the board. And even if humanity does not manage to stop climate change
and is faced with adapting to new environmental situations, fluid mechanics will be of paramount
importance.

Like • Reply



NAFEMS
...
What teaching and training will the CFD sector need in 20
years? What should the university syllabus include for
CFD?

Bipin Patel | EnginSoft UK Ltd


Universities should teach a variety of methods and approaches to solving the real-world problems of the
future rather than rigidly sticking to traditional methods. The universities are also influenced heavily by
commercial companies (e.g., Ansys, MATLAB), so many graduates have only one view of the world and
think that these tools are gospel.

Having more labs that encompass a variety of tools/methods would be the way forward. Vendors also
need to play their part by offering easy access to software in the cloud for students to explore. Academia
should not be seen as a revenue stream but as a knowledge development and exploration hub.

Like • Reply

John Chawner | Cadence


“It's tough to make predictions, especially about the future,” said Yogi Berra. With that caveat out of the
way, a lot of CFD for typical engineering products will be automated and integrated into CAD systems or
other software frameworks. So when students of the future learn how to use CAD software to design a
part, there will likely be a button that lets them run CFD on it quite easily.

The challenge this scenario presents is ensuring that students are able to use their evolving engineering
knowledge to assess whether or not the results computed are realistic. In my role on my alma mater’s
advisory board, I repeatedly advise that an undergraduate engineering education is not a trade school in
which students learn how to use specific software packages. They need to learn the fundamentals of
fluid dynamics and solid mechanics and thermodynamics to have the core knowledge they need to
assess the results of computer simulation. A case can be made that they need to learn programming
also to develop an appreciation for what goes into writing a CFD solver and all the different ways a solver
can fail.

And there will always be graduate courses in CFD development followed by research opportunities
involving the leading edge of the current state of the technology.

Like • Reply

Dean Palfreyman | SIMULIA


Dedicated CFD university programs should focus on: physics and fundamentals, so students can interpret results; geometry preparation – CAD tools are essential; a solid foundation in programming, preferably C++/Python; a strong foundation in numerical methods and CFD algorithms; and detailed consideration of physics modeling, pulling together these other aspects.

Broader than this is the use of simulation (all physics) in university syllabuses at all education levels to
help students understand numerical fundamentals and provide a deeper insight into physical
phenomena. Data analytics, artificial intelligence, and machine learning methods will help students gain
a deeper understanding of these emerging technologies as they apply to simulation.

Like • Reply


Enrico Nobile | ESTECO


From my personal experience – I have been teaching a full-semester CFD Graduate course
for the last 15 years – I think there are a few key points that should not be overlooked, even
when web-based, very easy-to-use (i.e., democratized) apps are available. These should be:
• fundamentals of common discretization methods in CFD (FEM, FV): major
advantages and limitations;
• meshing for mesh-based methods: quality indicators, structured and unstructured
(hybrid) meshes;
• meshless methods: fundamentals, typical use, advantages and limitations;
• modelling topics, to be selected according to the interest: turbulence, multi-phase,
combustion, chemical reactions, etc.;
• meaning and role of Verification and Validation (V&V) – they are frequently confused and/or misunderstood;
• type of errors and uncertainties;
• best practices, guidelines, and codes of standard for V&V.

Like • Reply

Dipankar Choudhury | ANSYS


Overall, many studies show that most universities do not adequately prepare engineering students in the familiarity with, and use of, engineering simulation, even though they graduate
and begin work in industries where the use of CFD is pervasive. The good news is that most
progressive curriculum committees in engineering schools have recognized this and are
working to introduce CFD even in the early years, for instance in Introductory Design
courses and then progressively enriching theoretical and experiential learning of CFD
through later years in fundamental physics courses (such as Fluid Mechanics and Heat
Transfer), culminating in senior design or capstone courses and courses focused on
computational engineering and best practice. Concurrently, there is a significant increase in
the availability of professional certificate programs for continuing education as well as
online courses for self-learners. An example of this is the Ansys Innovation Courses that
are provided online to all.

Like • Reply



NAFEMS ...

What will happen in the next 50 - 100 years?

Simon Fischer | Siemens


Any yet unforeseeable hardware technology disruption that lifts parallel computing onto a new level will
have massive implications for CFD and its impact on engineering.

Undoubtedly, the Navier-Stokes equation is a brilliant piece of math and physics. Given its impact on the way we can predict the motion of fluids and, based on that, engineer products, it is an outstanding success. But deep down we all know that it remains an approximation. Describing the motion of fluids as a continuum is, after all, a non-analytical description of reality and hence, from a theoretical standpoint, poses the risk that we miss fundamental elements that result from the complex interaction of billions of molecules. In all honesty, we often forget about this approximation; however, it is both an obvious and an impactful one. No doubt, this was the historical achievement of Navier: merging the pragmatic hydrodynamics engineering world with theoretical fluid dynamics through the introduction of a viscosity concept. An ingenious idea. But the fundamental underlying continuum assumption for fluids forced us to introduce concepts like turbulence and viscosity to overcome the deficiencies in the Navier-Stokes equation and make it of some practical use.
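
For readers who want the equation in front of them, the incompressible, constant-viscosity form usually quoted today can be written as follows (a standard textbook statement added here for reference, not taken from the author):

\[
\nabla \cdot \mathbf{u} = 0, \qquad
\rho \left( \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u} \right)
= -\nabla p + \mu \nabla^{2}\mathbf{u} + \mathbf{f},
\]

where \(\mathbf{u}\) is the velocity, \(p\) the pressure, \(\rho\) the density, \(\mu\) the dynamic viscosity, and \(\mathbf{f}\) a body force. The \(\mu \nabla^{2}\mathbf{u}\) term is precisely the viscosity concept referred to above, and turbulence modelling enters when these equations are averaged or filtered rather than resolved directly.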

Now, obviously with sufficient computational power one could solve Newton’s equation for each and
every molecule and, provided we know the intermolecular forces, predict the bulk motion. Today's
computational chemistry codes do exactly that, even going beyond length and time scales where
quantum effects become relevant. But even within the next 100 years it might be a bit naïve to expect
this to be a feasible solution for the external aerodynamics of a car.
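
As a toy illustration of that idea (my own sketch, not an industrial code, and many orders of magnitude away from engineering scales), a classical molecular-dynamics step is just Newton's second law integrated for every particle, here in Python with the common Lennard-Jones potential standing in for the intermolecular forces; all names and parameter values are illustrative.

# Toy molecular-dynamics sketch (illustrative only): a handful of particles,
# Lennard-Jones intermolecular forces, velocity-Verlet integration of F = m*a.
# No neighbour lists, no periodic boundaries - nothing a production code would need.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    # Pairwise Lennard-Jones forces on each particle.
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r2 = np.dot(r_vec, r_vec)
            sr6 = (sigma * sigma / r2) ** 3
            f = 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r2 * r_vec
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet(pos, vel, mass=1.0, dt=1e-3, steps=1000):
    # Integrate Newton's second law for every particle.
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel

# 16 particles on a small lattice, spacing close to the Lennard-Jones minimum.
positions = 1.2 * np.array([[i, j, k] for i in range(2)
                            for j in range(2) for k in range(4)], dtype=float)
velocities = np.zeros_like(positions)
positions, velocities = velocity_verlet(positions, velocities)
print(positions[:3])

The gulf between these few dozen particles and the roughly 10^25 molecules in a cubic metre of air is exactly why the continuum description remains indispensable.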

But what if, some day in the coming 100 years, someone comes along with something that fits in
between? Something that closes the gap between molecular dynamics and continuum descriptions,
something that absorbs Navier-Stokes into a more generalized, more complete, and more powerful
description of fluids and puts the description of turbulence on analytic grounds rather than modeling
attempts…?

Thanks to all the people who helped me gaze into the crystal ball for the future of CFD on behalf of Siemens
through great and insightful conversations, comments, and their publications: Patrick Niven, Boris Marovic,
Jens Prager, Bastian Thorwald, Samir Muzaferija, Christina Kothlow, Justin Hodges, Sylvain Lardeau, Ravi
Shankar, and Jean-Claude Ercolanelli.

Like • Reply

Bipin Patel | EnginSoft UK Ltd


• Complex models will be used in apps that allow anyone to ask ‘what-ifs’.
• Increased use of Digital Twins utilising CFD, e.g., changing an EV’s parameters via the cloud to maximise efficiency based on the operating conditions.
• Homes could be continually optimised for their energy requirements and performance.

Like • Reply


Enrico Nobile | ESTECO


In 50 years:
• cloud-based CFD apps, accessible everywhere from every device (PCs, tablets, smartphones), supported by ML features that will assist/guide the user in all modelling aspects, i.e., domain and model(s) selection, BCs, material properties, etc.;
• proper post-processing and data/feature extraction;
• transparent management of process and data from simulations.

In 100 years: multi-physics, multi-disciplinary, cloud-based (maybe hosted on quantum computers), real-time CFD accessible everywhere from every device, in particular wearable devices (Google-style smart lenses), with the capability, enabled by AI and ML, to recognize and reverse-engineer the systems and environments in front of the user; advanced augmented/artificial reality.

Like • Reply

Michael Barkhudarov & C.W. Hirt | Flow Science


• More online, on-demand CFD tools becoming available, tailored to specific applications.
• Different levels of solution fidelity within the same tool – from quick estimates to detailed
calculations.
• Seamless data sharing.
• Dramatic cost reduction of the tools.
• Commercially, CFD will survive only as an integral part of larger design and development tools.

Like • Reply

Dean Palfreyman | SIMULIA


In the next 50 years, we’ll see a massive proliferation of CFD across all industries. We foresee almost complete virtual certification of products. CFD will be used to provide deep insights into physics phenomena in industries like Life Sciences, where CFD will play a key role in personalized medicine; for example, scans of human anatomy will be meshed and simulated to identify potential health issues. Cities are another area of focus, where city planners will use CFD to optimize a city’s layout to better harness wind, sun, and other natural resources and reduce energy consumption.

In 100 years: a complete digital twin running in parallel with live monitoring to predict the future. The world in 100 years will be based on "What decision can I make by predicting tomorrow's scenario using today's data?" It’s a multiparametric approach to support the next-minute decision using live data.

Like • Reply


Dipankar Choudhury | ANSYS


In the next 50 years, we should see the emergence of novel architectures, such as quantum computers, and of novel algorithms that take advantage of them, providing a massive jump in compute capacity and throughput. With that, we should increasingly see CFD being used pervasively, with higher precision, and
with response times that allow embedding in products, control systems, and VR/AR systems. We should
also see a much higher degree of multidisciplinary optimization studies and prevalent use of machine
learning methods deep in the heart of CFD software systems. Finally, we should see higher precision
mathematical models and solution methods that combine multiple scales (such as the combination of
discrete and continuum methods) thus enabling significantly better predictions for multiscale and
multiphase flows.

In 100 years: DNS for widespread industrial use, perhaps? Today, DNS is not practical except for simple problems and flow conditions at lower speeds. If the trend of computing power doubling roughly every two years continues for the foreseeable future, we should have the wherewithal to tackle even complex industrial cases.
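
As a rough back-of-the-envelope check of that trend (my own sketch, not the author's: it assumes the often-quoted estimate that DNS cost grows roughly with the Reynolds number cubed and that the two-year doubling really does continue, neither of which is guaranteed):

# Back-of-the-envelope sketch, assumptions rather than data: if compute doubles
# every two years, how much does the affordable Reynolds number for DNS grow?
# Uses the common estimate that DNS cost scales roughly as Re**3.
def compute_growth(years, doubling_period_years=2.0):
    return 2.0 ** (years / doubling_period_years)

def reynolds_gain(compute_factor, cost_exponent=3.0):
    # If cost ~ Re**cost_exponent, the affordable Re grows as compute**(1/cost_exponent).
    return compute_factor ** (1.0 / cost_exponent)

for years in (50, 100):
    growth = compute_growth(years)
    print(f"{years} years: ~{growth:.1e}x compute, "
          f"~{reynolds_gain(growth):,.0f}x higher affordable Re")

Under those assumptions, a century of doubling buys roughly fifteen orders of magnitude in compute and about five orders of magnitude in affordable Reynolds number, which is why the prediction is hedged rather than guaranteed.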

Like • Reply

John Chawner | Cadence


Some technologies advance in leaps while others seem to barely make progress. What’s interesting to note, based on my career, is that a 2D simulation I did right out of school at my first job took me a year, and it never gave good results. The same simulation would probably be an undergraduate homework problem today. In 100 years, the cop-out is to simply hand-wave that everything will be automated. After all, it was only 66 years between the Wright brothers' first flight and the Moon landing.

Like • Reply

Climate, Carbon, and CFD


As we’ve outlined in this issue of Benchmark, numerical models to simulate the weather were one
of the first uses of high-performance computing for everyday decision making. Today, the
descendants of these models are used to make daily decisions affecting our social and economic
well-being, as well as safety-critical activities. The climate versions of these models are being used
to understand our impact on the environment. They have profound implications for the decisions
that we make about future energy supply and other economic and technological development. The
weather and climate community has made significant progress in improving predictions and
quantifying uncertainty in recent years. The improvements are driven by a combination of factors –
better understanding of the science, the ability to represent that science in computer models, and
the rapid improvement in computer power allowing ever greater detail to be represented in the
models. In the battle to slow climate change and move to net zero, CFD and its predictive
capabilities will become ever more important.

nafe.ms/CFD100-9

