

V. Kalro, and T. Tezduyar, “Parallel 3D computation of unsteady flows around circular
cylinders”, Parallel Computing, 23 (1997) 1235–1248,
http://dx.doi.org/10.1016/S0167-8191(97)00050-1

Corrected/Updated References

1. M. Behr, A. Johnson, J. Kennedy, S. Mittal and T. Tezduyar, "Computation of Incompressible Flows with Implicit Finite Element Implementations on the Connection Machine", Computer Methods in Applied Mechanics and Engineering, 108 (1993) 99-118.

2. T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson and S. Mittal, "Massively Parallel Finite Element
Computation of Three-dimensional Flow Problems", Proceedings of the 6th Japan Numerical Fluid
Dynamics Symposium, Tokyo, Japan (1992).

4. T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson and S. Mittal, "Parallel Finite Element Computation of
3D Flows", Computer, 26 (1993) 27-36.

5. V. Kalro and T.E. Tezduyar, "Parallel Finite Element Computation of 3D Incompressible Flows on
MPPs", Solution Techniques for Large-Scale CFD Problems (ed. W.G. Habashi), John Wiley & Sons
(1995); also in Proceedings of the International Workshop on Solution Techniques for Large-Scale CFD
Problems, Montreal, Quebec, Canada (1994).

6. S. Mittal and T.E. Tezduyar, "Massively Parallel Finite Element Computation of Incompressible
Flows Involving Fluid-Body Interactions", Computer Methods in Applied Mechanics and Engineering,
112 (1994) 253-282.

7. T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson, V. Kalro, and C. Waters, "3D Simulation of Flow
Problems with Parallel Finite Element Computations on the Cray T3D", Computational Mechanics '95,
Proceedings of International Conference on Computational Engineering Science, Mauna Lani, Hawaii
(1995).

8. V. Kalro and T. Tezduyar, "Parallel Sparse Matrix Computations in Finite Element Flow Simulations
on Distributed Memory Platforms", Proceedings of the International Conference on Numerical Methods
in Continuum Mechanics, High Tatras, Slovakia (1996).

10. T.E. Tezduyar, "Stabilized Finite Element Formulations for Incompressible Flow Computations",
Advances in Applied Mechanics, 28 (1992) 1-44.
PARALLEL COMPUTING
Parallel Computing 23 (1997) 1235-1248
Parallel 3D computation of unsteady flows around circular cylinders

V. Kalro *, T. Tezduyar 1

Department of Aerospace Engineering and Mechanics, Army High Performance Computing Research Center, University of Minnesota, 1100 Washington Avenue South, Minneapolis, MN 55415, USA

Received 22 August 1996; revised 18 February 1997

Abstract

In this article we present parallel 3D finite element computation of unsteady incompressible flows around circular cylinders. We employ stabilized finite element formulations to solve the Navier-Stokes equations on a Thinking Machines CM-5 supercomputer. The time integration is based on an implicit method, and the coupled, nonlinear equations generated every time step are solved iteratively, with an element-vector based evaluation technique. This strategy enables us to carry out these computations with millions of coupled, nonlinear equations, and thus resolve the flow features in great detail. At Reynolds numbers 300 and 800, our results indicate strong 3D features arising from the instability of the columnar vortices forming the Karman street. At Re = 10000 we employ a large eddy simulation (LES) turbulence model. © 1997 Elsevier Science B.V.

Keywords: Large-scale problems; Parallel computations; 3D cylinder flows; LES

1. Introduction

Large-scale flow simulations have reached levels unimaginable only a decade ago, owing to the major advances in high performance computing (HPC) technology. Modern supercomputing platforms harness hundreds of widely available computer chips to provide tremendous resources both in terms of raw computing power and memory. These workhorses are aptly termed massively parallel processors (MPP). It is believed that in the near future these machines will be capable of sustaining TeraFLOPs performance.

* Corresponding author. Tel.: +1-612-626-8096; fax: +1-612-626-1596; e-mail: kalro@ahpcrc.umn.edu.
1 E-mail: tezduyar@ahpcrc.umn.edu.


The programming style on MPPs varies extensively from machine to machine; thus, advances in the development of efficient parallel algorithms are vital for optimal utilization of the new HPC technology. A single-instruction-multiple-data (SIMD) style of programming within a data parallel paradigm is described in Behr et al. [1]. A number of flow simulations using related finite element techniques were reported recently [2-6]. Simulations utilizing multiple-instruction-multiple-data (MIMD) programming styles are presented in Refs. [7,8]. Our computations here are based on stabilized finite element formulations, which were described earlier [5]. These stabilized techniques [9,10] suppress the numerical oscillations in the field variables by introducing minimal numerical diffusion while maintaining consistency, in the sense that the stabilizers are weighted residuals of the governing equations. The coupled nonlinear equations generated at every time step are solved iteratively with a GMRES [11] update technique, and by computing the residuals based on matrix-free (element-vector based) evaluations.
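To illustrate what "element-vector based" means in practice, the following is a minimal sketch of a matrix-free matrix-vector product that a Krylov solver such as GMRES can call in place of an assembled global matrix. It is one common realization of the idea, not necessarily the exact scheme used in this work; the connectivity layout and element-matrix storage are assumptions.

```python
import numpy as np

def element_vector_product(conn, elem_mats, p):
    """Matrix-free y = A @ p: gather, multiply by element arrays, scatter.

    conn      : (n_elem, nen) element-to-node connectivity (hypothetical layout)
    elem_mats : (n_elem, nen, nen) element-level operator blocks
    p         : (n_nodes,) Krylov direction vector
    """
    y = np.zeros_like(p)
    for e, nodes in enumerate(conn):
        pe = p[nodes]              # gather nodal values for this element
        ye = elem_mats[e] @ pe     # element-level product; no global matrix stored
        np.add.at(y, nodes, ye)    # scatter-add the element contribution
    return y
```

Each GMRES iteration then only needs this gather-multiply-scatter loop, which parallelizes naturally over elements and avoids forming or storing a global sparse matrix.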
While the new technology has empowered us to attempt very large-scale simulations around fairly complex geometries, the issue of turbulence is yet to be completely resolved. With direct numerical simulation (DNS), even the most powerful machines available today can handle only very moderate Reynolds numbers. Thus one has to resort to turbulence models to account for the subgrid-scale structures in the flow. In this paper the Smagorinsky [12] large eddy simulation (LES) model is used to represent the turbulence effects. This model is used here in its simplest form with a constant coefficient (the interested reader is referred to Germano [13] and Moin [14] for discussions on more sophisticated dynamic subgrid-scale models).
We present 3D finite element computation of unsteady incompressible flows around a classical geometry, namely the circular cylinder. These flows are studied for a number of reasons. Firstly, a large amount of experimental data is available for the cylinder; this aids in validating the accuracy of the formulations and the various modeling assumptions. Secondly, although this geometry is simple, the associated flow fields are rich in fundamental phenomena which form the building blocks of more complex flows. Hence, base flows like these continue to attract the attention of fluid mechanics researchers.

Until recently most numerical simulations have been restricted to 2D cases (see Mittal [15] for example) due to lack of computational power, whereas reported experiments indicate the presence of strong 3D features above Re ≈ 200. Recently there has been renewed interest in the 3D nature of the cylinder wake [16-18]. With the HPC tools we now possess, it is possible to resolve the flow features at moderately large Reynolds numbers in sufficient detail. We investigate the unsteady flows past a stationary cylinder at Reynolds numbers 300 and 800. At Re = 10000, our simulation is based on the LES model mentioned above.

All simulations are computed on the Thinking Machines CM-5.

2. 3D flows at low Reynolds numbers


At low Reynolds numbers one observes symmetric attached eddies aft of the cylinder. At around Re = 40, the wake becomes unstable and begins to oscillate. These oscillations roll up into discrete vortices which are periodically shed when Re > 60. This

Fig. 1. Mesh for computing flow at Re = 300 and 800 (197,948 nodes and 186,240 elements) (x-y plane in top image and x-z plane in bottom image).

uniformly spaced trail of vortices is referred to as the Karman vortex street. Roshko [19] first reported a detailed measurement of the Strouhal number corresponding to the vortex shedding for a wide range of Reynolds numbers. Since then researchers have reported frequencies which differ by as much as 20%; further, a discontinuity in the St-Re correlation has also been recorded. Williamson [20] attributes these findings to the boundary conditions at the cylinder ends, and to the 3D effects which appear at Re ≈ 190. He observed both oblique and parallel vortex shedding modes, depending on whether the central flow could match up with the end conditions. By using end plates, he was able to coerce the flow to obtain parallel vortex shedding.
In the simulations discussed here, the cylinder spans the entire crossflow domain, and the boundary conditions used enforce parallel shedding. Computations at Re = 300 and 800 were performed on a mesh of hexahedral elements (see Fig. 1). The diameter of the cylinder is 2 units and the length is 8 units (this length was chosen after some trials so as to capture a few wavelengths along the cylinder axis). The mesh extends 15 units upstream, 60 units downstream, and 30 units in the crossflow direction. The mesh consists of 186,240 elements (4656 elements in a 2D section, 40 along the span of the cylinder and 80 along its circumference) and 197,948 nodes, and results in 760,107 equations. The radial thickness of the first layer of elements is 0.01 units. The boundary conditions consist of uniform inflow velocity, zero normal velocity and zero shear stress at the lateral boundaries, traction-free conditions at the outflow boundary, and no-slip on the cylinder.

For these simulations the time step is set to 0.05 and the inflow velocity to 1.0. The number of nonlinear iterations at each time step is 3, and the size of the Krylov space is 10, with no restarts. These computations require 13.7 s/time step on a 256 PN CM-5.
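As a quick back-of-the-envelope cost estimate (my arithmetic, not a figure reported in the paper), the 1500-time-step comparison interval used below corresponds to roughly

1500 steps × 13.7 s/step ≈ 2.1 × 10^4 s ≈ 5.7 hours

of wall-clock time on the 256-processing-node CM-5.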

Fig. 2. Flow at Re = 300: time histories of the force coefficients.

Recently Williamson [18] inferred the presence of two instabilities, which are termed mode A (appearing at Re ≈ 190) and mode B (at Re ≈ 260). In mode A, the instability arises due to the amplification of spanwise disturbances by the strain rate fields surrounding the vortex cores. In mode B, the structures arise from instabilities associated with the vorticity layer between successive cores. The mode A instability is marked by streamwise loops of vorticity of wavelength ≈ 4.0D; the mode B instability is comprised of strands of streamwise vortex pairs with characteristic wavelength ≈ 1.0D, where D is the cylinder diameter. The Strouhal number from the present computations and the topology of the wake structure at Re = 300 are consistent with mode B.

2.1. Flow at Re = 300

This computation was advanced in time until it was observed that the spatially averaged aerodynamic coefficients reached a periodic state (see Fig. 2). Comparisons are made with time histories of force coefficients obtained from a 2D simulation (see Fig. 3) over a time interval spanning 1500 time steps. The agreement between the 2D and 3D simulations is good, since the 3D features are weak and the vortex cores are nearly aligned with the cylinder axis in the near wake. The Strouhal number from the 2D
Fig. 3. Flow at Re = 300: time histories of the force coefficients from the 2D simulation.



Fig. 4. Flow at Re = 300: magnitude of the vorticity at equally spaced instants during a shedding period (frames 1, 2, 3, 4 on the left and 5, 6, 7, 8 on the right).

simulation is 0.202. In the 3D simulation, a Karman vortex street is observed in the periodic state, and the corresponding Strouhal number is 0.203 (experiments ≈ 0.2).
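As an illustration of how such a shedding frequency can be extracted from a computed time history (a minimal sketch, not the authors' post-processing code; the signal array, time step, and normalization below are placeholders):

```python
import numpy as np

def strouhal_number(cl, dt, D=2.0, U=1.0):
    """Estimate St = f*D/U from the dominant peak of a lift-coefficient spectrum.

    cl : sampled lift-coefficient time history (one value per time step)
    dt : time-step size; D, U : cylinder diameter and inflow speed (2.0 and 1.0 here)
    """
    cl = np.asarray(cl) - np.mean(cl)          # remove the mean before transforming
    spec = np.abs(np.fft.rfft(cl))
    freqs = np.fft.rfftfreq(len(cl), d=dt)
    f_shed = freqs[np.argmax(spec[1:]) + 1]    # dominant peak, skipping the zero bin
    return f_shed * D / U
```

With the inflow speed of 1.0 and cylinder diameter of 2 units used in these simulations, a spectral peak near f ≈ 0.1 corresponds to St ≈ 0.2, consistent with the values quoted above.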
Visualization of the vorticity isosurfaces (see Fig. 4) shows the separated shear layer with the appearance of secondary instabilities in the form of spanwise waves. Approximately 4-5 waves are captured with the present mesh. These waves increase in amplitude downstream, resulting in the release of wavy columnar vortices. The frames taken at different instants during a shedding period reveal the process of vortex formation and the associated instabilities.

Fig. 5. Flow at Re = 300: time histories and power spectra of u', v', w' at a point downstream of the cylinder along the centerline.

To get a measure of the 3D effects, the time histories of the velocity and pressure at 2.85 units downstream of the cylinder along the centerline (see Fig. 5) are traced. Note that the 'v' component of velocity has a frequency half that of 'u', which is due to the alternating signs of the vortices (this is also the reason why the force coefficient in the y direction has double the period of the drag). Furthermore, the power spectra show the existence of two dominant frequencies for 'w'; one of them corresponds to the shedding frequency and the other is twice that.

Fig. 6 shows the flow vectors in different spanwise planes (parallel to the cylinder axis and perpendicular to the free stream flow). We can observe the appearance of regularly spaced cells at x/d = 4.60. These cells are split into pairs of weak streamwise counter-rotating vortices. These resemble the Taylor vortices seen between concentric rotating cylinders and are also reported in Ref. [21]. The wake structure retains its regularity further downstream (x/d = 6.85), however the cells are elongated, plausibly

Fig. 6. Flow at Re = 300: velocity vectors in spanwise planes at x/d = 4.60 (left), 6.85 (center), 11.36 (right).

Fig. 7. Flow at Re = 800: time histories of the force coefficients.

due to the widening distance between the vortex cores. Further downstream, at x/d = 11.36, the wake structure is quite different and one can observe the arrangement of streamwise vortices in a 'cross' formation.

2.2. Flow at Re = 800

The time histories of the force coefficients for this case are shown in Fig. 7. These are no longer as uniform as those measured at Re = 300; the 3D features are stronger. The time histories of the force coefficients are compared over 2500 time steps with those obtained from the 2D simulation (Fig. 8). Since at higher Reynolds numbers the boundary layer is sharper, the velocity gradients are larger, resulting in the release of stronger vortices. As a result, the amplitudes of the force coefficients for the 2D simulation are larger than those at Re = 300. However, this is not seen in the 3D simulation since the vortices are significantly distorted and possess components besides that in the spanwise direction. The Strouhal number from the 2D simulation is 0.217; this is higher than that reported in experiments (≈ 0.2).

Fig. 9 shows the isosurfaces for two different vorticity values. There is a breakdown

Fig. 8. Flow at Re = 800: time histories of the force coefficients from the 2D simulation.

Fig. 9. Flow at Re = 800: vorticity isosurfaces for |ω| = 0.50 (left) and 2.0 (right).

in the regular spanwise structure observed at Re = 300. We can also observe distinct strands of streamwise vorticity (|ω| = 0.5). In the right image the distortion of the spanwise vortex cores to form loops is observed.

The increased 3D effects are also indicated by the 'w' component of velocity (see Fig. 10). The power spectra of 'w' indicate the existence of multiple peaks. However, the dominant frequency in the spectra still corresponds to the vortex shedding frequency. The presence of a peak at a frequency half that of the dominant frequency is also observed. The Strouhal number corresponding to the shedding frequency is 0.203 and agrees well with experiment.

Fig. 11 shows instantaneous velocity vectors in spanwise planes in the vicinity of x/d = 1.24. The presence of counter-rotating streamwise vortex pairs is seen. However, these vortex pairs are not arranged in a regular fashion. They merge together and cancel

Fig. 10. Flow at Re = 800: time histories and power spectra of u', v', w' at a point downstream of the cylinder along the centerline.

Fig. 11. Flow at Re = 800: velocity vectors in spanwise planes at four stations in the vicinity of x/d = 1.24 (top left, top right, bottom left, bottom right). Observe the annihilation of the counter-rotating vortex pair to the right.

each other. These 'mushroom' shaped vortex pairs are characteristic of turbulent flows.

3. LES of flow around a cylinder at Re = 10000

As the Reynolds number increases, entities with diminishing scales appear in the flow. If one desires to capture all the flow details without any modification of the formulations, very fine grids need to be utilized. An approximation to the number of grid points required is given by ~Re^(9/4). This comes from the requirement of resolving

the Kolmogorov scale (which is the scale at which all the turbulent energy is dissipated). An estimate of this scale, based on dimensional analysis, is discussed in Ref. [22].
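As a rough worked example of this estimate (my arithmetic, assuming the Re^(9/4) scaling quoted above):

N ~ Re^(9/4) = (10^4)^(9/4) = 10^9 grid points at Re = 10^4,

roughly three orders of magnitude beyond the ~10^6-node meshes used in this study, which is why the Re = 10000 computation below relies on LES rather than DNS.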
Numerical methods which attempt to resolve the fine features without any special treatment (except mesh refinement) are called direct numerical simulations (DNS). Although the supercomputing architectures of today provide gigabytes of memory, this is barely sufficient for DNS of simple flows. Hence, one often resorts to resolving the large-scale features of the flow, and representing the effect of the unresolvable scales via turbulence models. This is precisely the philosophy behind LES models. In this class of methods, spatial filtering is applied to the Navier-Stokes equations, and a Smagorinsky model is used to represent the subgrid-scale eddies. This results in a localized modification of the effective viscosity:

ν_T = (C_s h)^2 √(2 ε(u) : ε(u)),   (1)

where ε(u) is the strain rate tensor, C_s = 0.15, and h is the representative element length.
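A minimal sketch of Eq. (1) at a single point, assuming the standard Smagorinsky form with the strain-rate tensor ε(u) as reconstructed above (the velocity-gradient input and the way h is measured are placeholders, not the paper's implementation):

```python
import numpy as np

def smagorinsky_viscosity(grad_u, h, Cs=0.15):
    """Eddy viscosity nu_T = (Cs*h)^2 * sqrt(2 eps:eps) at one evaluation point.

    grad_u : (3, 3) velocity gradient tensor du_i/dx_j (placeholder input)
    h      : representative element length
    Cs     : Smagorinsky constant (0.15 in this paper)
    """
    eps = 0.5 * (grad_u + grad_u.T)                       # strain-rate tensor eps(u)
    return (Cs * h) ** 2 * np.sqrt(2.0 * np.sum(eps * eps))
```

In a finite element setting this eddy viscosity would typically be evaluated element by element (or at quadrature points) and added to the molecular viscosity.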
LES is unlike Reynolds-averaged Navier-Stokes (RANS) simulations, where the entire range of turbulence scales is modeled [14] and only mean flow features are resolved. LES simulations typically require finer grids than RANS computations and are able to capture the unsteady turbulent effects.
The LES computations presented here were carried out on a mesh with the same dimensions as those used for the previous simulations. Fig. 12 shows two views of the mesh. This mesh consists of 1,177,600 hexahedral elements (14,720 elements in a 2D section, 80 along the cylinder span and 160 along its circumference) and 1,213,056

Fig. 12. Mesh for computing flow at Re = 10000 (1,213,056 nodes and 1,177,600 elements) (x-y plane in top image and x-z plane in bottom image).

Fig. 13. LES of flow at Re = 10,000: time histories of the aerodynamic coefficients.

nodes, and results in 4,750,535 equations. The radial thickness of the first layer of elements along the cylinder is 0.001. Boundary conditions are the same as in the two lower Reynolds number cases.

The inflow velocity is set to 1.0, the time step to 0.025, and the Krylov space to 15 with no restarts. Two nonlinear iterations are performed at every time step. Computations proceeded at 37.7 s/time step on a 512 PN CM-5.

Since a 3D computation on such a fine mesh is costly, an initial condition was generated by projecting a 2D unsteady solution computed on a spanwise section (with

Fig. 14. LES of flow at Re = 10,000: velocity vectors in chordwise planes at two spanwise stations, -4.00 (left) and 1.75 (right).

Fig. 15. LES of flow at Re = 10000: velocity vectors in spanwise planes at x/d = 0.96 (left), 1.24 (right).

Fig. 16. LES of flow at Re = 10000: vorticity isosurfaces for |ω| = 0.5 (left) and 2.0 (right).

14,720 elements) of the 3D mesh. Fig. 13 shows the time histories of the force and moment coefficients over 4000 time steps. The time-averaged drag coefficient is 1.05 and compares well with experiment [23] (≈ 1.12). The Strouhal number is 0.204 and also concurs with experimental measurements [24] (≈ 0.2).
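The 2D-to-3D projection described above amounts to replicating the 2D nodal field at every spanwise node layer of the structured mesh. A hedged sketch of that initialization follows; the section-major node ordering assumed here is hypothetical, not taken from the paper:

```python
import numpy as np

def extrude_2d_solution(u2d, n_span_layers):
    """Initialize a 3D nodal field by copying a 2D section solution along the span.

    u2d           : (n_section_nodes, ndof) unsteady 2D solution on one spanwise section
    n_span_layers : number of node layers along the cylinder span
    Assumes nodes are numbered section by section (hypothetical ordering).
    """
    return np.tile(u2d, (n_span_layers, 1))   # (n_span_layers * n_section_nodes, ndof)
```

The projected field only provides a starting point; the 3D structures then develop during the subsequent time marching.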

Visualization of the flow field in chordwise planes (perpendicular to the cylinder axis) reveals fine-scale structures. There is a clear demarcation between the turbulent wake and the laminar outer flow (see Fig. 14). We observe the turbulent formation (recirculating) region bounded by shear layers. The shear layers roll up to produce small-scale vortices at the edge of the formation zone. These vortices cause entrainment of free-stream fluid into the recirculating zone. The flow features vary significantly along the span, marked by the appearance of small-scale vortices at varying locations. The flow on the cylinder separates at an angle ≈ 80° (measured from the leading stagnation point).

Visualization of the flow in the spanwise planes also brings out the presence of the fine-scale structures (see Fig. 15). These fine-scale structures amalgamate by viscous action and only the large-scale streamwise vortices are discernible further downstream. Fig. 16 shows the isosurfaces for two different vorticity values. The features are very similar to those at Re = 800; however, the structures are more diffuse due to the increased turbulence.

4. Concluding remarks

With our parallel computations, we were able to capture the strongly 3D character of the wake of a circular cylinder. There were significant differences between the 2D and 3D simulations at higher Reynolds numbers. For the 3D simulations, the drag coefficients and Strouhal numbers were in excellent agreement with experimental results for all cases, including the LES computations.

Acknowledgements

This research was sponsored by ARPA under NIST contract 60NANB2D1272 and by the Army High Performance Computing Research Center under the auspices of the Department of the Army, Army Research Laboratory cooperative agreement number DAAH04-95-2-0003 / contract number DAAH04-95-C-0008, the content of which does not necessarily reflect the position or the policy of the government, and no official endorsement should be inferred. CRAY C90 time was provided in part by the University of Minnesota Supercomputer Institute.

References
[1] M. Behr, A. Johnson, J. Kennedy, S. Mittal, T.E. Tezduyar, Computation of incompressible flows with implicit finite element implementations on the Connection Machine, Comput. Meth. Appl. Mech. Eng. 108 (1993) 99-118.
[2] T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson, S. Mittal, Massively parallel finite element computation of three-dimensional flow problems, Proc. 6th Jpn. Numer. Fluid Dynamics Symp., Tokyo, Japan, 1992, pp. 15-24.
[3] Z. Johan, Data parallel finite element techniques for large-scale computational fluid dynamics, Ph.D. thesis, Department of Mechanical Engineering, Stanford University, 1992.

[4] T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson, S. Mittal, Parallel finite element computation of 3D flows, IEEE Comput. 26 (10) (1993) 27-36.
[5] V. Kalro, T. Tezduyar, Parallel finite element computation of 3D incompressible flows on MPPs, in: W.G. Habashi (Ed.), Solution Techniques for Large-Scale CFD Problems, Computational Methods in Applied Sciences, Wiley, 1995.
[6] S. Mittal, T.E. Tezduyar, Massively parallel finite element computation of incompressible flows involving fluid-body interactions, Comput. Meth. Appl. Mech. Eng. 112 (1994) 253-282.
[7] T. Tezduyar, S. Aliabadi, M. Behr, A. Johnson, V. Kalro, C. Waters, 3D simulation of flow problems with parallel finite element computations on the CRAY T3D, in: Computational Mechanics '95, Proc. Int. Conf. on Computational Engineering Science, Mauna Lani, Hawaii, 1995.
[8] V. Kalro, T. Tezduyar, Parallel sparse matrix computations for finite element flow simulations on distributed memory platforms, to appear in Proc. Int. Conf. Numerical Methods in Continuum Mechanics (NMCM), 1996.
[9] T.J.R. Hughes, A.N. Brooks, A multi-dimensional upwind scheme with no crosswind diffusion, in: T.J.R. Hughes (Ed.), Finite Element Methods for Convection Dominated Flows, AMD, vol. 34, ASME, New York, 1979, pp. 19-35.
[10] T.E. Tezduyar, Stabilized finite element formulations for incompressible flow computations, Adv. Appl. Mech. 28 (1992) 1-44.
[11] Y. Saad, M. Schultz, GMRES: A generalized minimal residual algorithm for solving nonsymmetric linear systems, SIAM J. Sci. Stat. Comput. 7 (1986) 856-869.
[12] J. Smagorinsky, General circulation experiments with the primitive equations, Part I: The basic experiment, Monthly Weather Rev. 91 (1963) 99-164.
[13] M. Germano, U. Piomelli, P. Moin, W.H. Cabot, A dynamic subgrid eddy viscosity model, Proc. Summer Workshop, Center for Turbulence Research, Stanford, CA, 1993.
[14] P. Moin, J. Jimenez, Large eddy simulation of complex turbulent flows, AIAA Paper 93-3099, 24th Fluid Dynamics Conf., Orlando, FL, 1993.
[15] S. Mittal, Stabilized space-time finite element formulations for unsteady incompressible flows involving fluid-body interactions, Ph.D. thesis, University of Minnesota, 1992.
[16] M. Coutanceau, J.-R. Defaye, Circular cylinder wake configurations: A flow visualization survey, Appl. Mech. Rev. 44 (6) (1991) 255-305.
[17] G.E. Karniadakis, G.S. Triantafyllou, Three-dimensional dynamics and transition to turbulence in the wake of bluff bodies, J. Fluid Mech. 238 (1992) 1-30.
[18] C.H.K. Williamson, Vortex dynamics in the cylinder wake, Annu. Rev. Fluid Mech. 28 (1996) 477-539.
[19] A. Roshko, On the development of turbulent wakes from vortex streets, NACA Rep. 1191, NACA, 1954.
[20] C.H.K. Williamson, Oblique and parallel modes of vortex shedding in the wake of a circular cylinder at low Reynolds numbers, J. Fluid Mech. 206 (1989) 579-627.
[21] C.H.K. Williamson, The existence of two stages in the transition to three-dimensionality of a circular cylinder wake, Phys. Fluids 31 (11) (1988) 3165-3167.
[22] H. Tennekes, J.L. Lumley, A First Course in Turbulence, MIT Press, 1972.
[23] H. Schlichting, Boundary-Layer Theory, 7th ed., McGraw-Hill, New York, 1979.
[24] J.H. Gerrard, An experimental investigation of the oscillating lift and drag of a circular cylinder shedding turbulent vortices, J. Fluid Mech. 11 (1961) 244-256.
