
Article
Augmented-Reality Visualization of Aerodynamics
Simulation in Sustainable Cloud Computing
Myungil Kim, Sukkeun Yi, Daeyong Jung, Sangjin Park and Dongwoo Seo *
Korea Institute of Science and Technology Information, Seoul 34141, Korea; mikim@kisti.re.kr (M.K.);
sky@kisti.re.kr (S.Y.); daeyongjung@kisti.re.kr (D.J.); sjpark@kisti.re.kr (S.P.)
* Correspondence: seodongwoo@kisti.re.kr; Tel.: +82-10-2485-1452

Received: 30 March 2018; Accepted: 25 April 2018; Published: 27 April 2018 

Abstract: This paper proposes visualization based on augmented reality (AR) for aerodynamics simulation in a sustainable cloud computing environment that uses the Son of Grid Engine to allow different types of computers to handle concurrent job requests. A simulation of an indoor air-purification system is performed using the OpenFOAM computational fluid dynamics solver in the cloud computing environment. Post-processing converts the results to a form that is suitable for AR visualization. Simulation results can be displayed on devices such as smartphones, tablets, and Microsoft HoloLens. This AR visualization allows users to monitor the purification of indoor air in real time.

Keywords: aerodynamics; computational fluid dynamics; modeling and simulations; OpenFOAM; Son of Grid Engine

1. Introduction
Computational fluid dynamics (CFD) uses applied mathematics and physics to calculate the flow of a fluid and how it affects objects as it flows past them. CFD is based on the Navier-Stokes equations, which describe the relationships among velocity, pressure, temperature, and density of a moving fluid. Many studies have used CFD to assess the accuracy of numerical analysis methods, but relatively few studies have attempted to visualize the simulation results. CFD simulation results have often been visualized using supercomputers, whereas the use of cutting-edge visualization technology, such as augmented reality (AR), remains at an early stage.
AR was first used [1,2] to explain the airplane assembly process by overlaying additional information on an actual environment. AR is usually derived from virtual environments or virtual reality (VR), and it refers to a mixture of computer-generated and real video that induces a feeling of reality for users by combining images, motion, animation, audio, and information about a virtual entity in a real environment. Sometimes this technology is called "mixed reality", to emphasize the connection between the real and virtual worlds. Recent advances in cameras, computers, and other hardware, and decreases in their costs, have enabled a rapid increase in the amount of available AR content. In addition, smartphone-based mobile AR systems are evolving to provide augmented services, such as location information and angle determination, and to combine them with various Internet services.
However, existing AR content technology is restricted to entertainment, games, simple images, and video. AR technology runs in spaces that include both physical objects and virtual objects. The content of the environment can range from entirely real to entirely virtual, so AR technology can combine technology based on actual environments with technology based on VR. VR has been applied in industry to design and to evaluate products. However, devices like mobile phones and smart glasses do not have the computational power to process data of diverse types or in large volumes, so AR technology is not widely used in complex simulations. To overcome this limitation, a service that uses equipment efficiently while consuming little energy should be developed [3,4].
Cloud computing is considered a key feature of green IT, which can be defined as the set of technologies that directly or indirectly promote environmental sustainability [5]. Cloud computing technology addresses two critical elements of a green IT approach: energy efficiency and scalability. This research establishes a heterogeneous cloud computing environment and uses it to perform CFD simulation and AR tracking, which usually require significant computing resources. This environment consists of calculation nodes that can be added or deleted easily, regardless of their hardware and operating systems. Because the system accommodates nodes that differ in hardware or operating system, it secures system scalability. We use the open-source job scheduler Son of Grid Engine (SGE) [6] to manage concurrent user requests and resources effectively. The scalability and efficiency of such a sustainable cloud computing system can help to reduce energy usage and electronic waste.
Specifically, this paper presents a method that uses AR technology to visualize simulations of air movement within buildings. As people's interest in quality of life increases, air pollution is becoming a major health issue; in particular, poor air quality inside buildings is detrimental to health, and purifying this indoor air has become a focus of research. We use CFD simulation to analyze the density of pollutants in the air of an indoor structure, using the OpenFOAM solver to analyze indoor air flow and pollutant convection-diffusion. Cloud computing is used to produce the visualization in real time, and AR is used to present it on a mobile device.
The rest of this paper is organized as follows. Section 2 examines studies using AR technology
in the field of engineering analysis and simulation, while Section 3 describes the design of an
AR-based visualization system for aerodynamics simulation. Section 4 explains implementation
of this visualization, and findings that were obtained using a cloud-computing system. Section 5
presents conclusions.

2. Related Work
AR is an effective tool to improve the quality of engineering analysis and to overcome the
limitations of VR. AR-based visualization can be displayed in an actual environment, and therefore
aids in user recognition and understanding of numerical analysis results [7]. Users can update
these results in real time and in an actual environment, so the effects of parameters on product and
technology can be analyzed immediately and effectively. Thus, AR is considered to be an excellent user
interface for engineering analysis and simulation environments embedded within physical reality [7].
These traits have enabled its wide use in various applications, such as the medical and manufacturing
fields, and in human interaction.
Trends in engineering analysis and simulation environments have emerged in three main areas: (1) surgery, (2) manufacturing, and (3) CFD. In the field of biomedical engineering and surgery, the use of VR in the visualization of computed tomography (CT) and magnetic resonance imaging (MRI) data is the main research topic. Tawara et al. [8] proposed a two-handed direct manipulation system to achieve complex volume segmentation of CT/MRI data in augmented reality with a remote control attached to a motion-tracking cube. Kaladji et al. [9] demonstrated the feasibility of finite element simulation for predicting arterial deformations. Sutherland et al. [10] proposed a prototype for an augmented reality haptic simulation system with potential for training on spinal needle insertion. However, accuracy issues in current findings and the difficulty of building the related system environment limit its use to an educational tool.
In the field of mechanical engineering and manufacturing, studies have addressed AR visualization of three-dimensional (3D) models and structure analysis to enable users to intuitively analyze and discuss CAD and other 3D design data. Weidlich et al. [11] proposed new visualization methods for studying the results of finite element analysis in immersive environments. Paulus et al. [12] proposed physics-based non-rigid augmented reality. Buchau et al. [13] proposed an application of augmented reality (AR) in the context of teaching electrodynamics. Silva et al. [14]
proposed AR visualization using graphical representations that were created from numerical datasets.
However, AR utilization research in this field focuses on restricted aspects, like structure analysis and mechanical equipment, and is limited by the difficulty of precisely matching virtual content with physical models.
Several studies (Table 1) have addressed the visualization of analysis results in mechanical,
civil, and urban engineering. AR visualization results that are based on these studies enable the
civil engineer and the urban designer to analyze and improve indoor and outdoor environments.
However, these AR visualization studies based on CFD simulation mainly concentrate on entire city
design or architectural engineering, which do not apply to the various other fields of CFD simulation.
Individual studies have visualized CFD simulation data with AR technology and diverse visualization
methods such as Java3D, OpenGL, Paraview, and VR Markup Language.
For AR visualization of complicated and precise CFD simulation data, stable AR tracking
technology is required, but most studies use a particular scenario of a restricted model, and for
that reason are relatively unsuitable for general-purpose use. In addition, to create and process the
data for AR visualization in real time, powerful computers are required, but most of the studies were
conducted on a desktop computer, so they cannot be used in practical CFD simulations that cover the
wide range of activities in each industry.

Table 1. Studies on augmented reality (AR)-based visualization of computational fluid dynamics (CFD) simulation.

| Researcher | Visualization Method | Characteristics | Limitations |
| --- | --- | --- | --- |
| Malkawi et al. [15,16] | Java3D | Indoor thermal data visualization in a robot environment. Improvement of a voice and motion-based interactive environment. | Can be utilized only in a restricted indoor environment. Only provides a robot-based environment. |
| Fukuda et al. [17,18] | OpenGL, VR Markup Language | AR visualization of the CFD simulation of a housing design. Design tool consisting of CFD, VR, AR, and BIM. Multiple users wearing head-mounted displays (HMD) can superimpose the results of analysis or sensory data on video images of outdoors. | Restricted outdoor environment. Lacks support in a mobile environment. Lacks real-scene AR. |
| Moreland et al. [19] | Paraview | AR with CFD to develop training materials for operation of a large boiler at a coal-fired power plant. System uses a desktop computer for simulation, post-processing, modeling of associated 3D structures, and registration with relevant technical drawings. | Restricted to visualization of set information, not calculation. Post-processing data difficult to integrate with other applications. Lacks real-scene AR. |
| Regenbrecht et al. [20] | Local image overlay | AR visualization of temperature, speed, and direction of air flow in the cabin, or air pressure. | Setup is pre-defined and is not adaptable to other applications. Lacks real-scene AR. Lacks support in a mobile environment. |
| Niebling et al. [21] | OpenGL | Integrates interactive simulation, a marker-based tangible user interface, and several interaction concepts for 3D CFD. AR visualization for turbine design and development of prototypes. | Lacks support in a mobile environment. Lacks real-scene AR. |

3. Design of AR-Based Visualization for Aerodynamics Simulation

To provide an effective AR-based method to visualize aerodynamics simulation, a sustainable cloud computing environment (Figure 1) is implemented based on the functional combination of a cluster group (CFD simulation, AR post-processing, AR visualization).

Figure 1. Conceptual diagram of AR-based visualization for aerodynamics simulation. Components and processes are described in the text.

A usual CFD simulation process involves three stages. (1) Pre-processing involves modeling geometry, generating a mesh representation of the geometry, setting initial and boundary conditions, and choosing the solver scheme. (2) Solving entails numerical analysis of information that is passed from the pre-processing stage. (3) Post-processing visualizes the results graphically.

In aerodynamics simulation, pre-processing and solving are performed using the CFD simulation on OpenFOAM, and post-processing is performed in AR to convert the data to a form that is suitable for display on a mobile device that has relatively little computing power. CFD simulation and AR post-processing need significant computation, so they are performed on a sustainable cloud computing system, whereas AR visualization needs relatively little computation and is therefore run on the mobile device.
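For orientation, these stages map onto OpenFOAM's standard case directory layout; this is OpenFOAM's usual structure, not a layout specific to this paper:

```
case/
├── 0/          # initial and boundary conditions (e.g., U, p, species fields)
├── constant/   # polyMesh/ from blockMesh/snappyHexMesh, transport and turbulence properties
└── system/     # controlDict, fvSchemes, fvSolution, blockMeshDict, snappyHexMeshDict
```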
3.1. Sustainable Cloud Computing Environment

A sustainable cloud computing system should meet two requirements for AR visualization. (1) The system should be a combination of computing nodes with various operating systems and hardware types. Usually, the initial system has nodes with identical operating systems and hardware types. However, when a computing node is added to supplement computing power, the new node may have a different operating system and hardware than the established nodes. Therefore, one goal of this paper is to establish a heterogeneous cloud computing environment that allows the addition or deletion of various computing nodes, and thereby to achieve a flexible and scalable system. As a result, it is possible to reduce electronic waste by ensuring system scalability.

(2) The system should allow real-time processing that responds to concurrent user requests. AR visualization technology requires real-time features, and job scheduling should guarantee concurrent user executions. The cloud-computing system in this study consists of heterogeneous nodes that use SGE, and is designed to allow efficient scheduling that responds to concurrent tasks. By making effective use of computing resources, it is possible to reduce energy usage.
The cloud computing system (Figure 2) for AR visualization consists of a master node that manages slave nodes, which execute job requests. The master node manages the database to execute the tasks, and manages user information and job scheduling to execute concurrent tasks. The user operates various mobile devices to convey job requests to the master node, which then assigns the jobs to the execution nodes. The execution nodes then perform CFD simulation and AR post-processing, and deliver the output to the user. Execution nodes are mostly divided into cluster and cloud systems. The cloud system is designed to permit linking with private clouds, as well as with public cloud systems, such as Rescale, Amazon Web Services, and Azure.

Figure 2. Sustainable cloud computing environment using Son of Grid Engine (SGE).
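As an illustration of how a concurrent user request could be handed to SGE, the following is a minimal job-script sketch; the job name, log file, parallel environment, and solver invocation are hypothetical, not the paper's actual configuration.

```bash
#!/bin/bash
# Minimal SGE job script sketch (hypothetical names; the 'mpi' parallel
# environment is assumed to be configured on the cluster).
#$ -N cfd_ar_job        # job name
#$ -cwd                 # run in the submission directory (the OpenFOAM case)
#$ -j y                 # merge stdout and stderr
#$ -o cfd_ar_job.log    # log file
#$ -pe mpi 16           # request 16 slots

# Run the species-transport solver in parallel across the granted slots.
mpirun -np $NSLOTS speciesPimpleFoam -parallel
```

The master node would accept such jobs with `qsub cfd_ar_job.sh`; SGE then queues them and dispatches them to cluster or cloud execution nodes as slots become free.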
3.2. Aerodynamics Simulation Using OpenFOAM

To conduct the aerodynamics simulation, the user selects one of 14 usual plate-type room structures. Then, a grid of the indoor structure is created using 'blockMesh' in OpenFOAM in accordance with the 'blockMeshDict' file, which contains the geometric information and meshing methods (a minimal sketch of such a dictionary follows Figure 3). After the structured grid is generated, several items of furniture, such as a refrigerator, sofa, and TV, and the air purifier, as .stl CAD files, are placed in the room as desired (Figure 3). 'snappyHexMesh' generates an unstructured grid according to the settings in 'snappyHexMeshDict' and draws hybrid boundary-layer meshes near the furniture and the air purifier [22].

Figure 3. Mesh generation inside housing unit and air cleaner.
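To illustrate the kind of input 'blockMesh' consumes, here is a minimal 'blockMeshDict' sketch; the room dimensions (8 m × 6 m × 2.4 m), cell counts, and single wall patch are hypothetical, not the paper's actual values.

```
// Minimal blockMeshDict sketch (hypothetical room dimensions and cell counts).
FoamFile { version 2.0; format ascii; class dictionary; object blockMeshDict; }

convertToMeters 1;

vertices
(
    (0 0 0) (8 0 0) (8 6 0) (0 6 0)         // floor corners
    (0 0 2.4) (8 0 2.4) (8 6 2.4) (0 6 2.4) // ceiling corners
);

blocks
(
    // One hex block, 80 x 60 x 24 cells, uniform grading.
    hex (0 1 2 3 4 5 6 7) (80 60 24) simpleGrading (1 1 1)
);

boundary
(
    walls
    {
        type wall;
        faces
        (
            (0 3 2 1) (4 5 6 7)   // floor, ceiling
            (0 1 5 4) (2 3 7 6)   // front, back
            (0 4 7 3) (1 2 6 5)   // left, right
        );
    }
);
```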
In the stage of setting initial/boundary conditions (Figure 4), the end faces of the room are set to a no-slip wall condition, as are the wall sides of the air cleaner and furniture. The air purifier inlet has a velocity set to the specifications of the air purifier (2.87 m/s in this case). The air purifier outlet is set to 0 Pa, the standard gauge reading of atmospheric pressure. As the initial conditions, all of the velocity fields besides the boundary conditions are set to 0 m/s, and all of the pressure fields besides the boundary conditions are set to 0 Pa. The initial mass fraction of the pollutant is set at 1.0, which represents 100%. Air viscosity is set to 1 × 10⁻⁵ m²/s and pollutant diffusivity is set to 1.9 × 10⁻⁵ m²/s. For turbulence, we use the k-epsilon model [22]. As the final step before solving, the OpenFOAM 'fvSolution' and 'fvScheme' files are written to define the solvers and schemes for the governing equations. The Euler implicit method is used for time discretization, and the first-order upwind scheme is used to calculate velocity, mass fraction, and turbulence variables. A convection-diffusion equation [22] is used to calculate the mass fraction of the pollutant, concurrently with calculation of the N-S equations [23]; three equations, one for pressure, one for velocity, and one for mass fraction, are calculated simultaneously using an iterative equation solver. OpenFOAM normally solves the convection-diffusion equation in scalarTransportFoam or reactingFoam, but in this paper, the species transport equation is solved by pimpleFoam, a solver for incompressible transient flows, extended into a new solver called "speciesPimpleFoam" [24]. The time step is 0.1 s and 15,000 steps are calculated, so the study examines the effect of air purification over 25 min. When the governing equations are solved, the convergence and accuracy of the results vary depending on the numerical scheme [25].

Figure 4. Boundary condition setting.
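For reference, the governing equations just described take the following standard form (incompressible continuity and momentum plus a convection-diffusion equation for the pollutant mass fraction); this is the textbook formulation consistent with the text above, not a transcription of the paper's own notation:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}, \qquad
\frac{\partial Y}{\partial t} + \nabla\cdot(\mathbf{u}\,Y) = \nabla\cdot(D\,\nabla Y)
```

where u is the velocity, p the pressure, ν = 1 × 10⁻⁵ m²/s the kinematic viscosity, Y the pollutant mass fraction, and D = 1.9 × 10⁻⁵ m²/s the diffusivity. In OpenFOAM, boundary settings of the kind described above would be written in dictionary files such as 0/U; the following sketch uses hypothetical patch names and an assumed inlet direction:

```
// Sketch of a 0/U velocity field (hypothetical patch names and inlet direction).
FoamFile { version 2.0; format ascii; class volVectorField; object U; }

dimensions      [0 1 -1 0 0 0 0];
internalField   uniform (0 0 0);   // initial condition: 0 m/s everywhere

boundaryField
{
    purifierInlet  { type fixedValue; value uniform (0 0 2.87); } // 2.87 m/s
    purifierOutlet { type zeroGradient; }  // pressure fixed at 0 Pa in 0/p
    walls          { type noSlip; }        // room, furniture, purifier walls
}
```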
3.3. AR Post-Processing for Lightweight Data
Data
The cloud-based post-processing approach (Figure 5) supports quick calculation and analysis of large, complex aerodynamics simulation data. In particular, this paper presents an approach to reduce the volume of data so that optimal AR visualization can run on a mobile device.

Figure 5. Conceptual diagram of AR post-processing for AR visualization.
Post-processing (Figure 6) extracts elements, such as vertices, edges, curves, surfaces, vectors, and scalars, from the CFD analysis data, and then converts them to VTK format (ASCII); then, to reduce data size and analysis time, the ASCII data are converted to binary form. The converted binary 3D fluid space mesh data are parsed to find components, such as vertices, edges, and curves; then fluid analysis data, such as vectors and scalars, are extracted, and data are created for visualization. Next, the vector field is calculated from the 3D fluid space mesh and CFD analysis data to generate the results analysis data. To support the automatic calculation of the seed point, its initial position is generated to represent the attention zone of the CFD analysis data in the calculated vector field. The starting position of fluid flow is stored using the seed point and mesh data, and then post-processor data, such as streamlines and particles, are created. Finally, AR visualization data are created by shading and mapping the 3D fluid space mesh, vector, and scalar data.

Figure 6. AR post-processing procedure.
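As an illustration of the ASCII-to-binary step described above, the following is a minimal sketch using the VTK Python bindings; the file names are hypothetical, and the paper's own converter may differ.

```python
# Minimal sketch: rewrite a legacy-format VTK file from ASCII to binary
# to reduce its size (hypothetical file names; assumes the `vtk` package).
import vtk

reader = vtk.vtkUnstructuredGridReader()
reader.SetFileName("room_case_ascii.vtk")   # ASCII output of CFD post-processing
reader.Update()

writer = vtk.vtkUnstructuredGridWriter()
writer.SetInputData(reader.GetOutput())
writer.SetFileName("room_case_binary.vtk")  # smaller, faster-to-parse binary form
writer.SetFileTypeToBinary()
writer.Write()
```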

We present a simple but effective optimization method to visualize results analysis data in the mobile AR environment. Visualization of 3D fluid space mesh data in the form of voxels is very intensive in computation and memory. Results analysis data (streamlines, particles) are calculated using the vertices of the 3D fluid space data. The optimization step (Figure 7) simplifies the 3D fluid space mesh data. In the simplification step, these data must be reduced; for this process, we adopt the MeshLab library [26]. Then, we generate voxels out of the data (vertex, curve, or surface objects). We use collision detection to delete duplicated voxel elements, and can then generate lightweight results analysis data (streamlines, particles) from the reduced voxels. Finally, the lightweight results analysis data are created and sent to the user's mobile AR device for smooth visualization.

Figure 7. Reducing size of results analysis data.
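The duplicate-voxel removal can be pictured as snapping vertices to a voxel grid and keeping one representative per occupied cell; the following is a generic sketch of that idea, not the paper's collision-detection implementation, and the voxel size is hypothetical.

```python
# Generic sketch of duplicate-voxel removal: snap points to a voxel grid
# and keep one representative point per occupied cell.
def reduce_to_voxels(points, voxel_size=0.05):
    """points: iterable of (x, y, z) tuples; returns one point per voxel."""
    seen = {}
    for p in points:
        # Integer voxel coordinates act as the collision key.
        key = tuple(int(c // voxel_size) for c in p)
        seen.setdefault(key, p)   # first point in a voxel wins
    return list(seen.values())

# Example: densely sampled streamline vertices collapse to far fewer points.
verts = [(0.01 * i, 0.0, 1.2) for i in range(1000)]
print(len(reduce_to_voxels(verts)))  # ~200 voxels instead of 1000 points
```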
3.4. AR Tracking & Visualization

We used environment-based tracking technology with a mobile device camera to conduct AR visualization and show the air flow inside the house. Environment-based tracking technology has the advantage that it enables the user to review CFD results accurately and in an immersive fashion, even in a large and complex space. Additionally, the cloud system is designed to perform real-time tracking effectively, because such tracking requires significant computation. Therefore, the cloud system performs the computationally demanding target tracking process, and then the user device performs environment registration.
Cloud-based real-time AR tracking comprises an AR display, an AR mapper, and an AR tracker (Figure 8). The AR display receives real-time image data from the user's mobile device camera, and then transmits the RGB data to the AR tracker across a wireless network. The AR tracker then uses the FAST feature technique [27] to extract features from the RGB data, and in this way it creates a 3D space in which the simulation results are displayed between the camera and the real environment. The AR mapper matches the simulation results with the 3D space data created by the AR tracker. Quick camera movements caused by the user may end up "shaking" an augmented object, but this artifact can be corrected using the Gauss-Newton technique [28]. The AR display shows the simulation results on a screen using the 3D space information (Figure 9). It converts the data structure to a reference structure by using the reference function, and then applies the Draw Call minimization method by component batching, thereby minimizing garbage-collector calls and, as a result, using the memory needed for rendering effectively. In this way, tracking of the physical space can be done quickly, and AR visualization of the aerodynamics simulation results can be implemented naturally.

Figure 8. Cloud-based AR tracking.
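To give a flavor of the feature-extraction step, here is a minimal FAST corner detection sketch with OpenCV; the frame file name and threshold are hypothetical, and the paper's tracker is its own implementation of [27].

```python
# Minimal FAST feature extraction sketch (OpenCV; hypothetical inputs).
import cv2

frame = cv2.imread("camera_frame.jpg")              # one frame from the device camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

fast = cv2.FastFeatureDetector_create(threshold=25) # corner threshold (assumed value)
keypoints = fast.detect(gray, None)                 # 2D features for pose estimation

print(f"{len(keypoints)} FAST keypoints detected")
```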
Figure 9. AR display process.
4. Implementation Results
This study applies AR visualization technology to help users intuitively understand the results of aerodynamics simulation. The researcher puts an air purifier inside a housing unit, conducts an aerodynamics simulation, and then implements AR visualization of the results with a smartphone, tablet, or wearable device. We have implemented the augmented model to examine in detail the housing structure model, the indoor physical space, and the mapped specific zone. The AR visualization uses a mobile device or a wearable device (Table 2).

Table 2. AR Display Devices & Functions.

| Device Type | Implementation Device | Function |
| --- | --- | --- |
| Mobile Device | Samsung Galaxy 7, Samsung Galaxy Note | Visualization of results of the entire indoor housing space; visualization of analysis data such as streamlines, particles, volumes, and animation over time. |
| Wearable AR Device | Microsoft HoloLens | Visualization of the physical space of a real apartment and of interpretation results through mapping; visualization of analysis data such as streamlines, particles, volumes, and animation over time. |
4.1. Sustainable Cloud Computing Environment

We established a cloud-computing environment by constructing a cloud test bed. The environment consists of a master node unit, a 2-unit cluster group node, and a 4-unit cloud group node (Table 3). Resource management of the nodes is performed by SGE. The master node uses SGE to manage all resources and user tasks. The execution nodes calculate the real tasks and are divided into cluster type and cloud type. The sustainable cloud computing environment is expandable, and it can use different operating systems and hardware at each node.

Table 3. Node Specifications for Sustainable Cloud Computing.

| Group | Node | OS | CPU/Core | RAM |
| --- | --- | --- | --- | --- |
| Master | K1647P0 | Linux Mint 17.3 Rosa | i7-3770 3.40 GHz × 1ea/8 cores | 16 GB |
| Execution (Cluster) | K0889P0 | Linux Mint 17.3 Rosa | X5570 2.93 GHz × 2ea/16 cores | 64 GB |
| Execution (Cluster) | K0889P3 | Ubuntu 16.04.3 LTS | E5-2609 v2 2.5 GHz × 2ea/8 cores | 64 GB |
| Execution (Cloud) | Node01 | Red Hat Enterprise Linux Server 6.7 | E5-2697 v3 2.6 GHz × 2ea/28 cores | 96 GB |
| Execution (Cloud) | Node02 | Red Hat Enterprise Linux Server 6.7 | E5-2697 v3 2.6 GHz × 2ea/28 cores | 96 GB |
| Execution (Cloud) | Blade01 | Red Hat Enterprise Linux Server 6.7 | E5-2670 2.6 GHz × 2ea/16 cores | 16 GB |
| Execution (Cloud) | Blade02 | Red Hat Enterprise Linux Server 6.7 | E5-2670 2.6 GHz × 2ea/16 cores | 16 GB |
The SGE directory of the master node is the default and is set as a Network File System (NFS) shared file system so that it can be accessed from all cloud environment nodes. If the NFS serves a large cluster or high throughput, considerable traffic occurs in the NFS itself, but with fewer than 100 nodes or low throughput, this problem does not occur [29]. In addition, traffic can be resolved by changing the whole grid engine share file setting. The cloud environment in our study consists of reliable nodes. Thus, default spool directories, executable files, and configuration files can be used, rather than local ones. The advantages of NFS in SGE are easy setup and convenience of upgrading and debugging.

4.2. Aerodynamics Simulation
The results of an indoor air purification simulation were visualized (Figure 10) using Paraview, an open-source post-processing tool. Paraview supports the visualization of OpenFOAM's analysis result format, and in our study performs this visualization with its volume rendering and streamline functions [30]. The mass fraction of the pollutant decreased over time (Figure 10a–c). At the beginning, almost 100% of the volume was red (saturated with pollutants), but most gradually became blue (clear of pollutants) as the air purification reduced the air pollution. The fluid flow changed depending on the amount of time elapsed, and the streamlines also changed over time (Figure 10d–f). Flow initiates at the air purifier inlet and outlet, and then spreads through the room. Meanwhile, the mass fraction of pollutant in the air changed over time in the entire inside space (Figure 11). The effect of the air purifier occurred throughout the room, and the amount of pollutant decreased.

Figure 10. Mass fraction of pollutant and streamlines: (a) mass fraction of pollutant at 3 s; (b) mass fraction of pollutant at 300 s; (c) mass fraction of pollutant at 1500 s; (d) streamlines at 3 s; (e) streamlines at 300 s; (f) streamlines at 1500 s.

Figure 11. Change of mass fraction of pollutant as time elapses.
4.3. AR Visualization

This study used the Unity engine to render the simulation results on an AR display device; the intention was to offer an accurate and immersive environment for the user. The display composition for AR visualization (Figure 12) allows the user to review the aerodynamics simulation results in an AR environment by using an Android mobile device that runs the app that the Unity engine exports, or by means of HoloLens smart glasses. AR visualization displays the results for the whole interior space and for local spaces that were obtained by mapping of the real and virtual air purifiers in the simulation. For the whole space (Figure 13), the system first recognizes the rapid prototype (RP) model, and then matches the simulation results with the RP model and visualizes them. Visualized simulation results (Figure 14) from the air purifier zone were obtained by putting the virtual air cleaner model on the position of the actual air purifier. As a final analysis result, the AR display runs at 15 frames per second. Due to the intuitive and automated workflow, and to seamless process iterations, the use of the application does not require expert knowledge of simulations.

Figure 12. AR display implementation.

Figure 13. Whole layout of smartphone/smart device-based post-processing: (a) RP Visualization; (b) Streamline Visualization; (c) Particle Visualization.

Figure 14. Wearable AR-based post-processing result of air purifier and environment: (a) Menu; (b) Particle Visualization; (c) Streamline Visualization.
5. Conclusions and Future Work


We have designed and implemented a system that uses AR technology to review CFD simulation
results. The system comprises CFD simulation, AR post-processing, and AR visualization;
the significant amount of computation required is performed in a cloud-computing environment
to ensure real-time visualization. The cloud environment consists of a master node, cluster nodes,
and cloud nodes, and uses SGE to link these heterogeneous nodes and to process concurrent tasks
from users. The aerodynamics simulation of air purification inside a house used OpenFOAM for the
CFD analysis. The simulation results were reprocessed by AR post-processing to reduce the volume
of data, so that they could be visualized on mobile devices that have relatively low computational
capability. AR tracking, which uses environment-based tracking technology, was then conducted so
that the user can review CFD analysis results accurately and in an immersive fashion, even in large,
complex spaces. Finally, the simulation results of the indoor air purification were displayed on a
mobile device, tablet, or HoloLens by means of Unity-based AR visualization. We have also
demonstrated that an efficient, low-energy service can be assembled in the cloud from heterogeneous
nodes. In future work, we will implement real-time AR visualization of CFD simulation in a variety
of fields, validate its accuracy, improve its real-time performance, and verify its usefulness through
collaboration with CFD experts and a usability evaluation.
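
To make the cloud workflow concrete, the sketch below shows how a single OpenFOAM case could be
submitted through SGE on such a cluster. It is a minimal illustration rather than the script used in
this work: the queue name (cloud.q), the parallel environment (mpi), and the solver (pisoFoam) are
assumptions, because the exact job configuration is not published here.

    #!/bin/bash
    # Minimal SGE job script for one OpenFOAM case (illustrative sketch;
    # queue, parallel environment, and solver names are assumptions, not
    # the configuration actually used in this work).
    #$ -N airPurifierCFD   # job name
    #$ -cwd                # run from the OpenFOAM case directory
    #$ -j y                # merge stderr into stdout
    #$ -pe mpi 8           # hypothetical MPI parallel environment, 8 slots
    #$ -q cloud.q          # hypothetical queue spanning cluster and cloud nodes

    # Split the case across the allocated slots (numberOfSubdomains in
    # system/decomposeParDict must match), run the solver in parallel,
    # then reassemble the fields and export them for AR post-processing.
    decomposePar
    mpirun -np $NSLOTS pisoFoam -parallel
    reconstructPar
    foamToVTK -latestTime

Submitted with qsub runCase.sh, the job is scheduled by SGE onto whichever heterogeneous nodes
satisfy the slot request, which is what allows concurrent user requests to share the master, cluster,
and cloud nodes described above.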

Author Contributions: All authors contributed extensively to the work presented in this paper. Myungil Kim
devised the idea and developed the approach. Dongwoo Seo and Sangjin Park implemented the AR
post-processing and AR visualization. Daeyong Jung built the sustainable cloud computing environment.
Sukkeun Yi performed the aerodynamics simulation and implemented the CFD simulator using OpenFOAM.
All authors have participated in writing this paper.
Acknowledgments: This work was supported by the Institute for Information & Communications Technology
Promotion (IITP) through a grant funded by the Korean government (Ministry of Science & ICT) (No. 2017-0-00350,
Shape Pattern Modeling Method based on 3D CAD and Integration Analysis Platform Development for
Manufacturing Engineering) and the Korea Institute of Science and Technology Information (KISTI).
Conflicts of Interest: The authors declare no conflicts of interest.

References
1. Sinclair, P. Integrating Hypermedia Techniques in Augmented Reality Environments. Ph.D. Thesis,
University of Southampton, Southampton, UK, 28 May 2007.
2. Azuma, R. Overview of Augmented Reality. In Proceedings of the SIGGRAPH International Conference on
Computer Graphics and Interactive Techniques, Los Angeles, CA, USA, 8–12 August 2004.
3. Ahmad, R.W.; Gani, A.; Hamid, S.H.A.; Shojafar, M.; Ahmed, A.I.A.; Madani, S.A.; Saleem, K.; Rodrigues, J.J.
A survey on energy estimation and power modeling schemes for smartphone applications. Int. J.
Commun. Syst. 2017, 30, 1–22. [CrossRef]
4. Naranjo, P.G.; Pooranian, Z.; Shojafar, M.; Conti, M.; Buyya, R. FOCAN: A Fog-supported Smart City Network
Architecture for Management of Applications in the Internet of Everything Environments. Available online:
https://arxiv.org/pdf/1710.01801.pdf (accessed on 30 March 2018).
5. Dastbaz, M.; Pattinson, C.; Akhgar, B. Green Information Technology: A Sustainable Approach; Elsevier:
New York, NY, USA, 2014; pp. 95–110, ISBN 978-0-12-801379-3.
6. Son of Grid Engine. Available online: https://arc.liv.ac.uk/trac/SGE (accessed on 30 March 2018).
7. Li, W.; Nee, A.Y.C.; Ong, S.K. A State-of-the-Art Review of Augmented Reality in Engineering Analysis and
Simulation. Multimodal Technol. Interact. 2017, 1, 17. [CrossRef]
8. Tawara, T.; Ono, K. A framework for volume segmentation and visualization using augmented reality.
In Proceedings of the 2010 IEEE Symposium on 3D User Interfaces (3DUI), Westin Waltham-Boston,
Waltham, MA, USA, 20–21 March 2010; pp. 20–21.

9. Kaladji, A.; Dumenil, A.; Castro, M.; Cardon, A.; Becquemin, J.P.; Bou-Saïd, B.; Haigron, P. Prediction of
deformations during endovascular aortic aneurysm repair using finite element simulation. Comput. Med.
Imaging Graph. 2013, 37, 142–149. [CrossRef] [PubMed]
10. Sutherland, C.; Hashtrudi-Zaad, K.; Sellens, R.; Abolmaesumi, P.; Mousavi, P. An augmented reality haptic
training simulator for spinal needle procedures. IEEE Trans. Biomed. Eng. 2013, 60, 3009–3018. [CrossRef]
[PubMed]
11. Weidlich, D.; Scherer, S.; Wabner, M. Analyses using VR/AR visualization. IEEE Comput. Graph. Appl. 2008,
28, 84–86. [CrossRef] [PubMed]
12. Paulus, C.J.; Haouchine, N.; Cazier, D.; Cotin, S. Augmented reality during cutting and tearing of deformable
objects. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality
(ISMAR), Fukuoka, Japan, 29 September–3 October 2015.
13. Buchau, A.; Rucker, W.M.; Wössner, U.; Becker, M. Augmented reality in teaching of electrodynamics. Int. J.
Comput. Math. Electr. Electron. Eng. 2009, 28, 948–963. [CrossRef]
14. Silva, R.L.; Rodrigues, P.S.; Oliveira, J.C.; Giraldi, G. Augmented Reality for Scientific Visualization: Bringing
Datasets inside the Real World. In Proceedings of the Summer Computer Simulation Conference (SCSC
2004), Montreal, QC, Canada, 20–24 July 2004.
15. Lakaemper, R.; Malkawi, A.M. Integrating robot mapping and augmented building simulation. J. Comput.
Civ. Eng. 2009, 23, 384–390. [CrossRef]
16. Malkawi, A.M.; Srinivasan, R.S. A new paradigm for Human-Building Interaction: The use of CFD and
Augmented Reality. Autom. Constr. 2005, 14, 71–84. [CrossRef]
17. Fukuda, T.; Mori, K.; Imaizumi, J. Integration of CFD, VR, AR and BIM for design feedback in a design
process-an experimental study. In Proceedings of the 33rd International Conference on Education and
Research in Computer Aided Architectural Design Europe (eCAADe33), Oulu, Finland, 22–26 August 2015.
18. Yabuki, N.; Furubayashi, S.; Hamada, Y.; Fukuda, T. Collaborative visualization of environmental simulation
result and sensing data using augmented reality. In Proceedings of the International Conference on
Cooperative Design, Visualization and Engineering, Osaka, Japan, 2–5 September 2012.
19. Moreland, J.; Wang, J.; Liu, Y.; Li, F.; Shen, L.; Wu, B.; Zhou, C. Integration of Augmented Reality with
Computational Fluid Dynamics for Power Plant Training. In Proceedings of the International Conference on
Modeling, Simulation and Visualization Methods, Las Vegas, NV, USA, 22–25 July 2013.
20. Regenbrecht, H.; Baratoff, G.; Wilke, W. Augmented reality projects in the automotive and aerospace
industries. IEEE Comput. Graph. Appl. 2005, 25, 48–56. [CrossRef] [PubMed]
21. Niebling, F.; Griesser, R.; Woessner, U. Using Augmented Reality and Interactive Simulations to Realize
Hybrid Prototypes. Available online: https://www.researchgate.net/profile/Uwe_Woessner/publication/
220844660_Using_Augmented_Reality_and_Interactive_Simulations_to_Realize_Hybrid_Prototypes/
links/0c96052a9c0905da4e000000.pdf (accessed on 30 March 2018).
22. Jasak, H.; Jemcov, A.; Tukovic, Z. OpenFOAM: A C++ library for complex physics simulations. In Proceedings
of the International Workshop on Coupled Methods in Numerical Dynamics, Dubrovnik, Croatia,
19–21 September 2007.
23. Kays, W.M.; Crawford, M.E. Convective Heat and Mass Transfer, 3rd ed.; McGraw-Hill: New York, NY, USA, 1993;
pp. 50–210, ISBN 0070337217.
24. Jang, D.S.; Jetli, R.; Acharya, S. Comparison of the PISO, SIMPLER, and SIMPLEC algorithms for the
treatment of pressure-velocity coupling in steady flow problems. Numer. Heat Transf. Part A Appl. 1986, 10,
209–228.
25. Versteeg, H.K.; Malalasekera, W. An Introduction to Computational Fluid Dynamics: The Finite Volume Method;
Pearson Education: New York, NY, USA, 1995; pp. 40–130, ISBN 0131274988.
26. MeshLab Library. Available online: http://www.meshlab.net/ (accessed on 30 March 2018).
27. Rosten, E.; Drummond, T. Machine learning for high-speed corner detection. In Proceedings of the European
Conference on Computer Vision (ECCV), Graz, Austria, 7–13 May 2006.
28. Sarkis, M.; Diepold, K. Camera Pose Estimation via Projective Newton Optimization on the Manifold.
IEEE Trans. Image Process. 2012, 21, 1729–1741. [CrossRef] [PubMed]

29. Reducing and Eliminating NFS Usage by Grid Engine. Available online: http://arc.liv.ac.uk/SGE/howto/
nfsreduce.html (accessed on 30 March 2018).
30. Ayachit, U. The ParaView Guide: A Parallel Visualization Application; Kitware: New York, NY, USA, 2015;
pp. 20–150, ISBN 978-1-930934-30-6.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
