
Virtual Reality & Intelligent Hardware 2019 Vol 1 Issue 6:597—610

·Review·

Digital assembly technology based on augmented reality and digital twins: a review
Chan QIU, Shien ZHOU, Zhenyu LIU*, Qi GAO, Jianrong TAN
State Key Laboratory of CAD and CG, Zhejiang University, Hangzhou 310027, China

* Corresponding author, liuzy@zju.edu.cn


Received: 31 July 2019 Accepted: 28 October 2019

Supported by the National Natural Science Foundation of China (51875517, 51490663 and 51821093) and Key Research and
Development Program of Zhejiang Province (2017C01045).

Citation: Chan QIU, Shien ZHOU, Zhenyu LIU, Qi GAO, Jianrong TAN. Digital assembly technology based on augmented
reality and digital twins: a review. Virtual Reality & Intelligent Hardware, 2019, 1(6): 597—610
DOI: 10.1016/j.vrih.2019.10.002

Abstract Product assembly simulation is considered one of the key technologies in the process of
complex product design and manufacturing. Virtual assembly realizes the assembly process design,
verification, and optimization of complex products in the virtual environment, which plays an active and
effective role in improving the assembly quality and efficiency of complex products. In recent years,
augmented reality (AR) and digital twin (DT) technology have brought new opportunities and challenges
to the digital assembly of complex products owing to their characteristics of virtual reality fusion and
interactive control. This paper expounds the concept and connotation of AR, enumerates a typical AR
assembly system structure, analyzes the key technologies and applications of AR in digital assembly, and
notes that DT technology is the future development trend of intelligent assembly research.

Keywords Augmented reality; Digital twin; Digital assembly; Virtual simulation; Assembly analysis

1 Introduction
As products trend toward greater complexity, miniaturization, and precision, assembly density and precision requirements are rising, and assembly is becoming increasingly difficult. Assembly design and process planning have become weak links in
product design and manufacture. Product assembly simulation technology is an effective means to improve
the level of complex product assembly design and process planning.
Traditional assembly simulation technology builds a physical prototype to test whether its assembly
performance indicators are qualified, and the results are fed back to designers to improve and adjust the
assembly process. Owing to the need to produce a real physical prototype, the cost of products increases
and the time needed to introduce products in the market is adversely affected.
With the advancements in computer technology and graphics technology, virtual reality (VR) -based
digital assembly has gradually emerged, and it uses digital prototypes (DPs) for assembly operations
instead of physical prototypes. Using a virtual environment simulation of the real product assembly
process, the assembly performance can be predicted, thus avoiding the production material losses
associated with building physical prototypes. Using VR can significantly improve the accuracy and
efficiency of a complex product assembly design.

However, although VR technology is becoming increasingly mature and plays an important role in the
process of product design and analysis, the following deficiencies remain in the assembly simulation of
complex products: (1) The product digital prototype and environment model established in a VR world
cannot accurately reflect the real assembly scene; (2) Users' senses are limited in the virtual environment,
and it is difficult to fully experience the real assembly process, resulting in low simulation reliability; and
(3) Despite the use of GPU acceleration technology, the real-time performance of VR assembly simulation
is not ideal.
In recent years, digital assembly technology has advanced in various directions in idealized assembly modeling, simulation, and analysis. Emphasis has shifted toward combining the virtual environment with the real assembly environment and conditions, with a gradual transition from VR-based digital assembly to augmented reality (AR)-based digital assembly and from DP-based digital assembly to digital twin (DT)-based assembly. AR and DT technologies bring new opportunities and challenges
when applied to complex product assembly simulations. Some scholars have attempted to apply digital
twins to industrial production, as discussed in the sections that follow.
AR is a cutting-edge digital technology that emerged in the early 1990s. The concept was proposed by
Tom Caudell of Boeing and his colleagues when they designed an aircraft auxiliary wiring system[1]. In
1997, Azuma divided AR into three important components: virtual reality fusion, real-time interaction, and
3D registration, and postulated its potential application in many fields such as medical treatment,
entertainment, and industry[2]. Over the past two decades, many large international enterprises, such as
Microsoft, IBM, HP, Sony, and Google have conducted in-depth research into AR technology and have
developed AR auxiliary devices such as HoloLens and Meta[3], which greatly improved the application
value of AR technology in different fields and promoted the popularization of AR technology.
In the field of industry, researchers have attempted to apply AR technology to life cycle processes such
as product planning, design, assembly, and maintenance by combining virtual objects with a real-world
environment to construct an augmented virtual fusion environment, in which product behaviors and
attributes can be analyzed[4]. AR-based product assembly simulation technology is, to some extent, an
advanced method of VR assembly. It overcomes the tedious scene construction steps required by VR
methods through the superposition of the virtual model and real objects, and plays an important role in
helping users to carry out assembly design, planning, and operation. Dini et al. pointed out that AR is
compatible with through-life engineering services (TES), and has such advantages as no need for paper
manuals to access technical documents, accurate sensing of product status by sensors, assembly
maintenance and guidance, etc. However, these all require the AR system to be equipped with powerful
computing capacity to process large amounts of data[5]. Didier et al. applied AR to train assembly and
maintenance workers, and explained the maintenance steps to these users by combining a virtual model
with visual prompts[6]. Haritos et al. used optical devices to identify marks placed on aircraft parts, so that
users could receive virtual text information regarding a part in an enhanced environment[7]. De Crescenzio
et al. described the detailed application scene of AR in the maintenance of an aircraft fuel tank, and used
3D rectangular frame information to prompt the staff to perform operations such as opening the fuel tank
cap, observing the oil level, and closing the fuel tank cap[8]. Dong et al. presented the measured tire pressure
and data acquired from other sensors in a visual form and analyzed them. Using AR technology, virtual
information regarding possible solutions corresponding to various parts of the vehicle was displayed to
realize fault diagnosis[9].
In recent years, researchers have developed DT by integrating DP and AR. The concept first appeared in a course taught by Professor Grieves at the University of Michigan in 2003, and was defined in subsequent articles as a digital system comprising the physical product, the virtual product, and the connection between them[10].
Initially, DT did not attract much attention; it was not until 2011, when the US Air Force Research Laboratory (AFRL) and NASA proposed constructing a digital twin of a future aircraft, that the technology received widespread attention[11]. The DT defined by NASA is a digital model integrating multiple physical fields, multiple
scales, and multiple probabilities. It uses a physical model, sensor data, historical data, and other data to
interact with the simulation model in real time to evaluate the real-time evolution of the product state and
any significant trends[12]. Revetria et al. integrated AR and DT into applications that improve the safety of work environments[13].
DT is a process and method used to describe and model the characteristics, behavior, formation process,
and performance of physical entities via digital technology. It is a virtual model completely corresponding
to and consistent with physical entities in the real world, and it can simulate an entity's behavior and
performance in a real environment in real time. DT can not only use the existing theories and knowledge of
human beings to establish virtual models, but also use the simulation technology of virtual models to
explore and predict the unknown world and to discover better methods and approaches.
At present, there are many research areas in AR, but few scholars have summarized the cutting-edge
technologies based on AR and DT assembly. A representative study is the literature review by Nee's
research team[14-16]. Therefore, this paper evaluates and reports on the development status of AR technology
in recent years and assembly simulation methods based on AR, aiming to provide ideas for R&D personnel
to build AR assembly systems, and analyzes the research hotspots and development trends of digital
assembly based on DT.

2 Structure of a typical AR assembly system


In contrast to the complete immersion effect achieved by traditional VR technology, AR is more inclined to superimpose the virtual model or information from the computer onto the real world. Compared with VR, it
combines the model constructed in the virtual environment with the real environment around the user to
realize the interaction between the real world and the virtual world and to reduce the tedious scene
modeling and rendering work associated with VR.
To realize real-time integration and interaction between the product and assembly environment and the
AR environment, it is necessary to establish a digital assembly system platform based on AR. Researchers in China and abroad have designed and developed a series of AR-based prototype systems for different application objects. For example, the ARMAR system proposed by the US Air Force Research Laboratory and Columbia University in 2007[17], the ACARS system developed by the National University of Singapore in 2012[18], and the ARPPSM system developed by the Algeria Advanced Technology
Development Center in 2013[19] have effectively solved the problems associated with assembly design.
Typical AR assembly systems are divided into six functional modules: Video Capture, Image Analysis
and Processing, Tracking Process, Interaction Handling, Assembly Information Management, and
Rendering[15], as shown in Figure 1.

Figure 1 Typical AR assembly system structure[15].
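To make the module breakdown concrete, the following minimal Python sketch wires the six modules of Figure 1 into a single per-frame loop. The class names and method signatures are illustrative assumptions rather than the interfaces of any cited system; only the OpenCV calls (cv2.VideoCapture, cv2.imshow, cv2.putText) are real library functions.

    # Minimal per-frame loop for a hypothetical AR assembly system.
    # The module stubs mirror Figure 1; their internals are placeholders.
    import cv2

    class Analyzer:
        def process(self, frame):
            # Image analysis and processing: e.g., convert to grayscale for detection.
            return cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    class Tracker:
        def update(self, gray):
            # A real tracker would return poses of the tracked parts; stubbed here.
            return {}

    class AssemblyInfo:
        def next_step(self, poses):
            # A real module would query CAD-derived assembly data.
            return "place part A onto base B"

    class Renderer:
        def draw(self, frame, poses, instruction):
            # Visual rendering: overlay the current instruction on the video frame.
            cv2.putText(frame, instruction, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
            return frame

    def run(camera_index=0):
        capture = cv2.VideoCapture(camera_index)            # video capture module
        analyzer, tracker = Analyzer(), Tracker()
        info, renderer = AssemblyInfo(), Renderer()
        while True:
            ok, frame = capture.read()                      # image acquisition
            if not ok:
                break
            gray = analyzer.process(frame)                  # image analysis and processing
            poses = tracker.update(gray)                    # tracking process
            step = info.next_step(poses)                    # assembly information management
            frame = renderer.draw(frame, poses, step)       # rendering
            cv2.imshow("AR assembly", frame)
            if cv2.waitKey(1) & 0xFF == 27:                 # interaction handling: Esc exits
                break
        capture.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        run()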

2.1 Video capture

Video acquisition is the first step taken to build an AR assembly system; it is mainly realized by
using various cameras. Radkowski and Oliver[20] and Petersen et al. used CMOS/CCD cameras to capture real assembly scenes of products[21]. Wang et al. used stereo cameras to build 3D assembly scenes[22,23], and Khuong et al. used depth sensor cameras to capture real-time depth information. All camera types can provide support for the follow-up vision-based tracking of parts to be assembled[24].

2.2 Image analysis and processing

Graphic algorithms are used to process assembly operation images, mainly through computer vision. As a
common function library, OpenCV is widely used[25]. For example, the continuously adaptive mean-shift (CamShift) algorithm is often used to detect a user's hand operations and can accurately identify the position of the palm and fingertips and their binding relationship to the assembly parts.
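As a hedged illustration of this kind of image processing, the sketch below applies OpenCV's CamShift to follow a hand-colored region that the user selects in the first frame. The region-of-interest selection and the HSV thresholds are assumptions made for the example, not values taken from the cited systems.

    # CamShift-based hand-region tracking (illustrative; ROI is chosen by the user).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    x, y, w, h = cv2.selectROI("select hand region", frame, False)   # initial hand ROI
    cv2.destroyWindow("select hand region")
    track_window = (x, y, w, h)

    # The hue histogram of the selected region serves as the tracking model.
    hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv_roi, (0, 60, 32), (180, 255, 255))        # assumed skin-like bounds
    hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        box = cv2.boxPoints(rot_rect).astype(np.int32)               # rotated box around the hand
        cv2.polylines(frame, [box], True, (0, 255, 0), 2)
        cv2.imshow("CamShift hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:
            break
    cap.release()
    cv2.destroyAllWindows()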

2.3 Tracking process

Tracking processing uses real-time sensor or visual detection system responses to reflect the position and
state of assembly parts in an enhanced space. The sensors have the advantages of high accuracy, low delay,
low jitter, etc., which can be adapted to the tracking of parts in various assembly scenes, mainly including
electromagnetic sensors[26] and optical sensors[27]. Visual detection systems are divided into two types: marker-based tracking and markerless tracking. The former has good robustness and accuracy but suffers from optical occlusion problems[28]. The latter is not restricted by markers but has weak adaptive capability[29].
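A minimal sketch of marker-based visual tracking is shown below, assuming the opencv-contrib-python package (version 4.7 or later) for its cv2.aruco module; the marker dictionary and camera index are arbitrary choices for illustration.

    # Marker-based visual tracking with ArUco tags (illustrative sketch).
    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = detector.detectMarkers(gray)    # detect labeled markers in the frame
        if ids is not None:
            cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        cv2.imshow("marker tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:
            break
    cap.release()
    cv2.destroyAllWindows()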

2.4 Interaction handling

Interactive processing refers to using interactive AR systems such as glove interactive devices and desktop
interactive devices to combine a real assembly environment with virtual assembly parts[30]. Glove
interactive devices represented by a Hand Exoskeleton Device can capture hand movements and assembly
gestures in real time[31]. Desktop interactive devices such as the Phantom track the user's hand position and assembly force. Sukan et al.[32] built an interactive assembly system named ParaFrustum, which allows users to
view target assembly objects from appropriate viewpoints, thus avoiding the problem of viewpoint
occlusion.

2.5 Assembly information management


Assembly information management is used to store all assembly information extracted from a CAD model,
and it enables users to access and modify the implementation information during the AR assembly
operation without relying on a CAD system[33]. Raghavan et al. constructed the Liaison Graph system to
manage assembly information, which makes it convenient for designers to plan various assembly sequence
schemes[34]. Henderson and Feiner used an intelligent intention-based illustration system to dynamically
generate assembly process charts[35].
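As a toy illustration of how such assembly information might be stored and queried independently of a CAD system, the sketch below represents a liaison graph and a set of precedence constraints for a made-up four-part assembly and enumerates the feasible assembly sequences; the part names and constraints are purely hypothetical.

    # Toy assembly-information store: a liaison graph plus precedence constraints.
    from itertools import permutations

    liaisons = {            # which parts are in contact in the final assembly
        "base": {"shaft", "cover"},
        "shaft": {"base", "gear"},
        "gear": {"shaft"},
        "cover": {"base"},
    }
    precedence = [("base", "shaft"), ("shaft", "gear"), ("base", "cover")]   # (before, after)

    def feasible_sequences(parts, precedence):
        """Enumerate assembly orders that respect every precedence constraint."""
        for order in permutations(parts):
            pos = {p: i for i, p in enumerate(order)}
            if all(pos[a] < pos[b] for a, b in precedence):
                yield order

    for seq in feasible_sequences(liaisons.keys(), precedence):
        print(" -> ".join(seq))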

2.6 Rendering
Rendering is a process used to add parameters to the assembly model in the AR environment and output
them in the form of video to reduce the problems of insufficient information and accuracy in the
reconstruction of the assembly model. Rendering is divided into two categories: visual rendering and
haptic rendering. The former obtains the visual representation of assembly data through OpenGL and Open
Scene Graph[36], while the latter obtains the force feedback of users' contact through tactile devices[37].
In summary, an AR assembly system captures video through a camera to support the subsequent visual tracking of the parts to be assembled. Images of the assembly operation are processed by graphics algorithms, and the real-time responses of sensors or a visual detection system are used to determine the position and status of the assembled parts in the augmented space. The real assembly environment is combined with the virtual assembly parts through interactive equipment, and the parameters of the assembly model are supplemented by rendering in the AR environment.

3 Key technologies of AR-based digital assembly


Digital assembly is used to analyze the assembly process and evaluate the error transfer of the product
digital model through a simulation method, and to verify whether the assembly sequence, assembly path,
and stress deformation of parts meet engineering requirements. It is very useful in guiding the actual parts
manufacturing and assembly operation, improving the efficiency of product research and development, and
reducing costs. The existing research on digital assembly of products based on AR technology is divided
into several aspects, such as virtual and real fusion modeling, assembly scene perception, assembly
operation navigation, and assembly process collaborative design.

3.1 Virtual reality fusion modeling technology

Virtual and real parts fusion modeling is a key technology used to obtain the position of virtual objects in
the real world through a camera, calculate their motion state in real time by using a tracking method, and
ensure the consistency of information of the parts in the two environments to perfectly integrate virtual
objects with the real environment. It is mainly realized by 3D registration technology.
Most existing 3D registration methods in AR use planar markers as positioning references, which makes the system structure complex; image processing requires a large amount of computation, and errors can easily be introduced into the process. Zhou et al. proposed a vision-based 3D registration
method using stereoscopic markers[38]. This method only needs a color CCD camera to complete the 3D
environment registration, which can effectively
simplify the registration system and algorithm, and
eliminate the matching errors when using multiple
sensors. Li et al. proposed a non-iterative solution
based on an efficient lookup table (LUT) to solve
the problem when only four corners of a square AR
tag are available[39]. The main idea is to extract the
key parameter β from the camera pose and create
the LUT of β by considering the symmetrical
structure of the square mark. Thus, a 3D
registration scheme with high stability can be
realized in the presence of noise, as shown in
Figure 2.

Figure 2 AR 3D registration based on an efficient visual lookup table[39].

Leu et al. developed a simulation system that acquired data from a 3D object surface to build a CAD model, which then exchanged data with an AR system[40]. This system also provides motion capture,
mechanical modeling, multimodal rendering, and other operations, and integrates the design, planning,
evaluation, and testing functions of the assembly system. Zhang et al. proposed a method to realize 3D AR
registration by using the natural circle features of images[41]. This method makes use of a predefined plane
coordinate system and a circle feature in the target plane to estimate position and pose, and realizes 3D AR
registration according to the rotation matrix and translation vector. Hořejší confirmed the real-world spatial positions of marked points on the assembly in a software environment to define the reference plane and data; a software solution then processes webcam image data and overlays virtual 3D model instructions on the real image for display, and the method shortens the time required for assembly tasks[42]. Kim et al. proposed an AR method that associates a CAD model with a ship hull image[43]. The method uses a digital camera to obtain a 2D hull image and extracts features from this image; image blocks are then set up, the correspondence between the blocks and the CAD model is used to calculate an initial position, and a Lie algebra-based method is applied to reduce the registration error and achieve a more precise 3D registration process. Wang et al.[44] proposed a 3D registration and tracking method based on the fusion of a
model and natural feature points in which the linear parallel multimodal template matching method is
adopted to quickly identify the target object, obtain the reference view close to the current perspective, and
complete the rough estimation of camera posture. The RPnP algorithm is introduced to improve the
accuracy and speed of registration and tracking to overcome the lack of texture features and to reduce the
search space.
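Although the cited methods differ in how they solve for the camera pose (LUT, circle features, RPnP), they all ultimately recover a rotation and translation from 2D-3D correspondences. The sketch below shows this generic step with OpenCV's solvePnP for four assumed square-tag corners and a hypothetical camera matrix, then projects a virtual point into the image; it is not an implementation of any specific method above.

    # Generic 2D-3D pose recovery for AR registration (illustrative values throughout).
    import cv2
    import numpy as np

    # 3D corners of a 40 mm square tag in its own coordinate frame (metres).
    s = 0.04
    object_pts = np.array([[-s/2,  s/2, 0], [ s/2,  s/2, 0],
                           [ s/2, -s/2, 0], [-s/2, -s/2, 0]], dtype=np.float64)

    # Corresponding image corners (pixels) as a detector might report them (assumed).
    image_pts = np.array([[310, 220], [400, 225], [395, 315], [305, 310]], dtype=np.float64)

    # Assumed pinhole intrinsics; a real system would use calibrated values.
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
    dist = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    R, _ = cv2.Rodrigues(rvec)                  # rotation matrix of the tag w.r.t. the camera
    print("tag position in camera frame (m):", tvec.ravel())

    # Registering a virtual object: project a point 30 mm above the tag centre.
    virtual_pt = np.array([[0.0, 0.0, 0.03]])
    pixel, _ = cv2.projectPoints(virtual_pt, rvec, tvec, K, dist)
    print("virtual point projects to pixel:", pixel.ravel())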

3.2 Assembly scene perception technology

In assembly scene perception, human errors in an AR assembly system are automatically identified by
tracking the user status in real time and displaying instructions in corresponding states to complete each
assembly work step and improve assembly efficiency and accuracy. Rentzos et al. proposed a scene
perception system integrating existing information and knowledge in a CAD/PDM system[45]. Based on the
semantics of products and processes, it supports complex tasks related to assembly and can be used to
assist workshop operators in completing the final assembly of products, as shown in Figure 3.

Figure 3 Template instructions of specific assembly steps based on semantics[45].

Zhu et al. proposed a wearable AR auxiliary


system that integrates a Virtual Personal Assistant
(VPA) to provide natural language-based
interaction, user status recognition, location-aware
feedback, and other services to complete assembly
and maintenance tasks in the industry[46].
Through the real-time feedback of ergonomics,
researchers at home and abroad have studied the
changes in an operator's work efficiency based on
AR system scene perception to reduce the impact
of human fatigue during product assembly tasks.
Chen et al. proposed an adaptive guidance scene
display method that displays the synthetic
guidance scene from the best point of view in the
appropriate area[47]. Damen et al. proposed a real-time AR guidance system to sense some activities near the space where the user is currently located[26]. Vignais et al. proposed a system
combining a sensor network with AR technology to evaluate manual assembly tasks in real time, and to
feed ergonomics data back in real time during this period[48].
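As a rough sketch of the kind of real-time ergonomic feedback described above, the snippet below computes an elbow flexion angle from three tracked joint positions and flags postures outside an assumed comfort range; the joint coordinates and thresholds are invented for illustration and are not taken from the cited systems.

    # Toy real-time ergonomic check: flag uncomfortable elbow flexion from tracked joints.
    import numpy as np

    def elbow_angle(shoulder, elbow, wrist):
        """Angle (degrees) at the elbow formed by the upper arm and the forearm."""
        u = np.asarray(shoulder) - np.asarray(elbow)
        v = np.asarray(wrist) - np.asarray(elbow)
        cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    COMFORT_RANGE = (60.0, 150.0)     # assumed acceptable flexion range in degrees

    def ergonomic_feedback(shoulder, elbow, wrist):
        angle = elbow_angle(shoulder, elbow, wrist)
        low, high = COMFORT_RANGE
        if angle < low or angle > high:
            return f"warning: elbow at {angle:.0f} deg, adjust posture"
        return f"elbow at {angle:.0f} deg, ok"

    # Example with made-up sensor readings (metres, in some tracker frame).
    print(ergonomic_feedback([0.0, 1.4, 0.0], [0.25, 1.15, 0.05], [0.45, 1.25, 0.15]))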

3.3 Assembly operation guidance technology

AR technology integrates a virtual model and real-world data to estimate the position and pose relationship
between parts to be assembled in real time, and generates guidance information regarding the assembly
mode in a mixed environment so that operators can complete assembly operations through interactive
navigation and thus improve assembly efficiency.
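A minimal sketch of such guidance generation is given below: given the tracked pose of a part and its target pose on the base (both assumed to come from the tracking module), it computes the residual transform and emits a textual hint. The pose values and tolerances are hypothetical.

    # Toy guidance step: compare a tracked part pose with its target pose on the base
    # and emit a textual hint. Poses are 4x4 homogeneous matrices (assumed from tracking).
    import numpy as np

    def guidance(T_part, T_target, pos_tol=0.005, ang_tol_deg=5.0):
        """Return a hint describing how far the part is from its target placement."""
        T_err = np.linalg.inv(T_target) @ T_part        # error transform in the target frame
        dpos = T_err[:3, 3]
        # Rotation error angle from the trace of the 3x3 rotation block.
        cos_a = (np.trace(T_err[:3, :3]) - 1.0) / 2.0
        ang = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if np.linalg.norm(dpos) <= pos_tol and ang <= ang_tol_deg:
            return "part aligned: insert and fasten"
        return (f"move part by {(-dpos * 1000).round(1)} mm (in target frame), "
                f"rotate {ang:.1f} deg to align")

    # Example: part is 20 mm off along x and slightly rotated about z (made-up values).
    c, s = np.cos(np.radians(8)), np.sin(np.radians(8))
    T_part = np.array([[c, -s, 0, 0.02], [s, c, 0, 0.0], [0, 0, 1, 0.0], [0, 0, 0, 1.0]])
    T_target = np.eye(4)
    print(guidance(T_part, T_target))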
Wang et al. proposed a practical virtual component interaction system based on AR, as shown in Figure 4[49,50]. Users can employ natural gestures for interface operation during assembly simulation, and the real movement of a virtual component is determined through analysis of the constraints and the contact force between the model and the user. An automobile clutch case study is presented to verify the effectiveness and intuitiveness of the system.

Figure 4 Virtual assembly of a clutch based on natural gestures[50].
Liu et al. estimated the poses of the base part and the parts to be assembled and obtained their relative position relationship in real time for position detection, thereby realizing the registration of AR assembly navigation information in real scenes[52]. The method was then applied to assembly examples of a car door plate connector and a car window drive motor. Wang and Dunston provided a powerful industrial training tool through AR, which can directly link instructions on how to perform service tasks to the machine parts to be processed, improve the training program, and reuse valuable existing training materials to help technicians acquire new maintenance skills[53]. Webel et al.[54] developed an assembly skills training platform based on multimodal AR to display detailed information such as text in an interactive AR environment, and introduced a novel
concept for displaying location-related information in an AR environment to compensate for the
uncertainty of tracking. Damiani et al. considered using augmented and virtual reality technologies for
workforce training to enable human resources to effectively interact with manufacturing information[55].
Zhu et al. believe that AR-rendered content should not be "read only" but should offer technical personnel ways to interact with the AR content, and they therefore proposed an authorable context-aware AR system (ACARS) to assist maintenance technicians[18]. ACARS is context-aware and authorable, and it allows AR developers to create context-related information through a desktop 2D user interface in cooperation with maintenance technicians. Hu et al. considered a G25-1 marine single-screw
pump as the research object, matched its 3D CAD model with actual parts based on image recognition,
developed AR learning auxiliary software by using the Unity3D engine and EasyAR software development
kit, and realized a virtual disassembly training and dynamic operation demonstration of the single-screw
pump along with other functions[56].

3.4 Assembly process collaborative design technology

Assembly process collaborative design is an effective means in the AR process of product manufacturing
and assembly, which associates product design, factory layout planning, parts processing, and the product
assembly behavior of designers, planners, operators, and other personnel to facilitate their communication
with the goal of completing the entire product life cycle together. Liverani et al. proposed the concept of a
personal active assistant (PAA), which can be wirelessly connected to a design workstation to enhance the information presented to workers through head-mounted displays (HMDs)[57]. Ong et al. proposed a remote
collaboration system so that distributed users can
view the product model from different perspectives, as shown in Figure 5[58]. Collaborative AR
provides significant advantages to assembly tasks,
including flexible collaboration time, complete
knowledge retention, in-depth understanding of
problems, etc., which can be perfectly integrated
with the existing workflow.
Most collaborative AR studies focus only on the
interactions of different users with enhanced
scenarios, while few focus on the interactions and
collaborations among users.
Oda and Feiner[59] designed gesturing in an augmented reality depth-mapped environment (GARDEN), which can use sphere projection to
enhance the information transfer among users.
Ranatunga et al. developed a system that allows experts to use multi-touch gestures to translate, rotate, and annotate objects in video for remote guidance in collaboration with other users[60].

Figure 5 Architecture of the remote collaboration system based on constraints[58].


4 Current hotspot and development trend: AR-based assembly combined with DT
At present, most AR-based assembly systems perform assembly navigation based on a static model and idealized data. During the assembly process, real-time adjustment and optimization cannot be carried out based on the performance state of the assembly, which may lead to unstable assembly quality.
In recent years, DT has attracted significant attention from well-known universities, research institutes, and enterprises because it spans the entire product life cycle, maps from the real to the virtual, and can control real actions through virtual ones. Gartner, a leading IT research and consulting firm, has
listed DT as one of the top ten strategic technology development trends of the year for three consecutive
years (from 2017 to 2019) [61]. Through virtual and real interactive feedback, data fusion analysis, decision-
making iterative optimization, and other means, DT simulates the operating behavior of physical entities in
the real environment with a virtual model to provide a bridge between the physical world and the
information world over the entire life cycle of products[62]. Along with the synchronous evolution of real
products in the life cycle, DT adds support or extended functions to physical entities[63], providing new
ideas and means for solving assembly problems.
There have been some reports on the application of DT technology in product twinning modeling and
assembly precision analysis. For example, Damiani et al. solved the processing problems associated with
simultaneously accessing massive data having many internal semantic dependencies[64]. Söderberg et al. [51]
and Wärmefjord et al.[65] summarized the real-time data extraction and performance simulation optimization
technologies required to build a DT model that ensures the geometric accuracy of products in the
manufacturing process, including positioning scheme optimization, statistical analysis, virtual edge cutting,
assembly sequence optimization, and other methods. Based on the concept of skin model shapes, Schleich et al.[66] constructed a comprehensive reference model that considers flexibility, expansibility, operability, and fidelity, aiming to design digital twins of actual physical products for
assembly analysis. Wang[67] proposed a DT-based assembly docking technology for low-pressure turbine units of aero engines, using multiple sensors to map and connect data between digital models and physical objects. With the aid of physical terminal control, the system realizes real-time posture adjustments during installation of the low-pressure turbine unit and achieves the docking of the aero-engine low-pressure turbine unit through a 3D virtual assembly simulation that fuses physical objects, models, and data. Zhang et al. constructed the DT of a spacecraft to express
its in-orbit assembly process, status, and behavior, and analyzed the data composition, implementation
mode, and function of the DT according to the structural composition and functional requirements of the
spacecraft[68]. Finally, they proposed the implementation approach of the DT for the entire life cycle of the
spacecraft.
Simply stated, it is of great significance to use twin data to construct parts and assembly twins with real
characteristics and to develop design assembly analysis methods suitable for twin models to accurately
predict the assembly accuracy of products and guide the assembly behavior of real products. Through real
data-driven assembly models, the assembly state of the physical model can be synchronized with the
virtual model in real time, the assembly performance can be predicted, and optimal assembly operation can
be provided. Therefore, the integration of AR and DT will be the most promising development direction
for high-precision intelligent assembly technology in the future.
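As a highly simplified sketch of this data-driven loop, the snippet below lets streaming physical measurements update a twin state, from which an assembly deviation is predicted and guidance is fed back; the twin class, the gap signal, and the tolerance are hypothetical and stand in for the far richer models discussed above.

    # Minimal sketch of a data-driven assembly twin loop: sensor readings update the
    # virtual model, which predicts an assembly deviation and feeds guidance back.
    from dataclasses import dataclass, field

    @dataclass
    class AssemblyTwin:
        nominal_gap_mm: float = 0.10            # designed clearance between mating parts
        measured_gaps: list = field(default_factory=list)

        def update(self, measured_gap_mm: float) -> None:
            """Fuse a new physical measurement into the twin state."""
            self.measured_gaps.append(measured_gap_mm)

        def predicted_deviation(self) -> float:
            """Simple prediction: mean measured gap minus the nominal value."""
            if not self.measured_gaps:
                return 0.0
            return sum(self.measured_gaps) / len(self.measured_gaps) - self.nominal_gap_mm

        def guidance(self, tol_mm: float = 0.05) -> str:
            dev = self.predicted_deviation()
            if abs(dev) <= tol_mm:
                return f"deviation {dev:+.3f} mm within tolerance, continue assembly"
            return f"deviation {dev:+.3f} mm out of tolerance, adjust shim / re-seat part"

    twin = AssemblyTwin()
    for gap in (0.12, 0.14, 0.22):              # pretend streaming measurements (mm)
        twin.update(gap)
        print(twin.guidance())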

5 Conclusion
(1) AR and DT technologies are research hotspots in the field of digital assembly of complex products. This
paper reviews the concepts and key technologies behind AR, introduces the structure of a typical AR
assembly system, and analyzes the connotation of DT and its application in digital assembly.
(2) AR-based digital assembly technology is applied to realize virtual reality fusion modeling, assembly
scene perception, assembly operation navigation, assembly process collaborative design, etc. Through the
interaction between virtual assembly objects and the real assembly environment, the quality and efficiency
of assembly design are effectively improved.
(3) A DT-driven digital assembly model simulates the assembly behavior of physical entities in the real
environment. With the synchronous evolution of real products in the life cycle, it greatly improves the
accuracy of product assembly performance prediction.
(4) A DT system synchronizes the assembly state of the physical model and predicts the assembly
performance. It allows for real-time update of assembly guidance through AR, and for the prediction and
guarantee of assembly performance. The integration of AR and DT technology is bound to become an
important development trend regarding intelligent assembly technology involving complex products.

References

1 Carmigniani J, Furht B, Anisetti M, Ceravolo P, Damiani E, Ivkovic M. Augmented reality technologies, systems and
applications. Multimedia Tools and Applications, 2011, 51(1): 341–377
2 Azuma R T. A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 1997, 6(4): 355–385
DOI:10.1162/pres.1997.6.4.355
3 van Doormaal T P C, van Doormaal J A M, Mensink T. Clinical accuracy of holographic navigation using point-based
registration on augmented-reality glasses. Operative Neurosurgery, 2019, 1–6
DOI:10.1093/ons/opz094
4 Nee A Y C, Ong S K, Chryssolouris G, Mourtzis D. Augmented reality applications in design and manufacturing. CIRP
Annals, 2012, 61(2): 657–679
DOI:10.1016/j.cirp.2012.05.010
5 Dini G, Mura M D. Application of augmented reality techniques in through-life engineering services. Procedia CIRP,
2015, 38: 14–23
DOI:10.1016/j.procir.2015.07.044
6 Didier J-Y, Roussel D, Mallem M, Otmane S, Naudet S, Pham Q-C, Bourgeois S, Mégard C, Leroux C, Hocquard A.
AMRA: Augmented Reality Assistance for Train Maintenance Tasks. In: Workshop Industrial Augmented Reality, 4th
ACM/IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2005). Vienna, Austria, IEEE, 2005
7 Haritos T, MacChiarella N D. A mobile application of augmented reality for aerospace maintenance training. In: 24th
Digital Avionics Systems Conference. Washington, DC, USA, IEEE, 2005
DOI:10.1109/dasc.2005.1563376
8 De Crescenzio F, Fantini M, Persiani F, Di Stefano L, Azzari P, Salti S. Augmented Reality for Aircraft Maintenance
Training and Operations Support. IEEE Computer Graphics and Applications, 2011, 31(1): 96–101
DOI:10.1109/MCG.2011.4
9 Dong S Y, Feng C, Kamat V R. Sensitivity analysis of augmented reality-assisted building damage reconnaissance using
virtual prototyping. Automation in Construction, 2013, 33: 24–36
DOI:10.1016/j.autcon.2012.09.005
10 Grieves M. Digital Twin: Manufacturing excellence through virtual factory replication. Whitepaper, 2015
11 Tao F, Liu W, Liu J, Liu X, Liu Q, Qu T, Xiang F. Digital twin and its potential application exploration. Computer
Integrated Manufacturing Systems, 2018, 24: 1–18
DOI:10.13196/j.cims.2018.01.001
12 Li C Z, Mahadevan S, Ling Y, Choze S, Wang L P. Dynamic Bayesian network for aircraft wing health monitoring
digital twin. AIAA Journal, 2017, 55(3): 930–941
DOI:10.2514/1.j055201
13 Revetria R, Tonelli F, Damiani L, Demartini M, Bisio F, Peruzzo N. A real-time mechanical structures monitoring
system based on digital twin, iot and augmented reality. In: 2019 Spring Simulation Conference (SpringSim). Tucson,
AZ, USA, IEEE, 2019
DOI:10.23919/springsim.2019.8732917
14 Ong S K, Yuan M L, Nee A Y C. Augmented reality applications in manufacturing: a survey. International Journal of
Production Research, 2008, 46(10): 2707–2742
DOI:10.1080/00207540601064773
15 Nee A Y C, Ong S K, Chryssolouris G, Mourtzis D. Augmented reality applications in design and manufacturing. CIRP
Annals, 2012, 61(2): 657–679
DOI:10.1016/j.cirp.2012.05.010
16 Wang X, Ong S K, Nee A Y C. A comprehensive survey of augmented reality assembly research. Advances in
Manufacturing, 2016, 4(1): 1–22
DOI:10.1007/s40436-015-0131-4
17 Henderson S J, Feiner S K. Augmented reality in the psychomotor phase of a procedural task. In: Mixed and Augmented
Reality (ISMAR), 2011 10th IEEE International Symposium. Basel, Switzerland, IEEE, 2011, 191–200
DOI: 10.1109/ISMAR.2011.6092386
18 Zhu J, Ong S K, Nee A Y C. An authorable context-aware augmented reality system to assist the maintenance
technicians. The International Journal of Advanced Manufacturing Technology, 2013, 66(9/11/12): 1699–1714
DOI:10.1007/s00170-012-4451-2
19 Benbelkacem S, Belhocine M, Bellarbi A, Zenatihenda N, Tadjine M. Augmented reality for photovoltaic pumping
systems maintenance tasks. Renewable Energy, 2013, 55(55): 428–437
DOI:10.1016/j.renene.2012.12.043
20 Radkowski R, Oliver J H. Natural Feature Tracking Augmented Reality for On-Site Assembly Assistance Systems.
International Conference on Virtual, Augmented and Mixed Reality, 2013, 8022: 281–290
DOI:10.1007/978-3-642-39420-1_30
21 Petersen N, Pagani A, Stricker D. Real-time modeling and tracking manual workflows from first-person vision. In: 2013
IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Adelaide, SA, Australia, IEEE, 2013, 117
–124
22 Wang Z B, Ong S K, Nee A Y C. Augmented reality aided interactive manual assembly design. The International Journal
of Advanced Manufacturing Technology, 2013, 69(5/6/7/8): 1311–1321
DOI:10.1007/s00170-013-5091-x
23 Wang Z B, Ng L X, Ong S K, Nee A Y C. Assembly planning and evaluation in an augmented reality environment.
International Journal of Production Research, 2013, 51(23/24): 7388–7404
DOI:10.1080/00207543.2013.837986
24 Khuong B M, Kiyokawa K, Miller A, La Viola J J, Mashita T, Takemura H. The effectiveness of an AR-based context-
aware assembly support system in object assembly. In: 2014 IEEE Virtual Reality (VR). Minneapolis, MN, USA, IEEE,
2014
DOI:10.1109/vr.2014.6802051
25 OpenCV. http://opencv.willowgarage.com/wiki/
26 Damen D M, Gee A, Mayol-Cuevas W, Calway A. Egocentric Real-time Workspace Monitoring using an RGB-D
camera. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. Vilamoura-Algarve, Portugal,
IEEE, 2012
DOI:10.1109/iros.2012.6385829
27 Henderson S J, Feiner S K. Augmented reality in the psychomotor phase of a procedural task. In: 2011 10th IEEE
International Symposium on Mixed and Augmented Reality. Basel, Switzerland, IEEE, 2011
DOI:10.1109/ismar.2011.6162888
28 Wang Y T, Shen Y, Liu D C, Wei S, Zhu C G. Key technique of assembly system in an augmented reality environment.
In: 2010 Second International Conference on Computer Modeling and Simulation. Sanya, China, IEEE, 2010
DOI:10.1109/iccms.2010.201
29 Andersen M, Andersen R, Larsen C, Moeslund T B, Madsen O. Interactive assembly guide using augmented reality.
Advances in Visual Computing. Berlin, Heidelberg, Springer, 2009, 999–1008
DOI:10.1007/978-3-642-10331-5_93
30 Yuan M L, Ong S K, Nee A Y C. Augmented reality for assembly guidance using a virtual interactive tool. International
Journal of Production Research, 2008, 46(7): 1745–1767
DOI:10.1080/00207540600972935
31 Charoenseang S, Panjan S. 5-finger exoskeleton for assembly training in augmented reality. Virtual and Mixed Reality-
New Trends. Berlin, Heidelberg, Springer, 2011, 30–39
DOI:10.1007/978-3-642-22021-0_4
32 Sukan M, Elvezio C, Oda O, Feiner S, Tversky B. ParaFrustum: visualization techniques for guiding a user to a
constrained set of viewing positions and orientations. In: Proceedings of the 27th annual ACM symposium on User
interface software and technology. Honolulu, Hawaii, USA, ACM, 2014: 331–340
33 Wang X, Ong S, Nee A. Augmented reality interfaces for industrial assembly design and planning. In: Proceedings of
Eighth International Conference on Interfaces and Human Computer Interaction. Lisbon, 2014, 83–90
34 Raghavan V, Molineros J, Sharma R. Interactive evaluation of assembly sequences using augmented reality. IEEE
Transactions on Robotics and Automation, 1999, 15(3): 435–449
DOI:10.1109/70.768177
35 Henderson S, Feiner S. Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE
Transactions on Visualization and Computer Graphics, 2011, 17(10): 1355–1368
DOI:10.1109/tvcg.2010.245
36 Hou L, Wang X Y, Bernold L, Love P E D. Using animated augmented reality to cognitively guide assembly. Journal of
Computing in Civil Engineering, 2013, 27(5): 439–451
DOI:10.1061/(asce)cp.1943-5487.0000184
37 Gavish N, Gutiérrez T, Webel S, Rodríguez J, Peveri M, Bockholt U, Tecchia F. Evaluating virtual reality and
augmented reality training for industrial maintenance and assembly tasks. Interactive Learning Environments, 2015, 23
(6): 778–798
DOI:10.1080/10494820.2013.815221
38 Zhou Y, Yan D Y, Wang Y T, Chang H, Xu T. A registration method in augmented reality. Journal of Image and
Graphics, 2000, 5(5): 430–433 (in Chinese)
39 Li S Q, Xu C. Efficient lookup table-based camera pose estimation for augmented reality. Computer Animation and
Virtual Worlds, 2011, 22(1): 47–58
DOI:10.1002/cav.385
40 Leu M C, ElMaraghy H A, Nee A Y C, Ong S K, Lanzetta M, Putz M, Zhu W J, Bernard A. CAD model based virtual
assembly simulation, planning and training. CIRP Annals, 2013, 62(2): 799–822
DOI:10.1016/j.cirp.2013.05.005
41 Zhang Y, Li M, Zeng B. The 3D registration based on the single circle of image and application in augmented assembly.
Mechanical Science And Technology for Aerospace Engineering, 2014, 33(6): 881–885
42 Hořejší P. Augmented reality system for virtual training of parts assembly. Procedia Engineering, 2015, 100: 699–706
DOI:10.1016/j.proeng.2015.01.422
43 Kim D, Park J, Ko K H. Development of an AR based method for augmentation of 3D CAD data onto a real ship block
image. Computer-Aided Design, 2018, 98: 1–11
44 Wang Y, Zhang S, He W P, Bai X L. Model-based marker-less 3D tracking approach for augmented reality[J]. Journal of
Shanghai Jiaotong University, 2018, 52(1):83–89 (in Chinese)
DOI:10.16183/j.cnki.jsjtu.2018.01.013
45 Rentzos L, Papanastasiou S, Papakostas N, Chryssolouris G. Augmented reality for human-based assembly: using
product and process semantics. IFAC Proceedings Volumes, 2013, 46(15): 98–101
DOI:10.3182/20130811-5-us-2037.00053
46 Zhu Z W, Branzoi V, Wolverton M, Murray G, Vitovitch N, Yarnall L, Acharya G, Samarasekera S, Kumar R. AR-
mentor: Augmented reality based mentoring system. In: 2014 IEEE International Symposium on Mixed and Augmented
Reality (ISMAR). Munich, Germany, IEEE, 2014
DOI:10.1109/ismar.2014.6948404
47 Chen C J, Hong J, Wang S F. Automated positioning of 3D virtual scene in AR-based assembly and disassembly guiding
system. The International Journal of Advanced Manufacturing Technology, 2015, 76: 753–764
DOI:10.1007/s00170-014-6321-6
48 Vignais N, Miezal M, Bleser G, Mura K, Gorecky D, Marin F. Innovative system for real-time ergonomic feedback in
industrial manufacturing. Applied Ergonomics, 2013, 44(4): 566–574
DOI:10.1016/j.apergo.2012.11.008
49 Wang X, Ong S K, Nee A Y C. Augmented reality interfaces for industrial assembly design and planning. In:
Proceedings of the Eighth International Conference on Interfaces and Human Computer Interaction. Lisbon, Portugal,
2014, 83–90
50 Wang X, Ong S K, Nee A Y C. Real-virtual components interaction for assembly simulation and planning. Robotics and
Computer-Integrated Manufacturing, 2016, 41, 102–114
51 Söderberg R, Wärmefjord K, Carlson J S, Lindkvist L. Toward a Digital Twin for real-time geometry assurance in
individualized production. CIRP Annals, 2017, 66(1): 137–140
DOI:10.1016/j.cirp.2017.04.038
52 Liu R, Fan X, Yin X Y, Wang J K, Yang X. Research on combinational pose estimation of base part and assembly part
for augmented reality aided assembly. Machine Design and Research, 2018, 34(6): 119–125 (in Chinese)
53 Wang X, Dunston P S. Design, strategies, and issues towards an augmented reality-based construction training platform.
Journal of Information Technology in Construction (ITcon), 2007, 12(25): 363–380
54 Webel S, Bockholt U, Engelke T, Gavish N, Olbrich M, Preusche C. An augmented reality training platform for
assembly and maintenance skills. Robotics and Autonomous Systems, 2013, 61(4): 398–403
DOI:10.1016/j.robot.2012.09.013
55 Damiani L, Demartini M, Guizzi G, Revetria R, Tonelli F. Augmented and virtual reality applications in industrial
systems: A qualitative review towards the industry 4.0 era. IFAC-PapersOnLine, 2018, 51(11): 624–630
DOI:10.1016/j.ifacol.2018.08.388
56 Hu J Z. Design of auxiliary learning software of augmented reality technique based on image recognition. Mechanical
Engineering & Automation, 2018(3), 38–40
57 Liverani A, Amati G, Caligiana G. A CAD-augmented reality integrated environment for assembly sequence check and
interactive validation. Concurrent Engineering, 2004, 12(1): 67–77
DOI:10.1177/1063293X04042469
58 Ong S K, Shen Y. A mixed reality environment for collaborative product design and development. CIRP Annals, 2009,
58(1): 139–142
DOI:10.1016/j.cirp.2009.03.020
59 Oda O, Feiner S. 3D referencing techniques for physical objects in shared augmented reality. 2012 IEEE International
Symposium on Mixed and Augmented Reality (ISMAR). Atlanta, GA, USA, IEEE, 2012, 207–215
DOI: 10.1109/ISMAR.2012.6402558
60 Ranatunga D, Feng D, Adcock M, Thomas B. Towards object based manipulation in remote guidance. In: 2013 IEEE
International Symposium on Mixed and Augmented Reality (ISMAR). Adelaide, Australia, IEEE, 2013
DOI:10.1109/ismar.2013.6671839
61 Gartner Top 10 Strategic Technology Trends for 2019. https://www. gartner. com/smarterwithgartner/gartner-top-10-
strategic-technology-trends-for-2019/
62 Tao F, Zhang M. Digital twin shop-floor: A new shop-floor paradigm towards smart manufacturing. IEEE Access, 2017,
5: 20418–20427
DOI:10.1109/access.2017.2756069
63 Tao F, Zhang M, Cheng J F, Qi Q L. Digital twin workshop: a new paradigm for future workshop. Computer Integrated
Manufacturing Systems, 2017, 23(1): 1–9 (in Chinese)
DOI:10.13196/j.cims.2017.01.001
64 Damiani L, Demartini M, Giribone P, Maggiani M, Revetria R, Tonelli F. Simulation and Digital Twin Based Design of
a Production Line: A Case Study. In: Proceedings of the International MultiConference of Engineers and Computer
Scientists. 2018
65 Wärmefjord K, Söderberg R, Lindkvist L, Lindau B, Carlson J S. Inspection Data to Support a Digital Twin for
Geometry Assurance. In: ASME 2017 International Mechanical Engineering Congress and Exposition. American
Society of Mechanical Engineers, 2017
66 Schleich B, Anwer N, Mathieu L, Wartzack S. Shaping the digital twin for design and production engineering. CIRP
Annals, 2017, 66(1): 141–144
67 Wang L. Research on the docking technology of final installation for aeroengine low pressure turbine unit based on
digital twin. Computer Measurement & Control, 2018, 26(10): 286–290, 303 (in Chinese)
68 Zhang Y L, Zhang J P, Wang X D, Chen X B. Digital twin technology for spacecraft on-orbit assembly. Navigation and
Control, 2018, 17(3): 75–82 (in Chinese)
