
Integrated product design and assembly planning in an augmented reality environment

L.X. Ng
NUS Graduate School for Integrative Sciences and Engineering, National University of Singapore, Singapore

Z.B. Wang
Mechanical Engineering Department, National University of Singapore, Singapore, and

S.K. Ong and A.Y.C. Nee

NUS Graduate School for Integrative Sciences and Engineering & Mechanical Engineering Department,
National University of Singapore, Singapore
Purpose – The purpose of this paper is to present a methodology that integrates design and assembly planning in an augmented reality (AR) environment. Intuitive bare-hand interactions (BHIs) and a combination of virtual and real objects are used to perform the design and assembly tasks. Ergonomics and other assembly factors are analysed during assembly evaluation.
Design/methodology/approach – An AR design and assembly (ARDnA) system has been developed to implement the proposed methodology. For design generation, 3D models are created and combined like building blocks, taking the product assembly into account in the early design stage. Detailed design can be performed on the components, and the manual assembly process is simulated to evaluate the assembly design.
Findings – A case study of the design and assembly of a toy car is conducted to demonstrate the application of the methodology and system.
Research limitations/implications – The system allows the users to consider the assembly of a product when generating the design of the components. BHI allows the users to create and interact with the virtual models with their hands. Assembly evaluation is more realistic and takes the ergonomics issues during assembly into consideration.
Originality/value – The system synthesizes AR, BHI and CAD software to provide an integrated approach for design and assembly planning, intuitive and realistic interaction with virtual models, and holistic assembly evaluation.
Keywords Integrated design and assembly, Augmented reality, Bare-hand interaction, Design and assembly evaluation, Ergonomics, Design
Paper type Research paper

The current issue and full text archive of this journal is available at

Assembly Automation, 33/4 (2013), 345-359
© Emerald Group Publishing Limited [ISSN 0144-5154]
[DOI 10.1108/AA-10-2012-058]

1. Introduction

Product design and assembly planning are critical to the product development (PD) process. Generally, the design of the components is generated before their assembly and assembly sequence are decided. The assembly is evaluated and modifications are made when required. Design for assembly (DFA) (Boothroyd et al., 2001) aims to design products with assembly as a priority, by reducing the number of components and designing easy-to-assemble components based on DFA rules. Ergonomics in assembly design studies the layout of the workspace, the assembly sequence, the handling of tools and components, and the effects on the human operator. DFA tools allocate time to the human actions required for assembly, but the effects of these actions on the operator are not considered.

Virtual reality (VR) assembly tools create an immersive VR environment with 3D representations of a product, the human operator and the work environment. Advanced 3D human-computer interaction (HCI) interfaces are used to simulate the manual assembly process. Haptic devices (Coutee et al., 2001; Kim and Vance, 2004; Seth et al., 2006; Gupta et al., 1997) and data gloves (Jayaram et al., 1999; Wan et al., 2004) are implemented in VR assembly systems to provide haptic feedback during assembly simulation. The part-to-part and hand-to-part physical interactions are modelled and simulated to provide haptic feedback (Lim et al., 2007). VR assembly systems can be used to observe the operator's behaviour and analyse complex assembly issues (Chryssolouris et al., 2000; Jayaram et al., 2007). Such systems model the interactions of the actual assembly to address ergonomics issues. A VR assembly system has to support geometric, design and assembly rules, perform collision detection, identify and manage inter-part constraints, support physics-based modelling and provide high-fidelity interactions for ergonomics analysis (Seth et al., 2010). The last requirement may be difficult and computationally costly to achieve in a totally virtual assembly system, as the simulation environment is entirely virtual and the various conditions and elements that can affect the assembly process have to be modelled.

Augmented reality (AR) can augment simulation results in a real workspace. AR systems track the real objects and environment and register the virtual contents onto them. AR has been implemented for assembly guidance with contextual information and instructions (Wiedenmaier et al., 2003; Andersen et al., 2009; Zhang et al., 2011; Chimienti et al., 2010), assembly design (Lee et al., 2010) and assembly



sequences evaluation (Raghavan et al., 1999), and the integration of assembly feature design with workplace design (Ong et al., 2007). Data gloves (Valentini, 2009) and bare-hand interaction (BHI) (Ong and Wang, 2011) can be used in AR assembly systems for assembly performance evaluation, e.g. assembly sequence planning (Liverani et al., 2004).

AR is potentially a cheaper and more realistic tool for assembly design than VR. AR systems can utilize established techniques in VR and evaluate the assembly more realistically with a combination of virtual and real objects in the actual workspace. The introduction of AR in assembly design complements existing methods, e.g. DFA and CAD. In this paper, an integrated design and assembly planning methodology is proposed and implemented in an AR design and assembly (ARDnA) system using BHI. ARDnA superimposes 3D virtual models onto the real objects and environment, and tracks the user's bare hands to achieve BHI for HCI. The users can create, manipulate, grab and assemble the virtual models using their bare hands. The design generated can be refined using CAD software, and assembly evaluation and an ergonomics study can be conducted.

The contributions of the research presented in this paper are threefold. First, integrated design and assembly is achieved through a constructive approach for generating designs using BHI and constructive solid geometry (CSG). Second, intuitive design and assembly evaluation can be achieved, whereby virtual 3D models can be manipulated like real objects. Third, design and assembly with real and virtual objects can be performed for the creation and evaluation of realistic augmented prototypes.

The paper is organized as follows. Section 2 presents the integrated design and assembly methodology. Section 3 describes the ARDnA system. Section 4 presents the BHI and hand strain determination methodologies. Section 5 describes the design generation process. Section 6 presents the assembly evaluation process. A case study is presented in Section 7. Section 8 concludes this paper.

2. Integrated design and assembly planning

The proposed methodology is an integrated approach in which design and assembly tasks are synergised and complemented with AR for realistic interaction. Basic components can be created and assembled by the users. BHI enhances the realism of the design and assembly tasks by replicating the design and manual assembly in context. The methodology supports the design of new products and consists of two phases, namely, design generation and assembly evaluation.

During design generation, a product is created by constructing, assembling and combining virtual primitives of the components to form the product, supported by BHI. This is analogous to building with constructive block sets and allows the user to think and experiment with his hands. The general assembly of the product is developed and the CAD part and assembly files are generated. As it is difficult to perform detailed design of the components using BHI, precise design modifications are performed using CAD software. The assembly design evaluation phase commences when the detailed design of the components has been completed.

In the assembly design evaluation phase, the components are assembled sequentially. An assembly sequence is evaluated objectively, taking into account parameters such as the time taken, the number of orientation changes, errors and detected hand strains. The parameters of different sequences are compared to determine an optimal sequence. A survey is conducted at the end of an assembly process to evaluate the assembly subjectively. General minor editing of the components can be performed. Figure 1 shows the workflow of the methodology.

Figure 1 Integrated design and assembly planning methodology

3. ARDnA system architecture

The system hardware (Figure 2) consists of a desktop computer (dual-core 2.20 GHz processor, 4 GB SDRAM and 512 MB graphics card), a stereo camera (PGR BumbleBee2), a web camera (PGR Firefly2), an LCD monitor and a head-mounted display (HMD) (Vuzix Wrap 920). The system is developed in C/C++ using Visual Studio 2008, and open-source APIs and libraries, such as OpenCV (OpenCV, 2012) for image processing, ARToolkit (ARToolkit, 2007) for marker tracking, the FlyCapture SDK for stereo imaging (PGR, 2011), the SolidWorks API (SolidWorks Corporation, 2012) for CAD modelling, OpenGL (OpenGL, 1997) for 3D rendering, and V-Collide (V-Collide, 1998) for collision detection.

In this system, design generation and assembly evaluation are performed in a real environment, which can be the actual assembly workspace. In this AR design environment, 3D models can be created and registered to achieve interaction between the virtual and real objects and the environment. Real objects can be used for design generation by reconstructing them as 3D models and tracking their poses using markers. Both real and virtual objects coexist in a common coordinate system. The user


Figure 2 ARDnA system setup: stereo camera, web camera, LCD monitor, head-mounted device (HMD and monitor views), real object and virtual models

can use his bare hands to create, grab and manipulate 3D models in the design environment to assemble the virtual components.

The system consists of five modules, namely, the AR tracking module (ARTM), the BHI module (BHIM), the data exchange module (DEM), the CAD module (CADM) and the visualization module (VM) (Figure 3). ARTM performs tracking and registration. BHIM detects and tracks the hands and fingers of the user and calculates their 3D poses for interactions with the virtual models. DEM integrates the design and assembly information in the AR design environment and the CAD software to ensure data consistency. CADM provides basic modelling for design generation and constraint information for assembly evaluation, as well as detailed design modelling. VM renders the virtual models with the real objects. The design generation and assembly evaluation phases span these modules.

Figure 3 ARDnA system architecture: the tracking module (ARToolkit, OpenCV, FlyCapture SDK; stereo and web cameras; AR WCS origin marker and object markers), the bare-hand interaction module (hand detection, recognition, pose estimation and hand strain; collision detection library), the data exchange module, the CAD software module (SolidWorks API; design parameters such as dimensions, positions and orientations; part creation, addition and positioning, combination of parts; detailed design; mating and geometric assembly constraints; assembly creation and mates definition) and the visualization module (3D rendering; display on the desktop monitor and head-mounted device)



3.1 AR tracking module

ARTM tracks the user's hands, the real objects and the AR markers, and registers the virtual models with the real objects in the AR environment. An AR world coordinate system (ARWCS) is established with its origin at the centre of a planar marker using ARToolkit (ARToolkit, 2007). The 3D poses of all objects are referenced from this origin, and their relative poses define the design and assembly parameters. The origin marker can be moved to obtain different viewpoints of the virtual models. Figure 4 shows the framework of the ARTM and the relationships between the objects and the ARWCS.

Optical tracking using a stereo camera is utilised to capture the 3D information. The stereo camera is mounted 80 cm above the design workspace, which has a volume of approximately 50 × 50 × 40 cm. A second camera, placed near the user's eyes on top of the HMD, provides a perspective that is consistent with the user's view. Tracking information from the stereo camera is relayed to the second camera.

During tracking of the virtual models, the poses of the virtual models are represented as transformation matrices in the ARWCS; these poses are modified when the virtual models are in contact with the user's bare hands. The thumbs and index fingers act as control points to manipulate the virtual models. Collisions between the virtual models and the real objects can affect the 3D poses of the virtual models.

Real objects are represented as 3D models and tracked using markers that are affixed onto them. Existing 3D models of real objects are used directly; otherwise, 3D models of these objects can be constructed using commercial CAD software. The models of the real objects are rendered over the real objects but made transparent, so that the real objects and virtual models can be perceived to be interacting. The relative poses between the markers on the real objects and the origin marker are estimated using ARToolkit. The relative pose between a marker on a real object and the object itself is predefined for all the real objects. The poses of the real objects in the ARWCS are derived from these relative poses.
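The pose chaining described above can be illustrated with a short sketch (not taken from the paper): the pose of a real object in the ARWCS is the tracked origin-to-marker transform composed with the predefined marker-to-object offset. The matrix type, helper names and values are illustrative only.

```cpp
#include <array>
#include <cassert>
#include <cstdio>

// 4x4 homogeneous transform, row-major.
using Mat4 = std::array<double, 16>;

// Compose two transforms: result = a * b.
Mat4 compose(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r[4 * i + j] += a[4 * i + k] * b[4 * k + j];
    return r;
}

// Pure translation transform.
Mat4 translation(double x, double y, double z) {
    return {1, 0, 0, x,
            0, 1, 0, y,
            0, 0, 1, z,
            0, 0, 0, 1};
}

int main() {
    // Pose of a marker relative to the ARWCS origin marker (from tracking),
    // and the predefined offset from that marker to the real object it is on.
    Mat4 originToMarker = translation(0.10, 0.05, 0.0);
    Mat4 markerToObject = translation(0.0, 0.0, -0.02);

    // Pose of the real object in the ARWCS.
    Mat4 originToObject = compose(originToMarker, markerToObject);
    std::printf("object at (%.2f, %.2f, %.2f) in ARWCS\n",
                originToObject[3], originToObject[7], originToObject[11]);
    assert(originToObject[11] == -0.02);
    return 0;
}
```

The same composition applies to the virtual models, whose transforms are instead updated from the tracked hand poses.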

3.2 BHI module

The BHIM detects the user's hands and fingertips, recognizes the right and left hands and the thumb and index fingertips, estimates their poses, and utilizes this information to achieve interactions with the virtual models. In addition, hand strain can be calculated.

3.3 Data exchange module

DEM is the interface between BHIM and CADM. A design generated using BHI is conceptual and imprecise, whereas precise parameters are required in the CAD software for engineering design. The design information in BHIM and CADM has to be consistent to achieve design generation, detailed design and assembly evaluation. A product model contains the component models and the assembly model. A component model represents the design and assembly information of a component and contains three types of information, namely, the design parameters, the assembly constraints and the child models. The assembly model contains information on the assembly constraints between the assembled components and the final poses of the components. During assembly evaluation, BHIM modifies the poses of the components, and the assembly constraints in the assembly model are used to constrain the movements of the components. At the end of an assembly evaluation, the parameters (assembly sequence, time taken, orientation changes required, errors and hand strains detected) are recorded in the assembly model. Figure 5 shows the structure of the product model.

A data structure is developed to store and manage the design parameters and assembly constraints. The design parameters of a component consist of the dimensions of the bounding box, the absolute position and orientation with reference to the ARWCS, the CSG primitive type, the Boolean operations between the component and its parent, the child/parent relationships, and the lists of features and faces. During design generation, the dimensions, positions and orientations, the CSG type and the Boolean operations (the default is addition) are determined in the BHIM. As these parameters are derived from the poses, translations and rotations of the hands, they are imprecise. DEM transfers the design parameters to CADM to create the models via the CAD API. The design parameters are corrected manually in the CADM, and DEM updates the component model to ensure data consistency. The list of features allows the selection and modification of design features, and the list of faces is used for rendering and collision detection. The first component of a product is the base component, and it is the parent of all the other components. The child models are the components that are assembled to the base component and are stored as features of the parent model.

The assembly constraints are created after detailed design is completed in the CADM. They are derived from the CAD assembly model and consist of the type of constraints and the affected faces or axes of the assembled models. The final position and orientation of a model are stored. In addition, the faces of the models are updated with the geometric constraints depending on their surface types.
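As an illustration only (the paper does not reproduce its data structure), the product model of Section 3.3 might be organised as follows; all type and field names are hypothetical.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the ARDnA product model shared between BHIM and CADM.
enum class CsgType { Block, Wedge, Cylinder, Cone, Sphere, Hemisphere, Torus };
enum class BoolOp { Addition, Subtraction };

struct DesignParameters {
    double dims[3];         // bounding-box dimensions
    double position[3];     // absolute position in the ARWCS
    double orientation[3];  // orientation in the ARWCS
    CsgType primitive = CsgType::Block;
    BoolOp opWithParent = BoolOp::Addition;  // Boolean operation with the parent
    std::vector<std::string> features;       // for selection and modification
    std::vector<int> faces;                  // for rendering and collision detection
};

struct AssemblyConstraint {
    enum class Type { PlanarCoincidence, Concentric } type;
    int faceOrAxisA = 0, faceOrAxisB = 0;  // affected faces or axes of the mated models
};

struct ComponentModel {
    DesignParameters design;
    std::vector<AssemblyConstraint> constraints;
    std::vector<ComponentModel> children;  // components assembled onto this one
};

struct AssemblyRecord {  // one evaluated assembly sequence
    std::vector<int> sequence;
    double timeTakenSec = 0.0;
    int orientationChanges = 0, errors = 0, strainEvents = 0;
};

struct ProductModel {
    ComponentModel base;                  // the base (root) component
    std::vector<AssemblyRecord> records;  // one record per evaluated sequence
};

int main() {
    ProductModel p;
    p.base.design.primitive = CsgType::Block;
    ComponentModel wheel;
    wheel.design.primitive = CsgType::Cylinder;
    p.base.children.push_back(wheel);  // child stored as a feature of the parent
    return p.base.children.size() == 1 ? 0 : 1;
}
```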
Figure 4 Framework of the ARTM: stereo and web cameras, ARWCS origin marker, real object with object marker, virtual model and tracked hand

3.4 CAD software module

CADM provides modelling support for design generation, constraint information for assembly evaluation, and detailed design of the virtual models. The informative CAD model is represented as a hierarchical part-feature-face-edge-vertex tree and contains additional information, e.g. the aesthetics and materials.

During design generation, CADM performs three tasks, namely, part creation, the addition and positioning of a part onto an existing part, and the combination of parts. A new part is created when the dimensions of a primitive have been defined. Modelling operations are carried out automatically according to the primitive type. For example, a 2D sketch of a rectangle






Figure 5 Structure of the product model in DEM: the base model and component models (Models 1-3) each hold design parameters; the mating constraints link the base model to each component model; the assembly model records, for each sequence, the mating constraints, the final positions and orientations, the time taken and the Hand Strain Index

(x1, y1) is generated, followed by an extrusion of depth z1 to create a block with dimensions (x1, y1, z1). A new part is added to an existing part when the user has added a component to another component. The new part is positioned on the existing part according to their relative poses. Based on the user-defined Boolean operation, the added part becomes a feature of the existing part. The surface information of the model in ARDnA is then updated from CADM. This process continues until the design is completed. Figure 6 shows the automatic design creation workflow in CADM.

Next, an assembly model of the completed design is generated in CADM. A root part is identified and the other parts are assembled onto it. The root part is the first part that is created. The assembly model is formed by modifying the product model and changing the Boolean operations from addition to subtraction. Modification of a part during detailed design will modify the corresponding assembly features on the root part to ensure part fitness. Design features can be added to the parts. When the user is satisfied with the detailed design of all the parts, the assembly model is created by defining the assembly constraints.
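The part-creation step above (a 2D sketch followed by an extrusion) can be sketched as follows. The wrapper below only mimics the sequence of operations; it does not reproduce the actual SolidWorks API calls, and the function and dimension names are illustrative.

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical stand-in for the CADM block-creation step: a block primitive
// maps to a rectangular 2D sketch (x1 * y1) followed by an extrusion of depth z1.
struct Block { double x, y, z; };

std::string createBlock(const Block& b) {
    // 1) generate the 2D sketch, 2) extrude it to form the solid
    char buf[96];
    std::snprintf(buf, sizeof(buf),
                  "sketch rect %.1fx%.1f; extrude %.1f", b.x, b.y, b.z);
    return buf;
}

int main() {
    // Hand-derived dimensions are imprecise and would be corrected manually in CADM.
    Block b{40.0, 20.0, 10.0};
    std::printf("%s\n", createBlock(b).c_str());
    assert(createBlock(b) == "sketch rect 40.0x20.0; extrude 10.0");
    return 0;
}
```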

3.5 Visualization module

VM renders the virtual models using the OpenGL library and registers them on the markers with reference to the ARWCS. An LCD monitor is used to display the virtual objects in the desktop AR environment. An HMD can be used if a more coupled modelling and visualization perspective is desired.

4. BHI and hand strain determination

4.1 Detection of hands and fingertips

The user's hands are detected using the continuously adaptive mean-shift (CamShift) algorithm (OpenCV, 2012). A region of the hand is segmented in the hue-saturation-value colour space and stored in a histogram. Image pixels of the subsequent frames are converted into a probability model, which the CamShift algorithm uses to estimate and track the hand region. The hand contour is extracted and the hand centre is found using a distance transformation. The fingertips are the contact points for interaction and are detected using a curvature-based algorithm. The curvature of each point of the hand contour is computed, and the points with curvature above a threshold are selected as fingertip candidates. The fingertips are the five candidate points with the longest distances from the centre of the hand.

The left and right hands and the thumb and index fingertips can be recognised and differentiated automatically. For an open-hand position with the palm facing down, the thumb is the fingertip furthest from the mean position of the five fingertips. If the thumb is to the right of the hand centre, the hand is recognized as the left hand, and vice versa. The index finger is the fingertip that is closest to the thumb. In BHIM, the thumbs and the index fingers are used to achieve direct manipulation via a pinching motion. During a pinching motion, the thumb and index fingertips are recognized from their relative positions to the hand centre.

The 3D poses of the fingertips are estimated after the fingertips have been recognized. The depth information of the fingertips is obtained using the disparity information from the stereo camera and projecting the 3D information from the camera coordinate system to the ARWCS.

When the user is manipulating a virtual model, the transformation matrix of the virtual model is updated according to changes in the rotation and translation of the hand, which is based on the midpoint of the thumb and the index finger, T_hand. The rotation matrix of the hand is calculated with reference to a coordinate system, which is defined at the midpoint of the thumb and the index finger,

using two unit vectors, namely, V̂_th→if between the thumb and the index finger, and V̂_hc→mp between the midpoint of the first vector and the centre of the hand. The x-axis is V̂_th→if, the z-axis is the cross product of V̂_th→if and V̂_hc→mp, and the y-axis is the cross product of the z- and x-axes (Figure 4(a)). At the first contact between the hand and a virtual model, the coordinate system, CS_fc, is recorded, and the displacement of the hand from the centroid of the virtual model is recorded as T_PMP→VMC. The hand rotation R_CSfc→CSnew is the rotation from CS_fc to the new coordinate system, CS_new, at the new hand position, and it is calculated using equation (1). The resultant transformation matrix of the virtual model, TM_vm,t+1, is expressed in equation (2), where X̂_fc, Ŷ_fc, Ẑ_fc, X̂_new, Ŷ_new and Ẑ_new are the unit vectors of the x-y-z axes of CS_fc and CS_new, respectively:

R_CSfc→CSnew = [X̂_new Ŷ_new Ẑ_new][X̂_fc Ŷ_fc Ẑ_fc]^-1   (1)

TM_vm,t+1 = T_hand T_VMC→PMP R_CSfc→CSnew T_PMP→VMC TM_vm,t   (2)

Figure 6 Automatic design creation in the CAD software module: (a) 2D sketch generation; (b) extrusion of the 2D sketch; (c) addition of a cylinder and placement of the part; (d) combined parts
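Assuming orthonormal hand axes (so the inverse in equation (1) is a transpose), the hand rotation can be sketched as below; the basis values, names and the 3x3 representation are illustrative.

```cpp
#include <array>
#include <cassert>

using Vec3 = std::array<double, 3>;
// A hand coordinate system as three orthonormal basis (column) vectors X, Y, Z.
using Basis = std::array<Vec3, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

// Equation (1): R_fc->new = [X_new Y_new Z_new][X_fc Y_fc Z_fc]^-1.
// For orthonormal bases the matrix inverse is the transpose.
Mat3 rotationBetween(const Basis& fc, const Basis& nw) {
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int c = 0; c < 3; ++c)
                r[i][j] += nw[c][i] * fc[c][j];
    return r;
}

Vec3 apply(const Mat3& r, const Vec3& v) {
    Vec3 out{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            out[i] += r[i][j] * v[j];
    return out;
}

int main() {
    Basis fc{{{1, 0, 0}, {0, 1, 0}, {0, 0, 1}}};   // axes at first contact
    Basis nw{{{0, 1, 0}, {-1, 0, 0}, {0, 0, 1}}};  // hand rotated 90 deg about z
    Mat3 r = rotationBetween(fc, nw);
    Vec3 v = apply(r, {1, 0, 0});
    assert(v[0] == 0 && v[1] == 1);  // X_fc is carried onto X_new
    // In equation (2), this rotation, the hand pose T_hand and the recorded
    // hand-to-centroid offsets are composed to update the model's matrix.
    return 0;
}
```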

4.2 Determination of hand strains

Hand strain is defined as the discomfort a user experiences at certain hand postures. Two types of hand strain can be captured. The first type is when the pinch width exceeds 110 mm, whereby the user can exert only 60 per cent of the pinch strength (Imrhan and Rahman, 1995). The pinch width is defined as the distance between the thumb and the index fingertip. The second type is when the deviation of the wrist angle θ has reached a discomfort range (Khan et al., 2010), as shown in Table I. Figure 7 shows the various hand strain postures that can be recorded. Hand strain is detected only when the user is manipulating virtual models. A strain event is recorded when the hand experiences discomfort for more than 1 s, so as to differentiate a strain from a reflex movement, and it contains information on the maximum deviation, the dwell time and the hand in strain. A strain event is terminated when the deviations fall below the defined thresholds. Different hand strains can be
detected independently. A posture can be detected to experience three hand strain events concurrently, e.g. a wide pinch strain, a flexural strain and a pronation strain. Studies have demonstrated the effects of combined strains (Khan et al., 2010). However, it is difficult to obtain a formula to calculate the total strain. Therefore, hand strains are treated as independent strain events.

Table I Discomfort range for different wrist angles

Deviation type      Discomfort range
Flexion             > 45% of ROM
Extension           > 45% of ROM
Ulnar deviation     > 45% of ROM
Radial deviation    > 45% of ROM
Pronation           > 45% of ROM
Supination          > 45% of ROM

Note: ROM = range of motion
Source: Khan et al. (2010)

Figure 7 Hand strain postures detected and recorded using ARDnA: (a) neutral posture with reference coordinate system; (b) wide pinch strain; (c) flexural and extension strains; (d) ulnar deviation and radial deviation strains; (e) pronation and supination strains
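The wide-pinch rule of Section 4.2 (a 110 mm threshold held for more than 1 s) can be sketched as an event detector over sampled pinch widths; the sampling rate, struct fields and sample values are illustrative.

```cpp
#include <cassert>
#include <cstdio>
#include <vector>

// Sketch of wide-pinch strain-event detection: an event is recorded only when
// the pinch width stays above 110 mm for more than 1 s, so that a brief reflex
// movement is not counted as a strain.
struct StrainEvent { double start, end, maxDeviation; };

std::vector<StrainEvent> detectWidePinch(const std::vector<double>& widthMm,
                                         double dtSec) {
    const double kThreshold = 110.0, kMinDwell = 1.0;
    std::vector<StrainEvent> events;
    double start = -1.0, maxDev = 0.0;
    for (size_t i = 0; i <= widthMm.size(); ++i) {
        bool over = i < widthMm.size() && widthMm[i] > kThreshold;
        if (over) {
            if (start < 0) { start = i * dtSec; maxDev = 0.0; }
            if (widthMm[i] - kThreshold > maxDev) maxDev = widthMm[i] - kThreshold;
        } else if (start >= 0) {          // deviation fell below the threshold
            double end = i * dtSec;
            if (end - start > kMinDwell)  // longer than a reflex movement
                events.push_back({start, end, maxDev});
            start = -1.0;
        }
    }
    return events;
}

int main() {
    // 30 Hz samples: a 0.5 s spike (ignored), then a ~2.3 s wide pinch (recorded).
    std::vector<double> w(120, 90.0);
    for (int i = 10; i < 25; ++i) w[i] = 130.0;   // reflex, below 1 s dwell
    for (int i = 40; i < 110; ++i) w[i] = 125.0;  // strain event
    auto ev = detectWidePinch(w, 1.0 / 30.0);
    assert(ev.size() == 1);
    std::printf("strain events: %zu, max deviation %.0f mm\n",
                ev.size(), ev[0].maxDeviation);
    return 0;
}
```

The same dwell-time logic applies to the wrist-angle strains, with the 45 per cent of ROM limits of Table I as the thresholds.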

4.3 Strain from deviation of wrist angle

The wrist angle deviation θ is determined as the rotation from the coordinate system of the neutral posture of the hand, CS_np (Figure 7(a)), which is the posture in which the bones of the fingers and the forearm are roughly parallel (Khan et al., 2010), to a new posture, CS_new. The flexural/extension (F/E) angle, the radial/ulnar deviation (R/U) angle and the pronation/supination (P/S) angle are calculated from the rotations about Ẑ_new of CS_new from X̂_np to X̂_new, about X̂_new of CS_new from Ẑ_np to Ẑ_new, and about Ŷ_new of CS_new from X̂_np to X̂_new, respectively. The rotation from CS_np to CS_new, R_CSnp→CSnew, is a combination of the rotation from N̂_np to N̂_new, R_Nnp→Nnew, and the rotation R_N,θ about N̂_new by θ, as indicated in equation (3), where N represents the corresponding axis used to find θ. R_CSnp→CSnew is derived using equation (1), and θ is derived from R_N,θ:

R_N,θ = R_CSnp→CSnew (R_Nnp→Nnew)^-1   (3)

The P/S angle is θ, as pronation and supination occur only at the wrist. For the F/E and R/U angles, θ includes the rotations of the forearm about the elbow joint, which must be eliminated. For the F/E angle, the angle between the two vectors V̂_th→if and V̂_hc→mp is constant when there is only forearm rotation. The new X̂ without the forearm rotation can be obtained by solving three simultaneous equations, as shown in equation (4), based on three constraints that X̂ must satisfy, namely, the angle between X̂ and V̂_new,hc→mp must be equal to that between X̂_np and V̂_np,hc→mp, X̂ must lie in the X̂_new-Ŷ_new plane, and Ẑ must be parallel to Ẑ_new. The F/E angle is then the rotation angle from X̂ to X̂_new about Ẑ_new:

X̂ · V̂_new,hc→mp = X̂_np · V̂_np,hc→mp
X̂ · Ẑ_new = 0   (4)
X̂ · X̂ = 1

For the R/U angle, the ulnar deviation of the forearm is insignificant, assuming that the elbow of the user is placed on the table top of the assembly workspace. The radial deviation angle of the forearm, φ, can be calculated from the arcsine of the average human forearm length of 26.5 cm (Chaffin et al., 2006) over the z-coordinate of V̂_new,hc→mp. It is subtracted from θ to obtain the radial deviation wrist angle. It is possible that the position of the elbow will change with a rotation of the shoulder. However, rotation of the shoulder cannot be captured by the system, and thus there will be an error in the current method, which can be resolved by using more tracking devices.

4.4 Calculation of Hand Strain Index

A Hand Strain Index (HSI) is derived from the Strain Index (Moore and Vos, 2004), and it uses three variables, namely, the hand/wrist posture, the duration of exertion and the efforts/min, to evaluate the hand strains of an assembly step. An assembly step is defined as a single assembly of a component to another component. Table II shows the ratings and multiplier values of the variables used to derive the HSI. The posture rating ranges from fair to very poor, as only undesirable hand postures are considered. For each strain event, the percentage strain %S_i is calculated from the maximum deviation recorded and the deviation threshold. The mean %S_i of the different strain events in an assembly step is calculated to obtain the posture rating and multiplier value from the hand/wrist posture column in Table II.

The duration of exertion is the percentage of the total duration of all the strain events over the total duration of the assembly step:

Duration of exertion = (total duration of strain events / total time taken for assembly) × 100

The efforts/min is the number of strain events detected per minute. The HSI is the product of the multiplier values of the three variables:

HSI = MV_HandPosture × MV_DurationOfExertion × MV_Efforts/min

A Strain Index of 5.0 is considered to be associated with hazardous work (Moore and Vos, 2004). In ARDnA, the aim is to detect and reduce an HSI that exceeds 5.0 during manual assembly.

4.5 Bare-hand interactions

Two types of interactions are supported, namely, direct manipulation and gestures. Direct manipulation allows the users to grab and interact with the virtual models in the same manner as they would with real objects. Virtual spheres are augmented on the thumb and index finger of each hand, and collision detection is performed between the spheres and the virtual models. The colour of the spheres changes to provide visual feedback when the user's hands are in contact with the virtual objects. When the thumb and index finger of one hand are in contact with a virtual model, the hand is deemed to have grabbed this virtual object. Gesture inputs are used to trigger commands, and two types of gestures are supported, namely, the pinch gesture and the point gesture. A pinch gesture is recognised when the distance between the thumb and the index finger is below a certain threshold, and it is used to confirm an action and to select objects. The point gesture is achieved with the index finger, and it is used as a cursor to interact with the virtual panel GUI. Table III summarises the interactions in ARDnA.
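The HSI computation of Section 4.4 can be sketched from the multiplier values in Table II. The duration and efforts bands below follow Table II; the band limits used to map the mean percentage strain onto the posture ratings are an assumption, as the paper does not list them.

```cpp
#include <cassert>
#include <cstdio>

// Sketch of the HSI: each variable maps to a multiplier value (Table II) and
// the HSI is their product, following the Strain Index (Moore and Vos, 2004).
double postureMultiplier(double meanPercentStrain) {
    // Assumed banding of the mean %S onto the fair/poor/very poor ratings.
    if (meanPercentStrain < 25.0) return 1.5;  // fair
    if (meanPercentStrain < 50.0) return 2.0;  // poor
    return 3.0;                                // very poor
}

double durationMultiplier(double percentOfStep) {  // duration of exertion (%)
    if (percentOfStep < 10.0) return 0.5;
    if (percentOfStep < 30.0) return 1.0;
    if (percentOfStep < 50.0) return 1.5;
    if (percentOfStep < 80.0) return 2.0;
    return 3.0;
}

double effortsMultiplier(double effortsPerMin) {  // strain events per minute
    if (effortsPerMin < 4.0) return 0.5;
    if (effortsPerMin <= 8.0) return 1.0;
    if (effortsPerMin <= 14.0) return 1.5;
    if (effortsPerMin <= 19.0) return 2.0;
    return 3.0;
}

int main() {
    // HSI = MV_HandPosture * MV_DurationOfExertion * MV_Efforts/min
    double hsi = postureMultiplier(40.0) * durationMultiplier(55.0)
               * effortsMultiplier(10.0);            // 2.0 * 2.0 * 1.5
    std::printf("HSI = %.1f\n", hsi);
    assert(hsi > 5.0);  // above 5.0: flagged for reduction in ARDnA
    return 0;
}
```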

Table II Rating and multiplier values for HSI used in ARDnA

Hand/wrist posture    Duration of exertion (%)    Efforts/min
Very good (1.0)       <10 (0.5)                   <4 (0.5)
Good (1.0)            10-29 (1.0)                 4-8 (1.0)
Fair (1.5)            30-49 (1.5)                 9-14 (1.5)
Poor (2.0)            50-79 (2.0)                 15-19 (2.0)
Very poor (3.0)       ≥80 (3.0)                   ≥20 (3.0)

Note: Multiplier values in parentheses

Table III BHIs supported in ARDnA

Hand interaction       Tracked features                                      ARDnA operations
Direct manipulation    Fingers and virtual models                            To gain control of virtual models for transformation operations
                       Hand movements                                        To move virtual models in the design space
                       Finger movements                                      Movement of the fingers to determine the size of CSG primitives
                       Two quaternions of the fingers with respect           To rotate the virtual models
                       to the centre of the hand
Pinch gesture          Measured using a threshold for the distance           (i) Command input to confirm actions;
                       between the index finger and thumb                    (ii) point selection of virtual models
Point gesture          3D position of the index finger                       To act as a cursor and select options on the GUI

5. Design generation

Using BHI to create 3D virtual models allows a user to think and experiment with his hands. Components can be created

6. Assembly evaluation

Components can be created from seven CSG primitives (block,
wedge, cylinder, cone, sphere, hemisphere and torus),
manipulated and combined to generate new designs. Direct manipulation of
the virtual models is more intuitive as compared to using a
mouse. A primitive is created by tracking the 3D poses of the
fingers of both hands to define its dimensions and using
the pinch gesture to confirm the creation. The primitives
created can be manipulated with bare hands to define their
3D poses. They can be combined based on the Boolean
operation selected by the user via the GUI after they have
been placed in a desired configuration to form a product.
The design parameters are sent to CADM via DEM for
the required CAD operations. Figure 8 shows the design
generation process.
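The primitive-creation step above can be sketched as follows. This is a minimal illustration under the assumption that each hand contributes one tracked fingertip position and that a block primitive is parameterised as an axis-aligned box; the function name and parameterisation are illustrative, not the ARDnA implementation:

```python
import numpy as np

def block_from_fingertips(left_tip, right_tip):
    """Derive block-primitive dimensions and centre from two tracked
    fingertip positions (one per hand), treated as opposite corners
    of an axis-aligned box. Returns (centre, (dx, dy, dz))."""
    a = np.asarray(left_tip, dtype=float)
    b = np.asarray(right_tip, dtype=float)
    centre = (a + b) / 2.0            # midpoint between the fingertips
    dims = np.abs(b - a)              # edge lengths along x, y, z
    return centre, tuple(dims)

# Once the pinch gesture confirms creation, these parameters would be
# sent to the CAD module (CADM) via DEM to instantiate the primitive.
centre, dims = block_from_fingertips([0, 0, 0], [100, 40, 20])
```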
Basic editing functions are supported in ARDnA. The user
can pinch to select a component and choose the editing
function. The size, position and type of a component can be
modified. Components can be deleted, and the entire design
can be scaled to a desired size. An undo function is
provided. For more sophisticated editing, the CAD software
can be used directly.

ARDnA advocates a hands-on approach for assembly

evaluation, whereby the user will manually assemble a
product to evaluate the sequence and ergonomics. Assembly
evaluation begins with a root component and the next
component to be assembled. The default assembly sequence
is the order in which the components are created during design
generation. The user can grab the components and assemble
them aided by the geometrical constraints (planar coincidence
and concentric) of the component surfaces in contact
(Ong and Wang, 2011). As the user cannot manipulate the
components precisely, thresholds of 5 mm and 10°, based on the
human proprioceptive position sense (van Beers et al., 1998), are
used to detect planar coincidence and concentric constraints,
respectively. When a component has been assembled, the
system will check for correctness before the next component will
be assembled. If there is an error, the error will be recorded and
the user will be notified to redo the assembly. This will continue
until the sequence is completed. The assembly sequence can be
modified by selecting a different component.
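The threshold-based constraint detection described above can be sketched as follows, assuming each face is given by a point and a normal and each cylinder by an axis direction; names and signatures are illustrative, not from ARDnA, and the spatial-proximity check for concentric axes is omitted for brevity:

```python
import numpy as np

POS_TOL_MM = 5.0     # planar-coincidence threshold (van Beers et al., 1998)
ANG_TOL_DEG = 10.0   # concentricity threshold on axis alignment

def planar_coincident(point_on_a, normal_a, point_on_b) -> bool:
    """True if a point on component B lies within 5 mm of the plane of A."""
    n = np.asarray(normal_a, dtype=float)
    n = n / np.linalg.norm(n)
    gap = abs(np.dot(np.asarray(point_on_b, dtype=float)
                     - np.asarray(point_on_a, dtype=float), n))
    return gap <= POS_TOL_MM

def concentric(axis_a, axis_b) -> bool:
    """True if two cylinder axes are aligned within 10 degrees
    (anti-parallel axes count as aligned)."""
    a = np.asarray(axis_a, dtype=float); a = a / np.linalg.norm(a)
    b = np.asarray(axis_b, dtype=float); b = b / np.linalg.norm(b)
    angle = np.degrees(np.arccos(np.clip(abs(np.dot(a, b)), 0.0, 1.0)))
    return angle <= ANG_TOL_DEG
```

When both checks fire for a pair of mating surfaces, the component would snap into the constrained pose, compensating for the imprecision of bare-hand placement.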


Figure 8 Flowchart of the design generation phase in ARDnA

Assembly evaluation involves collecting data on the assembly
sequence: the assembly time taken, the number of orientation
changes, the assembly errors and the hand strains for each
assembly step and for the sequence as a whole. Data captured
across different assembly sequences is used to compare them.

An assembly error record consists of the error type
(incorrect position (Figure 9(b)(i)) or incorrect orientation
(Figure 9(b)(ii))), the components involved and the assembly
step. Feedback on the error is provided to the user, who
then redoes the assembly.

6.1 Time taken for assembly

The time for an assembly step starts when the components are
rendered and the user grabs one of them, and ends when they
are assembled, i.e. fully constrained. When a component is
assembled wrongly, the user has to redo the assembly and the
timer restarts. The times of both the correct and incorrect
assembly attempts are recorded, but only the correct-assembly
time is used for comparison between different assembly
sequences. The time taken for an assembly sequence is the
aggregation of the times of its assembly steps.
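The timing rule above can be sketched as a small helper; a hypothetical illustration in which the class and method names are mine, not from the ARDnA implementation:

```python
import time

class AssemblyStepTimer:
    """Times one assembly step; the timer restarts on an assembly error,
    keeping the wasted time separate from the final (correct) duration."""
    def __init__(self):
        self.error_time = 0.0      # accumulated time of attempts that ended in error
        self.correct_time = None   # duration of the final, correct attempt
        self._start = None

    def start(self):               # components rendered and one of them grabbed
        self._start = time.monotonic()

    def error(self):               # wrong pose detected: record wasted time, restart
        now = time.monotonic()
        self.error_time += now - self._start
        self._start = now

    def done(self):                # component fully constrained
        self.correct_time = time.monotonic() - self._start

def sequence_time(steps):
    """Aggregate only the correct-assembly times, as used when
    comparing different assembly sequences."""
    return sum(s.correct_time for s in steps)
```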

6.4 Hand strains

A hand strain event is recorded when a hand strain is
detected. Some occurrences of hand strains are shown in
Figure 9(c). The types of hand strain, maximum deviation,
hand that is in strain, strain duration, component(s)
involved and the assembly step will be recorded for each
hand strain event. For each assembly step, the HSI is
calculated, and the aggregated HSIs for the entire assembly
can be used to assess the ergonomics of different assembly
sequences.

The time taken reflects the ease of assembly and handling of
the components. The measured time is longer than that of
actual manual assembly, but it is representative of the
differences between assembly sequences. Orientation change
in ARDnA is more stringent than the orientation changes in
DFA analysis (Boothroyd et al., 2001), but they serve the
same purpose in detecting assembly inefficiency. By detecting
orientation change using more stringent criteria, more
orientation changes can be detected and removed through
sequence or design modification. Assembly errors can be
attributed to a lack of assembly guides, similar components,
etc. These assembly errors affect the assembly efficiency and
they can be eliminated by providing guides or modifying the
α- and β-symmetry of the components. The time taken,
orientation changes and errors are generated automatically
during assembly evaluation. The hand strain detection in
ARDnA considers assembly ergonomics, which cannot be
captured using DFA analysis. The HSI provides an overview
of the strain sustained during assembly.

6.2 Orientation changes

Orientation change is detected when the orientation of a
component about any of the x-, y- or z-axes exceeds 90°
(Figure 9(a)(i)). Multiple orientation changes are detected
when the component rotates more than 90° about two axes.
An orientation change will not be detected if the sum of the
rotations about the three axes exceeds 90° but none of them is
larger than 90°, as shown in Figure 9(a)(ii). The orientation
change is calculated based on the initial orientations of the
components at the start of each assembly step. The
orientation change recorded consists of the rotation angle,
the rotation axis, the components involved and the assembly step.
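The per-axis detection rule can be illustrated as follows, assuming the component's rotation relative to its pose at the start of the step has been decomposed into angles about x, y and z; a sketch, not the ARDnA implementation:

```python
def orientation_changes(rot_xyz_deg):
    """Count orientation changes for one assembly step, given the
    component's rotations (degrees) about x, y and z relative to its
    pose at the start of the step. A change is counted per axis whose
    rotation exceeds 90 degrees; the sum across axes is deliberately
    ignored, per the detection rule of Section 6.2."""
    return sum(1 for angle in rot_xyz_deg if abs(angle) > 90.0)

assert orientation_changes((120, 0, 0)) == 1    # Figure 9(a)(i)
assert orientation_changes((40, 30, 35)) == 0   # sum > 90 but no single axis
assert orientation_changes((100, -95, 0)) == 2  # multiple orientation changes
```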
6.3 Assembly errors
An assembly error is detected when the assembled component
is positioned or oriented wrongly. The poses are calculated
and verified against the correct poses stored in the assembly model.


Figure 9 Assembly evaluation in ARDnA: (a) orientation changes detection: (i) orientation change detected when rotation about the Z-axis exceeds 90°; (ii) orientation change not detected. (b) Assembly errors detection: (i) component assembled in wrong position; (ii) component assembled in wrong orientation. (c) Hand strains detection: (i) extension strain detected; (ii) wide pinch strain detected



The individual strain events can be analysed to understand the
causes and to determine solutions that eliminate them, e.g.
improving the component access or changing the sequence.

An initial assembly sequence of body-nose-wheel1-wheel2-wheel3-wheel4-motor-spoiler
is generated. Figure 10 shows the design process of the toy car.
Figure 11 shows that the car design has misaligned
components. This is due to the inability of the human hands
to place the objects precisely. This is corrected by modifying
the component placements, followed by the detailed design in
CADM (Figure 11). For the body, slots are added to insert the
spoiler and shafts for connecting the wheels. For the nose,
a base is added. For the wheel, a rim is added and a hole is
created to fit the shaft. The motor is not redesigned as it is
predefined. Legs are added to the spoiler so that it can be
slotted into the body. The assembly model of the car is
generated with the defined mating constraints.
The assembly evaluation phase commences when the body
and nose are rendered and the nose is fitted with the body.
When three surfaces of the nose are constrained with those of
the body, the system will check the pose of the nose. If it is
correct, the next component can be assembled. The wheels,
motor and spoiler are assembled sequentially (Figure 12).

7. Case study
A case study on the design and assembly of an electric toy car
was conducted to demonstrate the ARDnA methodology and
system. The design requirements are that the car must be of a
similar size to an existing toy car, that a predefined electric
motor must be used, and that it must be easy to assemble.
The designer starts by creating a block primitive as the body
using an existing toy car as a spatial reference. The front nose
is created using a wedge and the wheel using a cylinder.
As more than one wheel is required, the wheel is duplicated
and placed at different locations. The motor is added using a
real, physical motor, which is tracked in the ARWCS by
affixing a marker onto it (Figure 10(c)). The designer
manipulates the motor and adds it to the virtual model of
the car, using both real and virtual objects. Finally, a spoiler
(a wedge) is created and placed on top of the motor.
Figure 10 Design generation of a toy car in ARDnA: (a) creating a block for the chassis and a wedge for the nose; (b) loading and adding the wheel to the toy car; (c) adding the real motor to the toy car; (d) adding the spoiler to the toy car; (e) the completed toy car



Figure 11 Detailed design of the toy car: (a) correction of the alignment (original car design, car design with correct alignment, final car design); (b) detailed design of the individual components (car body, nose, spoiler and wheel)

When the assembly is completed, the evaluation results are generated.

During the assembly of the spoiler, a hand strain event
is recorded with the following strain parameters: maximum
deviation of 125 mm, strain duration of 7 s and assembly
time of 24 s. The %S is ((125 - 110)/110) × 100% ≈ 14%,
which maps to a poor posture and a multiplier value of 2.0.
The duration of exertion is (7/24) × 100% ≈ 30%, which has
a rating of 3 and a multiplier value of 1.5. The efforts per
minute is (1/24) × 60 = 2.5, which has a rating of 1 and a
multiplier value of 0.5. The HSI is calculated to be
2.0 × 1.5 × 0.5 = 1.5. The strain is caused by the rear
wheels blocking access to the body. Therefore, the
sequence is changed to body-nose-motor-spoiler-wheels.
Table IV shows the comparative results of the two
assembly sequences.
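The arithmetic of this worked example can be checked directly. Note two assumptions: the 110 mm wide-pinch threshold is implied by the %S formula rather than stated explicitly, and the reported 14% and 30% are rounded values:

```python
# Hand-strain parameters recorded for the spoiler assembly step
max_deviation_mm, threshold_mm = 125.0, 110.0   # 110 mm: implied wide-pinch threshold
strain_s, step_s = 7.0, 24.0                    # strain duration and assembly time

pct_s = (max_deviation_mm - threshold_mm) / threshold_mm * 100  # 13.6..., reported as 14%
duty = strain_s / step_s * 100                                  # 29.2..., reported as 30%
efforts_per_min = 1 / step_s * 60                               # one exertion in 24 s -> 2.5/min

# Multipliers from Table II: poor posture (2.0),
# 30-49% duration of exertion (1.5), <4 efforts/min (0.5)
hsi = 2.0 * 1.5 * 0.5   # -> 1.5
```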

Assembly considerations can be made in the design generation phase. The components can

be created with assembly features that are derived from
the interfacing components, and design and assembly
modifications on the corresponding assembly features will
be updated automatically to ensure assembly fit. This
reduces the need to modify the design and assembly models
individually. Assembly evaluation is performed where users
can manipulate virtual components via BHI, and design
changes can be made in real time, thus integrating product
design with assembly design.
The current ARDnA system is suited for design and
assembly on a workbench which limits the type and the size of
products. Future implementation will attempt to increase the
design space by adding more tracking cameras. In addition,
a component cannot be so small that the hands are unable
to grab it. A possible improvement will be to
introduce tools to manipulate such components. Other
common assembly tools, such as screwdrivers and wrenches,
can be introduced for assembly evaluation. The marker-based
tracking method used in ARDnA is affected by occlusions and
lighting changes. An improvement will be to implement
marker-less methods to increase the tracking robustness.
In addition, there is no haptic feedback from the virtual
models, which reduces the realism of the interaction. Haptic
devices may be implemented but this will increase the cost
and encumber the user. The trade-off between these two
aspects will have to be balanced for future implementation.

8. Discussions and conclusion

In this paper, an integrated design and assembly planning
methodology has been proposed and the ARDnA system
presented. The proposed BHI methodology supports intuitive
design and assembly through a combination of direct
manipulation and simple gestures. Direct manipulation allows
the user to grab and manipulate the virtual models. A case study
has been conducted to demonstrate the proposed methodology
and system.
The integration of design generation and assembly
planning and evaluation in ARDnA supports both activities
in a single environment.


Figure 12 Assembly evaluation of the toy car: (a) coincident plane constraints for assembling the nose; (b) concentric constraint for assembling the wheel; (c) assembly issue when assembling the spoiler (detected wide pinch hand strain and interference from the rear wheels); (d) alternative sequence of assembling the spoiler before the rear wheels; (e) assembled toy car

Table IV Comparative results of the two assembly sequences used in the case study

                             Initial assembly sequence                  Amended assembly sequence
Total time taken (s)         ...                                        ...
No. of orientation changes   1; at body-wheel3: body (180°)             1; at body-wheel3: body (180°)
No. of errors                ...                                        ...
HSI                          1.5; at body-spoiler (wide pinch strain    ...
                             of 14%S, duration 7 s)

Andersen, M., Andersen, R., Larsen, C., Moeslund, T. and
Madsen, O. (2009), Interactive assembly guide using
augmented reality, Proceedings of the 5th International
Symposium on Advances in Visual Computing, Springer,
Berlin, pp. 999-1008.
ARToolkit (2007), available at: artoolkit/ (accessed
12 February 2012).
Boothroyd, G., Dewhurst, P. and Knight, W.A. (2001),
Product Design for Manufacture and Assembly, 2nd ed., CRC
Press, Boca Raton, FL.
Chaffin, D.B., Andersson, G.B.J. and Martin, B.J. (2006),
Occupational Biomechanics, Wiley, Hoboken, NJ.


Chimienti, V., Iliano, S., Dassisti, M., Dini, G. and Failli, F.

(2010), Guidelines for implementing augmented reality
procedures in assisting assembly operations, Precision
Assembly Technologies and Systems, pp. 174-179.
Chryssolouris, G., Mavrikios, D., Fragos, D. and Karabatsou, V.
(2000), A virtual reality-based experimentation
environment for the verification of human-related factors in
assembly processes, Robotics & Computer-Integrated
Manufacturing, Vol. 16 No. 4, pp. 267-276.
Coutee, A.S., McDermott, S.D. and Bras, B. (2001),
A haptic assembly and disassembly simulation
environment and associated computational load
optimization techniques, Journal of Computing and
Information Science in Engineering, Vol. 1 No. 2, pp. 113-122.
Gupta, R., Whitney, D. and Zeltzer, D. (1997), Prototyping
and design for assembly analysis using multimodal virtual
environments, Computer-Aided Design, Vol. 29 No. 8,
pp. 585-597.
Imrhan, S.N. and Rahman, R. (1995), The effects of pinch
width on pinch strengths of adult males using realistic
pinch-handle coupling, International Journal of Industrial
Ergonomics, Vol. 16 No. 2, pp. 123-134.
Jayaram, S., Jayaram, U., Lyons, K. and Hart, P. (1999),
A virtual assembly design environment, Proceedings of
IEEE Virtual Reality, pp. 172-179.
Jayaram, S., Jayaram, U., Kim, Y.J., DeChenne, C.,
Lyons, K.W., Palmer, C. and Mitsui, T. (2007), Industry
case studies in the use of immersive virtual assembly,
Virtual Reality, Vol. 11 No. 4, pp. 217-228.
Khan, A.A., O'Sullivan, L. and Gallwey, T.J. (2010), Effect
on discomfort of frequency of wrist exertions combined
with wrist articulations and forearm rotation, International
Journal of Industrial Ergonomics, Vol. 40 No. 5, pp. 492-503.
Kim, C.E. and Vance, J.M. (2004), Development of a
networked haptic environment in VR to facilitate
collaborative design using Voxmap Pointshell (VPS)
software, Proceedings of the ASME Design Engineering
Technical Conference, pp. 19-25.
Lee, H., Billinghurst, M. and Woo, W. (2010), Two-handed
tangible interaction techniques for composing augmented
blocks, Virtual Reality, Vol. 15 Nos 2/3, pp. 133-146.
Lim, T., Ritchie, J.M., Dewar, R.G., Corney, J.R., Wilkinson, P.,
Calis, M., Desmulliez, M. and Fang, J.J. (2007), Factors
affecting user performance in haptic assembly, Virtual
Reality, Vol. 11 No. 4, pp. 241-252.
Liverani, A., Amati, G. and Caligiana, G. (2004), A CAD-augmented reality integrated environment for assembly
sequence check and interactive validation, Concurrent
Engineering, Vol. 12 No. 1, pp. 67-77.

Moore, J.S. and Vos, G.A. (2004), The Strain Index, in

Stanton, N., Hedge, A., Brookhuis, K., Salas, E.
and Hendrick, H. (Eds), Handbook of Human Factors and
Ergonomics Methods, CRC Press, Boca Raton, FL,
pp. 9-1-9-5.
Ong, S.K. and Wang, Z.B. (2011), Augmented assembly
technologies based on 3D bare-hand interaction,
CIRP Annals Manufacturing Technology, Vol. 60 No. 1,
pp. 1-4.
Ong, S.K., Pang, Y. and Nee, A.Y.C. (2007), Augmented
reality aided assembly design and planning, CIRP Annals
Manufacturing Technology, Vol. 56 No. 1, pp. 49-52.
OpenCV (2012), available at: http://opencv.willowgarage.
com/wiki/ (accessed 12 February 2012).
OpenGL (1997), available at: (accessed
12 February 2012).
Raghavan, V., Molineros, J. and Sharma, R. (1999),
Interactive evaluation of assembly sequences using
augmented reality, IEEE Transactions on Robotics and
Automation, Vol. 15 No. 3, pp. 435-449.
Seth, A., Su, H.J. and Vance, J.M. (2006), SHARP: a system
for haptic assembly and realistic prototyping, Proceedings of
the DETC, Vol. 6, pp. 10-13.
Seth, A., Vance, J.M. and Oliver, J.H. (2010), Virtual reality
for assembly methods prototyping: a review, Virtual
Reality, Vol. 15 No. 1, pp. 5-20.
SolidWorks Corporation (2012), SolidWorks API, available
(accessed 12 February 2012).
Valentini, P.P. (2009), Interactive virtual assembling in
augmented reality, International Journal on Interactive
Design and Manufacturing, Vol. 3 No. 2, pp. 109-119.
van Beers, R.J., Sittig, A.C. and Denier van der Gon, J.
(1998), The precision of proprioceptive position sense,
Experimental Brain Research, Vol. 122 No. 4, pp. 367-377.
V-Collide (1998), available at:
V-COLLIDE/ (accessed 12 February 2012).
Wan, H., Gao, S., Peng, Q., Dai, G. and Zhang, F. (2004),
MIVAS: a multi-modal immersive virtual assembly
system, Proceedings of the ASME Design Engineering
Technical Conference, Salt Lake City, UT, pp. 113-122.
Wiedenmaier, S., Oehme, O., Schmidt, L. and Luczak, H.
(2003), Augmented reality (AR) for assembly processes
design and experimental evaluation, International Journal
of Human-Computer Interaction, Vol. 16 No. 3, pp. 497-514.
Zhang, J., Ong, S.K. and Nee, A.Y.C. (2011), RFID-assisted assembly guidance system in an augmented reality
environment, International Journal of Production Research,
Vol. 49 No. 13, pp. 3919-3938.
