
Advanced Engineering Informatics 43 (2020) 100993


Full length article

BIM-based task-level planning for robotic brick assembly through image-based 3D modeling

Lieyun Ding, Weiguang Jiang, Ying Zhou, Cheng Zhou, Sheng Liu

School of Civil Engineering & Mechanics, Huazhong University of Science & Technology, Wuhan, Hubei, China
Hubei Engineering Research Center for Virtual, Safe and Automated Construction (ViSAC), Wuhan, Hubei, China

Keywords: Robotic brick assembly; Task-level planning; BIM; Image-based 3D modeling

Abstract

The application of robotics in the assembly of building bricks has become a popular topic, but the planning of construction robots still lags far behind that of the manufacturing industry. Programming a new robotic assembly task with the manual teaching–planning method is always time consuming. A task-level planning method was therefore proposed, and the implementation details were described, to improve the planning efficiency of robotic brick assembly without affecting accuracy. In this work, a BIM (Building Information Model)-based robotic assembly model that contains all the required information for planning was proposed. Image-based 3D modeling was utilized to help calibrate the robotic assembly scene and the building task models. The placement point coordinates of each assembly brick were generated in the robot base coordinate system. Finally, three different building information model tasks of modular structures (e.g., wall, stair, and pyramid) were designed. The feasibility and effectiveness of the proposed method were verified by comparing the efficiency and accuracy of the three models under manual teaching and task-level planning.

Corresponding author. E-mail address: charleschou@163.com (C. Zhou).
https://doi.org/10.1016/j.aei.2019.100993
Received 10 April 2019; Received in revised form 4 August 2019; Accepted 27 September 2019
1474-0346/ © 2019 Published by Elsevier Ltd.

1. Introduction

Robotics is an advanced form of mechanization (automation) in which industrially important operations are automated [1]. In the labor-intensive construction process, especially in the assembly of prefabricated components with simple shapes (e.g., bricks, blocks, and plates), intelligent robots can increase construction efficiency, and workers can avoid the muscular strain caused by repetitive work [2]. Robotic assembly is also considered a promising approach for constructing human settlements on other planets, such as the Chinese lunar base [3].

At present, various assembly robots have been designed and placed in on-site service [4,5]. Among these robots, IF1, designed by the ETH research team, is an outstanding assembly robot that has achieved the autonomous assembly of a stipulated wall on an unstructured construction site [6]. However, a significant problem has emerged: how do we quickly plan and execute diverse, personalized robotic assembly tasks? To date, one assembly robot is suitable for only a particular building product, so the use of several single-task assembly robots is necessary when producing different forms of building tasks. Another option is to write and test new operating instructions by hand for each assembly task as it is implemented. Manual teaching planning is a common robotic planning method [7]. Although this method is easy to operate, it always consumes a considerable amount of time and manpower when dealing with a large number of assembly components. Thus, a multi-task planning method is crucial for generating control instructions for robotic assembly tasks, especially in the production of prefabricated components in factories.

Robotic assembly can be considered a “pick and place” process for each brick, and the planning of the assembly task refers to the calculation and sorting of the placement point 3D coordinates of each brick in the robot coordinate system [8]. To achieve rapid planning for different assembly tasks, the following issues must be solved: (i) unification and calibration of Cartesian coordinates, including the assembly robot, the construction site, and the assembly task; (ii) a parametric expression of the assembly task; and (iii) conversion of model information into robotic control instructions. Some studies have provided solutions to these difficulties. For issue (i), a visual calibration method, such as “eye-in-hand” or “eye-to-hand,” can help the robot establish an autonomous positioning system [9], and the 3D laser scanning method determines position by reconstructing the space [10]. For issue (ii), most studies remain at the application stage, in which a model language such as “obj, dwg, and 3dm” is used to express the placement point information for robotic assembly [11,12]. However, the problem of data exchange among different software platforms remains unresolved; in other words, supporting the need for large quantities of assembly tasks through a unified format is difficult.
Therefore, the core of multitask planning for robotic assembly is to define a common task expression paradigm. On this basis, an interface program for issue (iii) needs to be developed to convert the unified task format into robotic control instructions.

The building information model (BIM) has formed unified expression and expansion rules for IFC-based construction tasks. The content of BIM is not limited to 3D geometry but extends to all construction activities, including progress, quality, and even the necessary sensor equipment. In the field of automatic construction, Davtalab used IFC as the basic exchange format of robotic 3D printing software [13]. Some studies have used IFC as a data exchange file for robotic wall assembly [14]. However, detailed descriptions of the key information that should be included in this file have not been provided. In response to these deficiencies, the major contributions of this paper include: (i) an IFC extension file for robotic assembly, including task geometry information, progress information, site information, and robot information; and (ii) a description of the conversion process from the IFC design file to the robot control instructions.

The rest of this work is organized as follows: Section 2 reviews the research progress of robotic assembly, BIM, and 3D reconstruction technology. Section 3 describes the method of task-level planning. Section 4 discusses the experimental verification and analysis results. Section 5 is a discussion, and Section 6 summarizes this work.

2. Literature review

2.1. Robotic assembly in construction

In construction, approximately 70% of the total project time is used for assembly operations [15]. Therefore, the efficiency of construction can be increased when building elements are assembled with robots. The “SAM 100” intelligent building robot developed by Construction Robotics (USA) has been used in the masonry work of walls to increase production efficiency [4]. The robot “Hadrian X,” developed by Fastbrick Robotics (AUS), makes the assembly of an entire house possible [5]. Single-task assembly robots in different institutional forms are designed to serve various construction phases, such as tile-laying, curtain wall installation, wooden-structure assembly, aerial assembly, and segment assembly robots [1,16–18]. Moreover, a bionic-based robotic assembly team was investigated to achieve specific human-designed goals [19]. Samuel Wilkinson suggested that swarm robotics could serve the construction of a Mars outpost [20].

Trajectory planning, which is a crucial issue for robotic assembly, can be classified into three types: physical-level, motion-level, and task-level planning [21]. Physical-level planning addresses the decomposition of the operation based on the robotic kinematic mechanism (motion freedom); this step is always accomplished automatically by the industrial control system. Motion-level planning focuses on the decomposition of the working path, which can be discretized into a large number of control points along one path. Thus, many optimization algorithms, such as simulated annealing, artificial potential fields, graphics, and bionics, have been investigated to meet a series of planning requirements, such as obstacle avoidance or the shortest route [22]. Task-level planning includes the decomposition of the assembly task to form a set of ordered steps that a robot can perform; its core is to transform the task language into robotic control instructions. The two previous levels of planning are already highly automated in industry [20]. Multitask planning remains an urgent problem in manufacturing and construction.

Manual teaching or programming is an inefficient and costly way to program robotic work. For example, manually programming a robotic arc welding system for the manufacturing of a large vehicle hull takes more than eight months, while the cycle time of the welding process itself is only 16 h [7]. The manual method is not suitable for building products with diverse characteristics. As a result, a model-driven robotic planning approach can replace the traditional manual method in manufacturing scenarios due to its efficiency in reprogramming [12]. At present, some studies describe the assembly task through parametric modeling software and generate the placement point 3D coordinates for each brick [23]. However, the limitation of single-modeling software is that it can only provide the original coordinate information of the assembly element, while the robot assembly progress, site, equipment, and even process quality remain undefined. Therefore, a complete information description model must be established from the architectural semantic layer for task-level planning.

2.2. BIM for automatic assembly

BIM has become a common expression method for building tasks. The 3D model is formed by overlaying various information (e.g., time, cost, build process, sustainability, and safety) to guide the construction process at different stages [24]. BIM is also an effective tool for supporting the production and assembly of prefabricated structures. In the IFC standard, buildingSMART has expanded the “AssemblyElement” family to clarify the information of assembly components [25]. Some ongoing studies focus on the planning of building assembly tasks using BIM data. For example, the assembly sequence of prefabricated components can be automatically generated through the geometric and spatial information contained in BIM [26]. Some scholars have used the genetic algorithm to minimize assembly time based on BIM [27].

More advanced technologies have been combined with BIM to improve the automation level of the assembly process. For example, BIM is regarded as the management platform of assembly construction in practical projects through the integration of GIS and RFID technologies [28,29]. The use of IoT for automating construction monitoring systems is explored in several recent studies, such as the work done by Teizer et al., in which real-time performance data related to environmental and localization data are integrated into a cloud-based BIM platform [30]. The integration of the aforementioned automation technologies is enabled by the expansion of the relevant underlying description paradigms through IFC. This expansion allows the smooth exchange of necessary construction information among different engineering software. Still others attempt to break the interaction boundary between BIM and robots to improve the efficiency from automated design to automatic construction. For instance, a BIM-integrated software platform for robotic construction is designed for the rapid planning of robotic 3D printing, in which the information described by IFC, including geometry, material type, and component parameters, is extracted and used for planning the printing process [13]. Based on the current research, BIM should become the first choice for task-level planning through IFC standards.

The description of BIM task models for robotic assembly includes three aspects: construction site, robot property, and assembly task information [10]. Expressing real-time on-site information is difficult due to the changing properties of the construction site in time and space. Determining how to collect site information therefore becomes a key step.

2.3. Scene reconstruction in engineering

Construction sites can be considered an unstructured scene because they constantly evolve and dramatically change in shape and form in response to construction tasks [9]. Therefore, the actual measurement of the engineering scene has become an indispensable part of construction. Currently, this task is primarily performed in the field with four technologies: tape measuring, total station surveying, laser scanning, and image-based 3D modeling [31]. Among these technologies, tape measuring is simple but time-consuming and not always accurate. Total station surveying needs a large amount of target point data and is expensive in terms of labor costs; thus, this method is only suitable for small-scale projects. Laser scanning is the most accurate, but its primary downside is the high cost of the equipment.


Fig. 1. Methodology and framework flow.

Compared with other technologies, image-based 3D modeling technology has remarkable advantages in measuring the construction scene. It is inexpensive, easy to use, and time efficient, considering that the scene is rapidly reconstructed using only simple images. This method is also suitable for large-scale construction scenarios owing to its operability with UAVs.

Image-based 3D modeling has been widely used in the construction engineering field. The 3D reconstruction model contains a large amount of geometric information that can serve various kinds of build-up activities, such as construction site visualization [32], progress monitoring and assessment [33], quantity measurement and takeoff [34], and geometric quality control and verification [35]. In addition, the 3D reconstruction model also contains a large amount of semantic information, and the addition of semantic information can promote the formation of the scene’s BIM model. For instance, Dimitrov added material category information to 3D reconstruction models via image-based 3D modeling technology [36]. Liu developed a method to rapidly acquire geometric and position information of construction objects from cameras and generated site models through the acquired information [37].

Previous studies have demonstrated the ability of image-based 3D modeling to rapidly and accurately reconstruct engineering scenes. A large number of studies also conduct engineering analysis by combining scene reconstruction models with BIM [38]. However, image-based 3D modeling is rarely used in automatic construction. In the method proposed in this work, image-based 3D modeling is applied to rapidly reconstruct a real-time model of the robotic assembly scene. Furthermore, the scene model is imported into the BIM platform for parameterized analysis and calculation, thereby providing a basis for unifying the coordinate systems of the robot, the site, and the BIM assembly task.

In summary, the parameterized establishment and semantic expression of the building task and the robotic scene model are important in the task-level planning of robotic brick assembly. Thus, this work combines BIM technology with image-based 3D modeling technology. In the following sections, a comprehensive framework for task-level planning is proposed, together with the IFC-based robotic assembly model. The reconstruction details of the robotic assembly scene are described through image-based 3D modeling, and the development process of the robotic assembly instructions is presented in detail.

3. Methodology

This work aims to propose a task-level planning method to convert a building model into robotic control instructions. BIM is the core of the method framework. A more complete description model of the robot assembly task was formed by integrating construction site, robot property, and task information. To facilitate the exchange of data, the original IFC was extended to add some necessary robot attributes. Finally, the planning information contained in the IFC was extracted through the developed interface program to complete task planning.

The core of the method was divided into three parts: (i) establishing the IFC-based robot assembly task description model that contained all the information needed for planning; (ii) supplementing the spatial information required in the IFC file and completing the placement point calibration of each assembly element; and (iii) generating a robot control program according to the IFC file that contained construction site, robot property, and task information. Fig. 1 illustrates the proposed framework.

Fig. 2. Robotic assembly process.
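To make the three parts of the framework concrete before they are detailed in Sections 3.1–3.3, the following is a minimal orchestration sketch in Python. It is not the authors' implementation: the task description is reduced to a plain dictionary, the calibration of part (ii) is stood in for by a simple translation, and the emitted "instructions" are placeholder strings rather than Rapid code.

```python
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

def build_task_description(placements: List[Point3D]) -> Dict:
    """Part (i): the IFC-based description of the assembly task (simplified to a dict)."""
    return {"elements": [{"id": f"brick{i + 1:02d}", "placement": p}
                         for i, p in enumerate(placements)]}

def calibrate_to_robot_frame(task: Dict, offset: Point3D) -> Dict:
    """Part (ii): map task-model coordinates into the robot base frame.
    A pure translation stands in here for the target-based registration of Section 3.2."""
    dx, dy, dz = offset
    for e in task["elements"]:
        x, y, z = e["placement"]
        e["placement"] = (x + dx, y + dy, z + dz)
    return task

def emit_instructions(task: Dict) -> List[str]:
    """Part (iii): one placement instruction per element, in assembly order."""
    return [f"place {e['id']} at {e['placement']}" for e in task["elements"]]

task = build_task_description([(0.0, 0.0, 30.0), (200.0, 0.0, 30.0)])
task = calibrate_to_robot_frame(task, offset=(800.0, 1100.0, 0.0))
print("\n".join(emit_instructions(task)))
```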


3.1. BIM-based assembly task description

Brick assembly is an indispensable task in building. The manual process includes copying the flat line, placing the sample brick, setting counting poles, and bricklaying. A worker can only complete his work smoothly when he knows “where to build, what to build, and how to build.” The same questions must be answered for assembly robots. Therefore, the robotic assembly system for the proposed method includes the following parts: a sensing subsystem, a planning subsystem, and an actuator (manipulator and jig). Fig. 2 shows the robotic assembly process; the most critical step is describing the assembly task so that a robot can rapidly and accurately execute it.

At present, BIM addresses all the information regarding assembly tasks through the IFC standard. However, a unified description framework that defines all the information needed for robotic assembly is still lacking. To tackle this problem, a new description model for robotic assembly tasks is established to solve the problem of information interaction among different BIM software platforms. This new model includes task geometry information, schedule information, site information, and robot information (Fig. 3). The IFC standard already addresses the first three kinds of information and how they are related to one another, but the definition of robot information is still missing. Nevertheless, IFC provides three extension mechanisms: (1) extension based on an IFC proxy entity; (2) extension based on an attribute set; and (3) addition of a new entity definition [39]. Therefore, this work defines a new entity, called “ifcRoboticAssembly,” to fill this gap and form a complete description framework for robotic assembly tasks.

Building task geometry information, progress information, and site information have already been defined in the original abstract entities. For robotic assembly tasks, the geometric information must specify the coordinate information of the placement points of the assembly elements, which can be customized by the designer according to the form of the assembly structure. The progress information is arranged in the order of the assembly elements, which can be produced by BIM work-schedule software such as Navisworks. The site information, which is usually acquired by importing and calibrating a real-time point cloud file, represents the actual robot workspace. After the information model is established, the coordinate system of the individual assembly task is also mapped to the actual site coordinate system.

The new “ifcRoboticAssembly” entity fills in the blanks on robotic assembly information; its attributes include the robot control system, the robot geometry model, and the robot coordinate system. To complete the operation of task-level planning, this extension also defines the “ifcRobotWorkSchedule” entity to represent the assembly order of the robot, and the corresponding robotic control instructions are also recorded in this part.

Fig. 3. IFC-based robotic assembly model.
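The attribute grouping described above can be illustrated with a small sketch. The Python dataclasses below simply mirror, in plain code, the information the extended “ifcRoboticAssembly” and “ifcRobotWorkSchedule” entities are said to carry; the field names are illustrative assumptions, not the authors' actual IFC attribute definitions, and no EXPRESS/IFC schema syntax is implied. The example placement coordinates are taken from the task-level planning column of Table 5.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class RoboticAssemblyInfo:
    """Illustrative mirror of the attributes described for 'ifcRoboticAssembly'."""
    control_system: str              # e.g. controller / instruction dialect (Rapid)
    geometry_model_ref: str          # reference to the robot geometry model
    base_coordinate_origin: Point3D  # origin of the robot base frame after calibration

@dataclass
class RobotWorkScheduleEntry:
    """One assembly step: which element, where it goes, and its generated instruction."""
    element_id: str
    placement_point: Point3D         # placement coordinates in the robot base frame
    instruction: str = ""            # Rapid instruction text, filled in by the interface program

@dataclass
class RobotWorkSchedule:
    """Illustrative mirror of the described 'ifcRobotWorkSchedule' entity."""
    entries: List[RobotWorkScheduleEntry] = field(default_factory=list)

    def assembly_order(self) -> List[str]:
        return [e.element_id for e in self.entries]

schedule = RobotWorkSchedule([
    RobotWorkScheduleEntry("brick01", (296.92, 1173.07, 1297.51)),
    RobotWorkScheduleEntry("brick02", (96.92, 1173.07, 1497.51)),
])
print(schedule.assembly_order())  # ['brick01', 'brick02']
```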
3.2. Scene modeling and calibration

Scene modeling and calibration are used to accurately describe the positional relationship of the robot, the construction site, and the assembly task in space. The real-time construction scene information must be collected because of the uncertainty of the robot’s position and the unstructured features of the construction space. In this process, image-based 3D modeling is adopted to satisfy the requirement of rapid, high-precision scene information collection over a large-scale space. To accurately express the scene model and calibrate the necessary nodes, the signalized (coded) target method is used to further process the reconstructed model [40]. To accurately express the absolute positional relationship between the robot base coordinates and the construction site, this work used pre-installed signalized (coded) targets to mark the corner points of the robot base and construction planes.


Fig. 4. Generation of the 3D point cloud model.

Subsequently, image-based 3D modeling was executed as follows [31]: (i) the robotic assembly scene data were collected (depending on the camera); (ii) the point cloud model of the robotic assembly scene was generated; and (iii) the scene model was imported into the BIM platform. Fig. 4 describes the key steps involved in image-based 3D modeling technology in detail.

The quality of the captured scene images determines the effectiveness and accuracy of the 3D reconstruction model. The main factors that affect the quality of image-based 3D modeling include [36]: (i) the shooting parameters: maintaining a single focal length and sufficient exposure time while keeping a high pixel mode; (ii) the shooting viewpoints: setting sufficient shooting points around the actual construction area and selecting continuous shooting points (the overlapping region of contiguous images should be maintained at 60% or more); and (iii) the shooting environment: maintaining the stability of the light source. The photographs must include the robot base plane and the actual construction site plane during the information collection of the robotic assembly scene, which also prepares for the calibration of the relative position between the robot and the construction site. After the scene data were collected, the 3D point cloud model of the robotic assembly scene was generated through four key steps: camera calibration, feature correspondence, structure from motion (SfM), and multiview stereo (MVS) [31]. SfM estimates the 3D structure (a sparse point cloud) from a set of 2D images, and MVS then converts the sparse point cloud into a dense point cloud. A dense point cloud contains more scene detail, so subsequent measurements of the robotic assembly scene perform better. The resulting robot assembly scene file is directly imported into the BIM platform.

For the “pick-and-place” task, the core of robotic brick assembly planning is to calculate the placement point 3D coordinates of each brick. This requires the building task model to be unified into the robot base coordinate system for measurement and calculation. Fig. 5 shows the three steps of the calibration process. Step (1) Calibration of the scene model: the robot base coordinate system is marked as the world coordinate system. Step (2) Matching of the scene model and the task model: the four corner targets of the virtual task construction space are matched with the four preset targets so that the construction task is matched to the robot coordinate system. Step (3) Generation of the assembly points: the initialization definition of each assembly brick placement point is completed in the BIM platform. Subsequently, the information required for the robotic assembly planning is supplemented to form an IFC data file, and the final control instructions of the robot are generated through the developed interface program.
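Step (2) of the calibration amounts to estimating the rigid transformation that maps the four corner targets of the virtual construction space onto the four pre-installed targets measured in the robot base frame. The paper does not state which algorithm the BIM platform uses for this matching; the sketch below shows one standard way to compute such a transform, a least-squares (Kabsch/SVD) fit, with made-up target coordinates as placeholders (only the roughly 1.5 m × 0.9 m plane size is taken from the experiment description).

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, here the four corner targets
    in the task-model frame and in the robot base frame, respectively.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance of the centred point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Placeholder corner targets in mm (illustrative values only).
task_corners = np.array([[0, 0, 0], [1500, 0, 0], [1500, 900, 0], [0, 900, 0]], float)
robot_corners = np.array([[820, -450, 5], [2318, -430, 8], [2330, 468, 6], [832, 448, 3]], float)

R, t = rigid_transform(task_corners, robot_corners)
placement_in_task_frame = np.array([300.0, 200.0, 60.0])
placement_in_robot_frame = R @ placement_in_task_frame + t  # brick placement in the robot base frame
```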
3.3. Robotic control instructions generation

The information of the assembly task model, the work schedule, the site, and the robot is gradually integrated into the IFC file, and the application process is shown in Fig. 6. The final IFC file also contains the placement 3D coordinates of each brick. The key step is then to translate these placement points into control instructions that a robot can execute, for which an appropriate programming language must first be selected.

The task-oriented robot programming language Rapid is used to write these instructions. The entire path can be carried out as long as the coordinates of the initial and placement points are determined [41]. The assembly instruction file contains a list of sequential instructions for the robot to assemble the designed building task. For example, the assembly instructions define the placement 3D coordinates of each brick in the “Main Module,” and “PROC Wall” represents the assembly instructions of the wall. The brick is grabbed at the “pick01” point, lifted to the “lift01” point, moved to the “move01” point, and finally placed at the “place01” point. For each brick, only the placement point is unique, and this point is the exact coordinate obtained through task-level planning. Thus, once the coordinates of the placement points are obtained, the control instructions of the robot can be completed by the same procedure for any structure.


Fig. 5. Generation process of the robotic brick assembly plan.

Fig. 6. Application process of the IFC-based robotic assembly model.

Control instructions have four main types:

(1) CONST defines the coordinates of each placement point.
(2) MoveL represents the linear motion of each brick and also includes the motion parameters: position, velocity, zone, and tool.
(3) WaitTime represents the waiting time of the robot.
(4) Set do1/Reset do1 represent the signals for clamping and releasing each brick, respectively.

Finally, an interface program is designed to rapidly convert the 3D point coordinate data in the IFC file into the robot control instructions. With this logical framework, all steps of task-level planning for robotic brick assembly can be completed.
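As a rough illustration of what such an interface program does, the sketch below turns placement coordinates into Rapid-style text using the four instruction types listed above. The target names (pick01, lift01, move01, placeXX) and the do1 signal follow the paper's description, while the tool name tGripper, the speed and zone values, the fixed orientation quaternion, and the wait times are illustrative placeholders; the authors' actual “BIM–Rapid inline template” is not reproduced here. The two placement points in the usage example are taken from Table 5.

```python
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def rapid_module(proc_name: str, pick: Point3D, lift: Point3D,
                 move: Point3D, placements: List[Point3D]) -> str:
    """Emit a Rapid-style module with one pick-lift-move-place cycle per brick."""
    def robtarget(name: str, p: Point3D) -> str:
        # Orientation and configuration data are fixed placeholders in this sketch.
        return (f"CONST robtarget {name} := "
                f"[[{p[0]:.2f},{p[1]:.2f},{p[2]:.2f}],[1,0,0,0],[0,0,0,0],"
                f"[9E9,9E9,9E9,9E9,9E9,9E9]];")

    decls = [robtarget("pick01", pick), robtarget("lift01", lift), robtarget("move01", move)]
    decls += [robtarget(f"place{i + 1:02d}", p) for i, p in enumerate(placements)]

    body = []
    for i in range(len(placements)):
        body += [
            "    MoveL pick01, v200, fine, tGripper;",
            "    Set do1;        ! close the jig and clamp the brick",
            "    WaitTime 0.5;",
            "    MoveL lift01, v200, z10, tGripper;",
            "    MoveL move01, v200, z10, tGripper;",
            f"    MoveL place{i + 1:02d}, v100, fine, tGripper;",
            "    Reset do1;      ! release the brick",
            "    WaitTime 0.5;",
        ]
    return "\n".join(["MODULE MainModule"] + decls +
                     [f"PROC {proc_name}()"] + body + ["ENDPROC", "ENDMODULE"])

# Pick/lift/move coordinates are placeholders; placements are from Table 5 (wall model).
print(rapid_module("Wall",
                   pick=(96.9, 800.0, 1200.0), lift=(96.9, 800.0, 1500.0), move=(200.0, 1000.0, 1500.0),
                   placements=[(296.92, 1173.07, 1297.51), (96.92, 1173.07, 1497.51)]))
```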
4. Experiments

The authors experimented with the process of task-level planning to adapt the theory to reality. The experiment platform included an ABB IRB6700-235 robot (6-DOF), a construction plane (approximately 1.5 m × 0.9 m), a scene modeling camera (Sony a5100), and a modeling computer (Dell Precision). The construction space was approximately 8 square meters (Fig. 7). At the software level, the 4D building task models were designed in Revit and Navisworks, image-based scene reconstruction was completed through ReCap Photo, and the placement point 3D coordinates were generated in Rhino. Finally, the robotic control instructions were based on the RobotStudio platform and its programming language, Rapid.

In addition, three different BIM task models were designed in this work to verify the efficiency and accuracy of task-level planning.


Fig. 7. Experimental scene.

Fig. 8. Designed BIM task models.

Simultaneously, the planning time and the placement point 3D coordinates of the three task models were recorded in experiments that were repeated three times. The validity of the proposed method was verified by comparison with the time and point precision of the manual teaching method.

4.1. Assembly task description

Nearly all the structures of the building tasks discussed in this work are relatively neat (curved structures are not considered). Three different BIM task models, namely a wall, a stair, and a pyramid, were designed in Revit. All the building tasks were assembled using the same set of bricks. Each BIM model contained its basic geometric parameters, work schedule information, and the grab point (Fig. 8). It should be emphasized that the connections between the bricks use a “mortise and tenon” joint structure so that the assembled model meets the required construction strength; this traditional Chinese-style architectural structure ensures overall stability through staggered joints, with the gaps filled by mortar. Because this work focused on trajectory planning, the mortar was omitted in the experiment. The numbers of bricks in the three plans were also varied to reflect the difference in planning time for tasks of various volumes. Table 1 shows the basic information of the three different BIM assembly tasks.

Table 1. Description of three BIM task models.

BIM task         Number of bricks    Model volume
Wall model       12                  96 dm³
Stair model      20                  160 dm³
Pyramid model    30                  240 dm³

4.2. Task-level planning

First, the four corners of the robot base and construction planes were marked with signalized (coded) targets before modeling the scene. Then, 48 photographs were taken around the construction scene while ensuring at least 60% overlap between adjacent images. From the collected image information, scene modeling was carried out through data import, sparse reconstruction, and dense reconstruction in ReCap Photo. Finally, the scene model was generated and stored in the IFC file (Fig. 9).

Second, the reconstructed scene model was imported into Rhino, and the robot base coordinate system was calibrated using the signalized (coded) targets at the four corners. Afterwards, the three BIM task models were also imported into Rhino and unified into the virtual construction model according to the four corner points of the construction plane. Using the parametric calculation ability of Rhino, the placement point 3D coordinates of the different BIM task models were generated with one click. The whole process is shown in Fig. 10.

Third, the generated placement point 3D coordinates of each brick were exported to an Excel file, which was processed by the “BIM–Rapid inline template” to generate a set of Rapid control instructions (Fig. 11).
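The placement points themselves were produced parametrically in Rhino with one click; that definition is not shown in the paper, but the idea can be sketched as a small script that lays out a running-bond (staggered) wall in the already-calibrated construction-plane frame. The brick dimensions, wall origin, and the 3 × 4 course layout below are assumptions chosen only so that the example yields the 12 placement points of the wall model in Table 1.

```python
from typing import List, Tuple

def wall_placement_points(rows: int, bricks_per_row: int,
                          brick_l: float = 200.0, brick_h: float = 60.0,
                          origin: Tuple[float, float, float] = (0.0, 0.0, 0.0)
                          ) -> List[Tuple[float, float, float]]:
    """Centre placement points for a running-bond (staggered) wall, course by course.

    Coordinates are expressed in the construction-plane frame; mapping them into the
    robot base frame is done by the calibration step described in Section 3.2.
    """
    ox, oy, oz = origin
    points = []
    for r in range(rows):
        offset = brick_l / 2.0 if r % 2 else 0.0   # stagger every other course
        for c in range(bricks_per_row):
            x = ox + offset + c * brick_l + brick_l / 2.0
            z = oz + r * brick_h + brick_h / 2.0
            points.append((x, oy, z))              # single-leaf wall: constant y
    return points

# A 3-course, 4-brick-per-course wall gives 12 placement points, assembled bottom-up.
for p in wall_placement_points(rows=3, bricks_per_row=4):
    print(p)
```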


Fig. 9. 3D scene reconstruction.

Fig. 10. Assembly plan generation.

Fig. 12 shows the effectiveness of these instructions in the RobotStudio platform according to the task-level planning method.

4.3. Results and comparative analysis

The data used for quantifying the efficiency and accuracy of the task-level planning of the three BIM task models were recorded by the authors. The collected data included the task-level planning time and the placement point 3D coordinates, which were obtained through three experimental trials. Moreover, a comparative analysis against manual teaching planning was conducted for the same three BIM task models. All the experimental data were used as the basis for verifying the validity of the proposed method.

4.3.1. Planning efficiency analysis

The time required for task-level planning was divided into two parts: image-based 3D modeling and assembly plan generation. As long as the positions of the robot and the construction site remain unchanged, a single scene model can support the planning of various tasks. Three repetitions of the task-level planning experiment were carried out to reduce the influence of uncertainties in the data acquisition process. The time (in seconds) required for planning the three task models using the task-level planning method is shown in Tables 2 and 3.


Fig. 11. BIM–Rapid inline template.

Fig. 12. The interface of control instructions.

Table 2. Total time of scene reconstruction.

Process              First time/s    Second time/s    Third time/s    Average time/s
Photo collection     1410            1620             1290            1440
3D reconstruction    3420            3180             2760            3120
Total time                                                            4560

Table 3. Total time of generating an assembly plan.

Task                     First time/s    Second time/s    Third time/s    Average time/s
Wall model               332             312              328             324
Pyramid model            365             354              371             363
Triumphal arch model     401             423              416             413
Total time                                                                1100

The time required for manual teaching planning is mainly related to two factors: the number of assembly bricks and the complexity of the assembly trajectory. The robotic assembly trajectory planned in this work consisted of four “pick-lift-move-place” teaching points (Fig. 13). The pick point of each brick was kept the same, and the placement point was determined through manual observation in the construction area. Table 4 shows the time (in seconds) required for the manual teaching planning of the three task models.

The planning efficiency for robotic assembly was examined from two perspectives: (i) efficiency in multitask planning; and (ii) efficiency in complex task planning.


In the first perspective, the authors compared the total time needed to complete the three planning tasks through task-level and manual teaching planning (Fig. 14). In the planning preparation period, the method proposed in this work spent some time on image-based scene modeling, which manual teaching planning does not require. However, as the number of tasks grew, the efficiency of task-level planning gradually emerged. When all planning was completed, the time spent on task-level planning was less than half of that of manual teaching planning. The reason behind this result is that task-level planning automatically calculates the placement point 3D coordinates of each brick through the BIM platform.

Another perspective worth comparing is the difference in efficiency between manual teaching and task-level planning from easy to difficult building tasks. Fig. 15 shows that as the complexity of the assembly task increases, the time spent on manual teaching planning gradually increases, while the time spent on task-level planning for tasks of different complexity is basically unchanged. Manual teaching planning saves time for simple building tasks (with fewer bricks). However, building structures in actual projects are complex and changeable; therefore, task-level planning is effective.

In addition, task-level planning is not affected by human factors (e.g., fatigue and mal-operation during manual teaching planning), which also ensures the stability of the robot construction progress and improves the management efficiency of robotic construction.

Fig. 13. Manual teaching planning.

Table 4. Total time of manual teaching planning.

Task                     First time/s    Second time/s    Third time/s    Average time/s
Wall model               3360            3240             3060            3220
Pyramid model            5270            5084             5060            5138
Triumphal arch model     7790            7545             7580            7655
Total time                                                                16,013

4.3.2. Planning accuracy analysis

For neat assembly tasks, the precision of the placement point 3D coordinates of each brick directly reflects the planning accuracy. For manual teaching planning, the placement point 3D coordinates were read by the robot instructor, while for task-level planning they were generated by the BIM platform. After the planning experiments were completed, the placement point 3D coordinates of all bricks were recorded. For a satisfactory analysis of planning accuracy, the authors randomly selected 12 corresponding placement points in the 3D coordinates of the BIM task models to compare the differences between the two planning methods. The specific values are shown in Table 5.

In the comparative experiment, the placement coordinates of manual teaching planning were taken as the standard data. The cosine similarity (Formula (1)) between the manual teaching and task-level planning coordinates was calculated on the basis of the 12 sets of data in Table 5. According to the comparison theory of cosine similarity, the closer the value for two coordinate vectors is to 1, the higher the similarity between the two sets of data; the closer it is to −1, the lower the similarity. The calculation results are shown in Table 6 and Fig. 16. The similarities of the 12 compared groups of data were all close to 1, which reflects the accuracy of task-level planning.

CosSim(A_i, B_j) = (A_i · B_j) / (‖A_i‖ ‖B_j‖)    (1)

The sources of error in task-level planning are worth analyzing. The first is the error of image-based 3D modeling: the existing image-based scene modeling technology cannot restore the entire construction scene exactly at a 1:1 scale, which results in a centimeter-level error. The second is the calibration error on the BIM platform: although the signalized (coded) targets were placed in advance, the calibration of the robot base coordinates and the matching of the scene model with the BIM task models were completed through manual operation, which is subject to human error. Improving scene modeling and robotic calibration techniques are topics that can be explored further.
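Formula (1) is straightforward to reproduce. The short sketch below evaluates it for placement point No. 1 of the wall model from Table 5; the result is very close to 1, in line with the accuracy reported in Table 6.

```python
import math

def cos_sim(a, b):
    """Cosine similarity between two placement-point coordinate vectors (Formula 1)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Placement point No. 1 (wall model) from Table 5:
manual = (281.95, 1192.43, 1296.52)   # manual teaching planning, A_i
task = (296.92, 1173.07, 1297.51)     # task-level planning, B_j
print(round(cos_sim(manual, task), 4))  # close to 1
```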

Fig. 14. Planning time for the two methods (preparation, wall, stair, and pyramid models).


Fig. 15. Planning time for the three tasks (manual teaching vs. task-level planning).

Table 5. Placement point statistics.

Task       No.    Manual teaching planning A_i = (x_i, y_i, z_i)    Task-level planning B_j = (x_j, y_j, z_j)
Wall       1      281.95, 1192.43, 1296.52                          296.92, 1173.07, 1297.51
           2      84.79, 1189.24, 1501.05                           96.92, 1173.07, 1497.51
           3      −118.67, 1187.04, 1705.                           −103.08, 1173.07, 1697.51
           4      281.39, 1197.19, 1702.30                          296.92, 1173.07, 1697.51
Pyramid    5      83.19, 1093.55, 1292.20                           80.82, 1075.85, 1302.25
           6      −230.95, 1080.20, 1497.87                         −219.18, 1075.85, 1502.25
           7      −31.55, 895.84, 1498.27                           −19.18, 875.85, 1502.25
           8      85.31, 1080.41, 1698.02                           80.82, 1075.85, 1702.25
Stair      9      124.64, 1199.97, 1301.96                          111.60, 1175.57, 1327.53
           10     −369.53, 987.70, 1501.38                          −388.40, 975.57, 1527.54
           11     −365.95, 781.05, 1703.27                          −388.40, 775.57, 1727.54
           12     −362.81, 787.47, 1909.48                          −388.40, 775.57, 1927.54

Table 6. Cosine similarity of manual teaching and task-level planning.

No.       1        2        3        4        5        6        7        8        9        10       11       12
CosSim    0.9992   0.9996   0.9996   0.9993   0.9992   0.9997   0.9991   0.9999   0.9987   0.9989   0.9992   0.9990

5. Discussion

BIM is gradually forming a complete framework that contains all the information needed for construction activities based on the IFC standard. As a highly anticipated tool for automated construction in the future, robots should become an integral part of BIM design models. As discussed by Davtalab, existing research has focused heavily on robotic hardware and has ignored software system-based construction planning theory [13]. Robots need to embrace an automated planning approach to improve their ability to adapt to different kinds of business. The task-level planning in this work attempts to fill this gap in the IFC standard for robotic assembly tasks. This idea is brand new for design for manufacture and assembly [42]. In automatic construction, this method provides an effective way to transform “BIM design” into “robot construction.”

Other applicable scenarios of task-level planning, beyond simple on-site tasks, were also considered. The factory manufacturing process of prefabricated structures will be the main application scenario. As previously mentioned, the forms of architectural products tend to become increasingly personalized. The structural forms in prefabrication factories, such as prefabricated walls, stairs, and some decorative structures, are often diversified, and traditional production lines for these components are inflexible. Thus, the advantages of task-level planning are reflected as follows: (i) a short production schedule, (ii) a reduced investment cost of the production line, and (iii) reduced labor consumption.

Toward more personalized product forms, this method still has some limitations, which are mainly reflected in the following points: (i) the form of a single brick for robotic assembly is too simple, and more complex component forms can be discussed in the next step; (ii) the constraints on the placement of each brick consider only the degrees of freedom in the x, y, and z directions, thereby ignoring the effect caused by changes in attitude; and (iii) the main problem solved here is the multitask planning of a single robot; next, the multitask planning of multiple robots can be discussed to address more complex assembly problems.

Fig. 16. Error distribution of the two methods (CosSim of the 12 assembly components).



6. Conclusion

This work focused on improving the planning efficiency of construction robotics across different building tasks without losing planning accuracy. A task-level planning method for robotic brick assembly was proposed to rapidly obtain the robotic control instructions. This work can be summarized in the following points: (i) a general IFC model for robotic assembly contains all the information needed for task-level planning; (ii) BIM and image-based modeling are used to calibrate the robot pose for the unification of the robot coordinate system, the construction area, and the assembly task; and (iii) a simple conversion process is presented to convert the 3D placement point coordinates of each brick into the robotic control instructions. In the experimental verification, task-level planning maintained the same accuracy as the traditional method but saved time when facing more complex tasks. This automatic planning and programming method is worth exploring, and more diverse component shapes will be rapidly assembled by robots in the prefabrication field in the future.

Declaration of Competing Interest

The authors declared that there is no conflict of interest.

Acknowledgment

This research is supported in part by the National Natural Science Foundation of China (71671072 and 71732001). This work is also supported by the National Key R&D Program of China (No. 2018YFB1306905). The authors thank Wuhan Metro Group Co., Ltd. and Shanghai Tunnel Engineering Co., Ltd.

References

[1] T. Bock, The future of construction automation: technological disruption and the upcoming ubiquity of robotics, Autom. Constr. 59 (2015) 113–121, https://doi.org/10.1007/978-3-319-32552-1_57.
[2] S.R.B. dos Santos, D.O. Dantas, S.N. Givigi, L. Buonocore, A.A. Neto, C.L. Nascimento, A stochastic learning approach for construction of brick structures with a ground robot, IFAC-PapersOnLine 50 (1) (2017) 5654–5659, https://doi.org/10.1016/j.ifacol.2017.08.1114.
[3] C. Zhou, R. Chen, J. Xu, L. Ding, H. Luo, J. Fan, et al., In-situ construction method for lunar habitation: Chinese super mason, Autom. Constr. 104 (2019) 66–79, https://doi.org/10.1016/j.autcon.2019.03.024.
[4] N. Podkaminer, L.S. Peters, Construction Robotics, 2015. https://construction-robotics.com/.
[5] M. Pivac, Fastbrick Robotics, 2016. https://www.fbr.com.au/.
[6] M. Giftthaler, T. Sandy, K. Dörfler, I. Brooks, J. Buchli, Mobile robotic fabrication at 1:1 scale: the In situ Fabricator, Constr. Robot. 1 (1–4) (2017) 3–14, https://doi.org/10.1007/s41693-017-0003-5.
[7] Z.X. Pan, J. Polden, N. Larkin, S. Van Duin, J. Norrish, Recent progress on programming methods for industrial robots, Rob. Comput. Integr. Manuf. 28 (2) (2012) 87–94, https://doi.org/10.1016/j.rcim.2011.08.004.
[8] K. Harada, T. Tsuji, K. Nagata, N. Yamanobe, H. Onda, Validating an object placement planner for robotic pick-and-place tasks, Rob. Auton. Syst. 62 (10) (2014) 1463–1477, https://doi.org/10.1016/j.robot.2014.05.014.
[9] C. Feng, Y. Xiao, A. Willette, W. McGee, V.R. Kamat, Vision guided autonomous robotic assembly and as-built scanning on unstructured construction sites, Autom. Constr. 59 (2015) 128–138, https://doi.org/10.1016/j.aei.2013.11.002.
[10] K. Dörfler, T. Sandy, M. Giftthaler, F. Gramazio, M. Kohler, J. Buchli, Mobile robotic brickwork, Robotic Fabrication in Architecture, Art and Design, 2016, pp. 204–217.
[11] P. Tsarouchi, S. Makris, G. Chryssolouris, Human–robot interaction review and challenges on task planning and programming, Int. J. Comput. Integr. Manuf. 29 (8) (2016) 916–931, https://doi.org/10.1080/0951192X.2015.1130251.
[12] A.K. Bedaka, C.-Y. Lin, CAD-based robot path planning and simulation using OPEN CASCADE, Procedia Comput. Sci. 133 (2018) 779–785, https://doi.org/10.1016/j.procs.2018.07.119.
[13] O. Davtalab, A. Kazemian, B. Khoshnevis, Perspectives on a BIM-integrated software platform for robotic construction through Contour Crafting, Autom. Constr. 89 (2018) 13–23, https://doi.org/10.1016/j.autcon.2018.01.006.
[14] M. Bennulf, B. Svensson, F. Danielsson, Verification and deployment of automatically generated robot programs used in prefabrication of house walls, Procedia CIRP 72 (2018) 272–276, https://doi.org/10.1016/j.procir.2018.03.025.
[15] T. Bock, T. Linner, Robot Oriented Design, Cambridge University Press, 2015.
[16] Z. Dakhli, Z. Lafhaj, Robotic mechanical design for brick-laying automation, Cogent Eng. 4 (1) (2017), https://doi.org/10.1080/23311916.2017.1361600.
[17] C. Zhou, L. Ding, Y. Zhou, M.J. Skibniewski, Visibility graph analysis on time series of shield tunneling parameters based on complex network theory, Tunn. Undergr. Space Technol. 89 (2019) 10–24, https://doi.org/10.1016/j.tust.2019.03.019.
[18] J. Willmann, F. Augugliaro, T. Cadalbert, R. D'Andrea, F. Gramazio, M. Kohler, Aerial robotic construction towards a new field of architectural research, Int. J. Arch. Comput. 10 (3) (2012) 439–459, https://doi.org/10.1260/1478-0771.10.3.439.
[19] J. Werfel, K. Petersen, R. Nagpal, Designing collective behavior in a termite-inspired robot construction team, Science 343 (6172) (2014) 754–758, https://doi.org/10.1126/science.1245842.
[20] S. Wilkinson, J. Musil, J. Dierckx, I. Gallou, X. de Kestelier, Concept Design of an Outpost for Mars using Autonomous Additive Swarm Construction, ESA Acta Futura special issue, 2016.
[21] S. Wilkinson, J. Musil, J. Dierckx, I. Gallou, X. de Kestelier, Concept Design of an Outpost for Mars using Autonomous Additive Swarm Construction, ESA Acta Futura special issue, 2016.
[22] W.S. Yoo, H.J. Lee, D.I. Kim, K.I. Kang, H. Cho, Genetic algorithm-based steel erection planning model for a construction automation system, Autom. Constr. 24 (2012) 30–39, https://doi.org/10.1016/j.autcon.2012.02.007.
[23] T. Bock, D. Stricker, J. Fliedner, T. Huynh, Automatic generation of the controlling-system for a wall construction robot, Autom. Constr. 5 (1) (1996) 15–21, https://doi.org/10.1016/0926-5805(95)00014-3.
[24] L. Ding, Y. Zhou, B. Akinci, Building Information Modeling (BIM) application framework: the process of expanding from 3D to computable nD, Autom. Constr. 46 (2014) 82–93, https://doi.org/10.1016/j.autcon.2014.04.009.
[25] buildingSMART, 2019. http://www.buildingsmart-tech.org/ifc/IFC4x1/final/html/.
[26] H. Wenfa, Analysis of construction process planning based on geometric reasoning, J. Tongji Univ. 35 (4) (2007) 566–570, https://doi.org/10.1360/jos182955.
[27] K. Kim, Y.K. Cho, Construction-specific spatial information reasoning in Building Information Models, Adv. Eng. Inf. 29 (4) (2015) 1013–1027, https://doi.org/10.1016/j.aei.2015.08.004.
[28] J. Irizarry, E.P. Karan, Optimizing location of tower cranes on construction sites through GIS and BIM integration, J. Inform. Technol. Constr. (ITcon) 17 (23) (2012) 351–366, https://doi.org/10.1098/rsta.1966.0049.
[29] C.Z. Li, F. Xue, X. Li, J.K. Hong, G.Q. Shen, An Internet of Things-enabled BIM platform for on-site assembly services in prefabricated construction, Autom. Constr. 89 (2018) 146–161, https://doi.org/10.1016/j.autcon.2018.01.001.
[30] J. Teizer, M. Wolf, O. Golovina, M. Perschewski, M. Propach, M. Neges, M. König, Internet of things (IoT) for integrating environmental and localization data in building information modeling (BIM), in: Proceedings of the International Symposium on Automation and Robotics in Construction (ISARC), vol. 34, Vilnius Gediminas Technical University, Department of Construction Economics & Property, 2017, https://doi.org/10.22260/ISARC2017/0084.
[31] H. Fathi, F. Dai, M. Lourakis, Automated as-built 3D reconstruction of civil infrastructure using computer vision: achievements, opportunities, and challenges, Adv. Eng. Inf. 29 (2) (2015) 149–161, https://doi.org/10.1016/j.aei.2015.01.012.
[32] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Integrated sequential as-built and as-planned representation with D4AR tools in support of decision-making tasks in the AEC/FM industry, J. Constr. Eng. Manage. 137 (12) (2011) 1099–1116, https://doi.org/10.1061/(ASCE)CO.1943-7862.0000371.
[33] C.A. Quinones-Rozo, Y.M.A. Hashash, L.Y. Liu, Digital image reasoning for tracking excavation activities, Autom. Constr. 17 (5) (2008) 608–622, https://doi.org/10.1016/j.autcon.2007.10.008.
[34] H. Fathi, I. Brilakis, A videogrammetric as-built data collection method for digital fabrication of sheet metal roof panels, Adv. Eng. Inform. 27 (4) (2013) 466–476, https://doi.org/10.1016/j.aei.2013.04.006.
[35] M. Nahangi, C.T. Haas, Automated 3D compliance checking in pipe spool fabrication, Adv. Eng. Inf. 28 (4) (2014) 360–369, https://doi.org/10.1016/j.aei.2014.04.001.
[36] A. Dimitrov, M. Golparvar-Fard, Vision-based material recognition for automated monitoring of construction progress and generating building information modeling from unordered site image collections, Adv. Eng. Inf. 28 (1) (2014) 37–49, https://doi.org/10.1016/j.aei.2013.11.002.
[37] C.W. Liu, T.H. Wu, M.H. Tsai, S.C. Kang, Image-based semantic construction reconstruction, Autom. Constr. 90 (2018) 67–78, https://doi.org/10.1016/j.autcon.2018.02.016.
[38] R. Zeibak-Shini, R. Sacks, L. Ma, S. Filin, Towards generation of as-damaged BIM models using laser-scanning and as-built BIM: first estimate of as-damaged locations of reinforced concrete frame members in masonry infill structures, Adv. Eng. Inf. 30 (3) (2016) 312–326, https://doi.org/10.1016/j.aei.2016.04.001.
[39] L. Ding, K. Li, Y. Zhou, P.E. Love, An IFC-inspection process model for infrastructure projects: enabling real-time quality monitoring and control, Autom. Constr. 84 (2017) 96–110, https://doi.org/10.1016/j.autcon.2017.08.029.
[40] F. Dai, M. Lu, Assessing the accuracy of applying photogrammetry to take geometric measurements on building products, J. Constr. Eng. Manage. 136 (2) (2010) 242–250, https://doi.org/10.1061/(ASCE)CO.1943-7862.0000114.
[41] S. Makris, P. Tsarouchi, D. Surdilovic, J. Kruger, Intuitive dual arm robot programming for assembly operations, CIRP Ann. Manuf. Technol. 63 (1) (2014) 13–16, https://doi.org/10.1016/j.cirp.2014.03.017.
[42] Z. Yuan, C. Sun, Y. Wang, Design for manufacture and assembly-oriented parametric design of prefabricated buildings, Autom. Constr. 88 (2018) 13–22, https://doi.org/10.1016/j.autcon.2017.12.021.

