
Automation in Construction 107 (2019) 102912

Contents lists available at ScienceDirect

Automation in Construction
journal homepage: www.elsevier.com/locate/autcon

Mobile robot for marking free access floors at construction sites


Takehiro Tsuruta, Kazuyuki Miura, Mikita Miyaguchi

Research and Development Institute, Takenaka Corporation, Japan

ARTICLE INFO

Keywords: Marking; Mobile robot; Omnidirectional vehicle; Tracking; Free access floor; Construction site

ABSTRACT

Improving the productivity of construction work is an urgent task, because the shortage of construction workers in Japan has become a major social issue. Marking work for building materials, which is currently performed manually, is an indispensable part of the construction process. Its automation would allow construction workers to focus on the installation step and thus significantly boost their productivity. In this study, an automated mobile robotic system for marking free access floors has been developed. Building such floors requires drawing grid-pattern lines whose intersections (representing the positions of future pedestal bases) can be automatically marked by the proposed system, which consists of a mobile robot with a marking device at its center and a laser positioning unit. Cross marks are drawn on the floor by controlling the marking device, while the laser positioning unit accurately moves a line laser along a designated direction and measures the distance to a projected object. The marking robot tracks the line laser and arrives at a designated point. The deviation between the designated position and the robot center is calculated, and the cross mark is drawn exactly at the former location. The marking performance of the developed robotic system has been evaluated by conducting an experiment inside a 6 m × 6.5 m area at a construction site. It is found that the average deviation between the marked and designated positions is 2.3 mm, while 77% of all deviations are smaller than 3 mm.

Corresponding author.
E-mail addresses: tsuruta.takehiro@takenaka.co.jp (T. Tsuruta), miura.kazuyuki@takenaka.co.jp (K. Miura), miyaguchi.mikita@takenaka.co.jp (M. Miyaguchi).
https://doi.org/10.1016/j.autcon.2019.102912
Received 13 January 2019; Received in revised form 18 June 2019; Accepted 18 July 2019
Available online 01 August 2019
0926-5805/ © 2019 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/BY-NC-ND/4.0/).

1. Introduction

The shortage of construction workers in Japan has become a major social issue (their number in 2015 was approximately 73% of the peak number recorded in 1997). Therefore, improving the productivity of construction work is a pressing issue. Marking work is indispensable in the construction process: before installing building components (such as dry walls, free access floors, and anchor bolts), construction workers must first draw marks and lines on the floors, walls, ceilings, and other places at the construction site, which indicate the future positions of these components. During the construction of large buildings, more than a thousand such marks and lines must be drawn manually, which calls for high efficiency and a high level of automation.

The progress achieved by surveying technology in recent years is remarkable, and measuring instruments such as the laser scanner and the total station are widely used at construction sites. Prior studies on improving the efficiency of marking work with these instruments can be classified into support technologies aimed at increasing the effectiveness of the manual marking process [1–3] and automated marking robots [4–11]. If the marking work is fully automated using such robots, construction workers will be able to concentrate on installing building parts, thus increasing their productivity.

In this study, a fully automated mobile robotic system for marking free access floors is proposed. To build such floors correctly, grid-pattern lines should be drawn on the floor surface before installation. The intersections of these lines, automatically drawn by the developed marking robot (guided to target positions by a laser marker), denote the positions for installing the floor pedestal bases. A three-dimensional (3D) measuring instrument determines the robot position accurately, which is subsequently used to draw cross marks. After that, the robot moves to the next target position and repeats the entire operation. Because the marking process is fully automated, providing all coordinates of the target positions to this system is sufficient for its successful completion.

2. Related literature studies

2.1. Robot navigation

When an automated construction robot performs a task at a construction site, it has to move to its working location. After completion, it either returns or moves to the next job. Therefore, proper navigation of the construction robot within the working area is extremely important. Odometry (dead reckoning) is one of the best approaches to achieve this goal in terms of practicality, price, and performance. However, this

technique is susceptible to errors due to wheel slippage, tire wear, and poor floor/surface quality. For these reasons, odometry is an unacceptable method of navigation through construction sites [12]. Beliveau et al. [13] reported autonomous vehicle navigation via real-time 3D laser-based positioning, in which laser transmitters were placed at known locations and their orientations were determined using an optical receiver installed on the vehicle. The average navigation error was about 10 cm, and the measurement accuracy was substantially increased by using more than two transmitters. Fukase et al. [14,15] performed automatic guided vehicle navigation using a dotted pattern on the floor. Although this method could determine the self-position with a 1-mm accuracy, it was not suitable for construction sites because a special floor with random dot patterns had to be constructed in advance. Simultaneous localization and mapping (SLAM), utilized for constructing and/or updating the map of the environment while tracking the robot movement [16–19], is one of the most commonly used localization methods. By applying the SLAM and adaptive Monte Carlo localization (AMCL) [20] algorithms, mobile robots are able to localize themselves on a rough scale and navigate in the environment. However, the accuracy of self-localization in this technique is no better than several centimeters [21], which is far from the precision required for layout marking at construction sites. Zhang et al. [22] developed a construction robot that was capable of switching the vision control from the SLAM and AMCL when approaching the working location for higher accuracy.

Fig. 1. Outline of the marking flow using the marking robot system.

2.2. Mobile marking robot system

Several studies on automated mobile marking robots have been performed previously. The system constructed by Tanaka et al. was able to autonomously move to a target point and draw specified figures on a ceiling board [4]. Its accuracy was 10 mm, and the time required for drawing a single point was about 8 min. The robot had to detect the edges of three pillars at a construction site by using a laser range finder (LRF); if the distance from the robot to a pillar was too large or the pillar was covered with a fire-resistant material, its marking accuracy was low. Jensfelt et al. [5] developed a mobile robotic system for automatically marking the positions of stands for a trade fair or exhibition. Their system used the information obtained from the existing computer-aided design (CAD) model of the environment and a two-dimensional laser scanner to localize the robot, and it was able to mark 492 points in an area of 10,150 m2. The total time required for performing this task was about 4.5 h. The average absolute error was 28 mm, and the system performance in terms of the marking area and speed was excellent. However, the accuracy of this system was inadequate for some types of layout marking at construction sites. Abidin et al. [6] also developed an autonomous mobile robotic system to perform floor marking for a trade fair or exhibition, which was based on the image captured by a camera mounted above the area to be marked. The marking accuracy of this system was about 10 mm. However, its camera had to be installed on the ceiling and calibrated depending on the installation position, which was very laborious and costly. Moreover, the marking area in that case was limited by the visual field of the camera. Inoue et al. reported a system using a total station and an LRF to determine the robot position [7–9]. Its marking accuracy was within about 2 mm, and the time required for marking after the robot reached the target area was about 2 min. In this system, the attitude angle of the mobile robot could not be measured during movement, and no navigation method for moving from one mark point to another has been reported. The "Laybot" developed by DPR Construction [10] was capable of drawing lines on slabs for drywall layouts at a speed of 300 to 400 ft per hour while running; its marking accuracy was within an eighth of an inch. Kitahara et al. reported a robotic system that could draw line segments on the floors and walls of construction sites with an accuracy of 1 mm or smaller [11]. However, their system was not automated and required an operator to move the robot to a marking position using a remote controller. The previous studies mentioned above [7–11] need a total station that can collimate and track a prism. However, such an instrument is generally very expensive and incurs high system costs. Hence, the development of a new robotic marking system without expensive instruments is desirable for expanding its application scope.

3. Composition of the marking robotic system

3.1. General outline

In the proposed system, composed of a laser positioning unit (LPU) and a marking robot (MR), the target marking positions are assumed to be set in advance on the floor plan (the general outline is shown in Fig. 1). The marking flow is divided into the MR guidance phase and the cross-mark drawing phase. In the former phase, the MR moves roughly to the vicinity of the target position, while in the latter phase, the MR position is determined with high accuracy and a cross-mark is drawn on the floor. These two steps are repeated automatically.

In the MR guidance phase, the MR travels from the current position A(ra, θa) to the next designated position B(rd, θd) set in advance. First, the LPU directs the measuring laser parallel to the floor toward position B. Second, the LPU turns the line laser to angle θd. The MR base includes several cameras and screens. The cameras capture the screen images, onto which the laser spot and line are projected. The MR tracks the moving laser line and travels along the circumference of the circle having the LPU at its center while maintaining the screen orientation perpendicular to the direction of the line laser. When the angle of the line laser reaches θd, the LPU halts the line laser movement. Subsequently, the MR moves backward or forward along the straight line connecting the LPU and position B while maintaining the screen orientation perpendicular to the line laser. When the difference between the measured distance and the designated distance rd becomes smaller than a threshold, the MR stops. The MR arrival position does not exactly correspond to the designated position, and the position of the MR center is calculated by combining the LPU measurements with the screen image.

In the cross-mark drawing phase, the MR draws a cross mark exactly at the designated position using a special marking device that can move the marking pen in the x, y, and vertical directions. The movement value of this pen is calculated by comparing position B with the current position of the MR center. By controlling the marking device based on the calculated movement value, a cross mark with its center corresponding to position B is drawn on the floor.
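Since the guidance phase works in polar coordinates about the LPU, each floor-plan target must be converted to a distance and a line-laser angle before guidance starts. A minimal sketch in Python; the function name, the units, and the angle convention (θd measured counterclockwise from the x-axis) are illustrative assumptions, not details taken from the paper:

```python
import math

def to_lpu_polar(target_xy, lpu_xy=(0.0, 0.0)):
    """Convert a floor-plan target position (x, y) into polar coordinates
    (r_d, theta_d) about the LPU: the designated distance along the
    measuring laser and the designated angle of the line laser."""
    dx = target_xy[0] - lpu_xy[0]
    dy = target_xy[1] - lpu_xy[1]
    r_d = math.hypot(dx, dy)          # straight-line distance LPU -> target
    theta_d = math.atan2(dy, dx)      # line-laser angle in radians
    return r_d, theta_d

# A target 3 m east and 4 m north of the LPU:
r_d, theta_d = to_lpu_polar((3.0, 4.0))
# r_d = 5.0 m, theta_d = atan2(4, 3)
```

In a full system this conversion would be applied to every designated intersection of the grid before the guidance/drawing cycle begins.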


Fig. 2. A kinematic model of the omnidirectional vehicle.

3.2. Marking robot

In the developed system, an omnidirectional vehicle (ODV) is used as the movement vehicle. It has three omnidirectional wheels mounted symmetrically at angles of 120 degrees. Each wheel is driven by a DC motor and located at the same distance from the robot center. Unlike non-holonomic robots, this type of vehicle has full mobility in the floor plane, meaning that at each instant it can move in any direction without any reorientation [23]. The rotational speed of each wheel can be controlled independently by sending commands to the motor driver. The kinematic model of the MR is shown in Fig. 2. For convenience of description, three coordinate systems are used in this work: ΣW, the world coordinate system; ΣR, the robot coordinate system; and ΣL, the laser coordinate system, in which the Yl axis corresponds to the direction of the line laser and the origins Ow and Ol are identical. Here, ωi (i = 1, 2, 3) denotes the rotational velocities of the wheels (i represents the wheel number), Rw is the radius of the wheels, and L is the distance from the robot center to each wheel. The parameters xl, yl, and θl represent the position and attitude angle in ΣL. The inverse kinematic model in ΣL is described by Eq. (1) [24], where the dots over xl, yl, and θl indicate derivatives of these variables with respect to time. This equation is used to calculate ωi (i = 1, 2, 3) for the MR motion control, as described in Section 4.3.

[ω1, ω2, ω3]ᵀ = (1/Rw) [cos θl, sin θl, L; cos(θl + 2π/3), sin(θl + 2π/3), L; cos(θl + 4π/3), sin(θl + 4π/3), L] [ẋl, ẏl, θ̇l]ᵀ   (1)

A photograph of the MR is shown in Fig. 3. Two screens and cameras are placed on the metal frames installed on the ODV base. Screens 1 and 2 are oriented parallel to each other and perpendicular to the floor; the distance between them is ys. Cameras 1 and 2 face screens 1 and 2, respectively, and capture the corresponding images of the projected laser spot and line. The marking device is installed at the robot center; it consists of the XY movement stage and the pen unit, which can move the marking pen in the vertical direction. The position of the pen unit can be adjusted using the slider rails. By controlling the marking device, the cross mark is drawn on the floor.

3.3. Laser positioning unit

This unit is composed of a guide laser unit (GLU) and a 3D measuring instrument (3D-MI: Leica Geosystems 3D Disto). According to the technical specifications provided by Leica Geosystems, the accuracy of angle measurements is 5 arc-seconds, and the accuracy of the distance between two points 10 m apart is approximately 1 mm. Similar to the total station, the horizontal and vertical axes of the 3D-MI are motorized. In addition, it can direct the measuring laser accurately in a direction specified in advance and measure the distance to the object onto which the laser spot is projected. The GLU consists of a laser marker and a rotation stage. The laser marker placed on the rotation stage can generate a vertical laser line. Construction workers often use laser markers for drawing reference lines on the floor, walls, and ceiling. The vertical axis of the 3D-MI matches that of the rotation stage. This unit is also motorized, and its rotational speed is controlled through an RS485 interface. A photograph of the LPU is shown in Fig. 4.

3.4. System diagram

Fig. 5 summarizes the relationships between the different hardware modules of the MR and LPU components. The software used for their communication and integration is based on the Robot Operating System (ROS) [25]. Computer A of the MR conducts three main tasks. First, it performs image processing for detecting the laser position projected on the screen. Second, it sends commands to the motor drivers for the ODV and the XY movement stage. Third, it exchanges various parameters (such as the measured distance, the rotation angle of the 3D-MI, and the target position) with computer B, which controls the LPU, through a Wi-Fi connection. The ODV motor driver controls the motors of all omni-wheels, and the motor driver for the XY movement stage controls the motors of the X, Y, and Z axes. Computer B controls both the 3D-MI and the rotation stage; it sends commands such as those directing the measuring laser along a designated direction and measuring the distance, and it also sends commands specifying the rotation angle and speed of the rotation stage.

4. Controlling the MR and the LPU components

4.1. Measuring the MR position

The MR position is calculated from the positions of the laser line and laser spot projected onto the screens and the distance from the LPU. As shown in Fig. 6, the attitude angle θl in ΣL is computed using Eq. (2), in which the parameters d1 and d2 are the x-coordinates of the laser line projected on each screen in ΣR:

tan θl = (d2 − d1) / ys   (2)

In the cross-mark drawing phase, a laser spot is projected on screen 1. The variables Δxr and Δyr, which represent the movement values of the pen unit in the x- and y-axis directions in ΣR, are calculated via Eqs. (3) and (4), respectively, using the geometric relationship depicted in Fig. 7:

Δxr = dspot + (rscreen − rd) sin θs   (3)

Δyr = (rscreen − rd) cos θs   (4)

Here, dspot is the x-coordinate of the laser spot in ΣR, rscreen is the distance from the LPU to screen 1, and θs is the attitude angle of the MR with respect to the direction of the measuring laser. The laser spot does not necessarily overlap with the laser line because of the rotation stage control error. In the proposed system, a servomotor is used for the LPU rotation stage; according to its technical specifications, the backlash is smaller than 0.5 degrees. Therefore, θl is used instead of θs in Eqs. (3) and (4). The MR is programmed to stop when rscreen − rd < 30 mm. Thus, the maximum error of Δxr and Δyr due to the use of θl instead of θs, estimated at θs = 0 and 90 degrees, is 0.26 mm. By controlling the marking device based on the values calculated by Eqs. (3) and (4), a cross mark is drawn at the designated position with high accuracy.
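The pen-movement computation of Eqs. (2)–(4) can be sketched as follows. The screen spacing ys and the numeric inputs are illustrative values only (the paper does not state them), and θl is substituted for θs exactly as described above:

```python
import math

YS = 400.0  # screen spacing ys in mm -- illustrative value, not from the paper

def attitude_angle(d1, d2, ys=YS):
    """Eq. (2): attitude angle theta_l from the x-coordinates of the
    laser line projected on screens 1 and 2 (all lengths in mm)."""
    return math.atan2(d2 - d1, ys)

def pen_movement(d_spot, r_screen, r_d, theta):
    """Eqs. (3) and (4): pen-unit movement (dx_r, dy_r) in the robot
    frame, with theta_l passed in place of theta_s as the paper does."""
    dx_r = d_spot + (r_screen - r_d) * math.sin(theta)
    dy_r = (r_screen - r_d) * math.cos(theta)
    return dx_r, dy_r

# Illustrative measurement: a small attitude error and a 20 mm stop offset.
theta = attitude_angle(d1=-2.0, d2=2.0)
dx_r, dy_r = pen_movement(d_spot=1.5, r_screen=5020.0, r_d=5000.0, theta=theta)
```

Because the MR stops within 30 mm of rd, the (rscreen − rd) terms stay small, which is why the 0.5-degree backlash contributes at most 0.26 mm of error.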


Fig. 3. Photographs of the marking robot.

Fig. 6. Screen layout and laser line projected on the screen.


Fig. 4. A photograph of the laser-positioning unit.

Fig. 5. A diagram describing different modules of the LPU and the MR component.


Fig. 7. Screen layout and laser spot projected on the screen.

4.2. Detection of the laser line and laser spot

It is difficult to determine the laser position accurately from a captured image without image preprocessing, because the captured image has two types of distortion that must be removed: lens distortion and perspective distortion. The former can be eliminated by using Zhang's method [26]; Fig. 8a and b show the screen images captured before and after the lens distortion removal. The perspective distortion is removed by the homographic transformation based on the following formulas:

ui,j = (a + b·si,j + c·ti,j) / (1 + g·si,j + h·ti,j)
vi,j = (d + e·si,j + f·ti,j) / (1 + g·si,j + h·ti,j)   (5)

where ui,j and vi,j are the camera coordinates after the transformation, and si,j and ti,j are the camera coordinates before the transformation (i is the screen number and j is the index of the known point).

The eight unknown parameters a, b, c, d, e, f, g, and h are obtained by solving Eq. (5) after substituting at least four known point pairs (ui,j, vi,j) and (si,j, ti,j). In the proposed system, a calibration mark is drawn on the screen: four black squares are arranged at the four corners of a rectangle with width W and height H. Fig. 8b and c show the screen images obtained before and after the homographic transformation with the four known points (ui,j, vi,j) and (si,j, ti,j). The positions of the laser line and laser spot are determined from this distortion-free image. First, the red, green, and blue channels are extracted separately. In each channel image, the average value of the pixel intensities of each column is calculated, as shown in Fig. 9a, b, and c. Because the proposed system uses a green line laser, the degree of greenness can be calculated by subtracting the average value of the red and blue channel components from the green channel component, as shown in Fig. 9d. The x-coordinate of the laser line dj in ΣR is calculated via the following equation:

dj = (umax,j / NH − 1/2) · W   (6)

where umax,j (j is the screen number) is the horizontal camera coordinate with the largest pixel intensity, NH is the number of horizontal pixels, and W is the screen width (see Fig. 9c).

The laser spot is red (see Fig. 10a). A binarized image is obtained from the red channel image using a binarization threshold, whose value is set to 250 in this study. The noise level in the binarized image is reduced by applying erosion processing three times with a 1 × 3 kernel. The parameters of the image processing are empirically determined and used in all experiments performed in this work. As shown in Fig. 10c, the laser line is removed after image processing, leaving only the laser spot intact. From this image, the area centroid pixel of the detected spot is determined. After that, as shown in Fig. 10d, a masked image is obtained by setting the pixel intensity to zero except for the square region of 50 × 50 pixels centered on the area centroid pixel in the red channel image. The horizontal camera coordinate of the laser spot, umax,spot, is calculated as the brightness center of gravity in this masked image. The x-coordinate of the laser spot in ΣR, dspot, is also determined by Eq. (6). The image processing procedure described above is implemented with OpenCV and conducted by the computer installed on the MR.

Fig. 8. Images obtained before and after distortion removal: (a) captured image; (b) image after lens distortion removal; (c) image after homographic transformation.
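The column-wise greenness computation and the conversion of Eq. (6) can be sketched as follows. This is a simplified stand-in for the OpenCV pipeline, operating on an already distortion-free image; the synthetic test image is purely illustrative, and only W = 270 mm is taken from the paper (Section 4.3.2):

```python
import numpy as np

W_SCREEN = 270.0  # screen width W in mm (value stated in Section 4.3.2)

def laser_line_x(img_rgb, w_screen=W_SCREEN):
    """Locate the green laser line on a distortion-free screen image and
    convert it to the x-coordinate d_j in the robot frame via Eq. (6).
    img_rgb: array of shape (rows, NH, 3) holding R, G, B intensities."""
    col_mean = img_rgb.mean(axis=0)                 # per-column mean of each channel
    greenness = col_mean[:, 1] - (col_mean[:, 0] + col_mean[:, 2]) / 2.0
    u_max = int(np.argmax(greenness))               # column with the strongest green
    n_h = img_rgb.shape[1]                          # number of horizontal pixels N_H
    return (u_max / n_h - 0.5) * w_screen           # Eq. (6)

# Synthetic test image: a green vertical line at column 600 of 800.
img = np.zeros((100, 800, 3))
img[:, 600, 1] = 255.0
d_j = laser_line_x(img)
```

Averaging over whole columns before taking the maximum is what makes the detection robust to isolated bright pixels, mirroring the column-mean step shown in Fig. 9.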


Fig. 9. Average pixel intensities.

calculation cycle depends on the computer performance; in the proposed system, it is approximately 0.17 s.

Fig. 10. Laser spot images obtained before and after image processing.

4.3. Motion control

4.3.1. Control system

In the proposed system, the MR tracks the line laser. The block diagram of the tracking process is shown in Fig. 11, and the meaning of each symbol is explained in Table 1. This proportional-integral (PI) control system feedback-controls the rotational speed based on the tracking errors (ex, eθ). The condition (xl,r, θl,r) = (0, 0) indicates that the laser line is located at the center of the screen and oriented perpendicularly to it. By converging the tracking errors to zero, the MR tracks the laser line while keeping the screen orientation perpendicular to the line laser. The magnitudes of ex and eθ are identical to the values of dj and θl calculated by Eqs. (6) and (2), respectively. The values of ẋl,d and θ̇l,d are obtained from the PI control system, and the direction value of the rotational speed of each wheel is determined by Eq. (1).

Fig. 11. Schematic of the control system.

Table 1. Description of the parameters used in Fig. 11.

(xl, yl, θl): the coordinates and attitude angle of the MR in ΣL
(xl,r, θl,r): the reference x-coordinate and attitude angle of the MR in ΣL
(ex, eθ): the deviations between the reference and measured values
(ẋl,d, ẏl,d, θ̇l,d): the direction values of the translational and rotational speed of the MR in ΣL
(ω1, ω2, ω3): the direction values of the rotational speed of each wheel
(ẋl, ẏl, θ̇l): the translational and rotational speed of the MR in ΣL
s: differential operator

Table 2. Values of some parameters used in Fig. 11.

xl,r: 0
θl,r: 0
ẏl,d: 0 or 50 (a)
Kp,x, Kp,θ, KI,x, KI,θ: 20, 0.244, 4, 0

(a) The values of 0 and 50 are used for traveling along the circular and straight paths, respectively.

4.3.2. Traveling along the circular path

As explained in Section 3.1, when the line laser turns, the MR travels along the circular path while maintaining the screen orientation perpendicular to the line laser. In the experimental setup depicted in Fig. 12, the tracking performance on this path is evaluated by changing the rotational speed of the rotation stage between 0.3, 0.6, and 0.9 degrees per second. The reference x-coordinate in ΣL, the attitude, the translational velocity in the y-axis direction in ΣL, and the proportional and integral gains are set to the values listed in Table 2. Because ẏl,d was set to zero, the MR tracked the moving line laser while keeping the distance between the MR and the LPU constant. The proportional and integral gains were preliminarily calculated by simple simulation modeling of the control system with the Scicos software [26]; the final parameters listed in Table 2 were determined by trial-and-error adjustment in the experiment described in Fig. 12 (the obtained tracking results are shown in Fig. 13).

The time dependences of xl and θl vary within 130 mm and 4.0 degrees, respectively, at a rotation speed of 0.9 deg/s. Because W = 270 mm in the proposed system, the laser line did not deviate from the screen. The translational speed of the laser line projected on the screen increases as the distance between the MR and the LPU increases. Therefore, the rotational speed of the rotation stage can be varied based on the distance from the LPU to the MR to keep the laser line within the screen boundaries (its magnitude is inversely proportional to this distance).

Fig. 12. Experimental setup for traveling along the circular path.

4.3.3. Traveling along the straight line

When the line laser stops turning, the MR travels along the straight line connecting the LPU and the next designated position while keeping the screen orientation perpendicular to the line laser. In the experimental setup depicted in Fig. 14, the tracking performance during such a travel is evaluated (its setting parameters are listed in Table 2). Except for ẏl,d, the values of these parameters are identical to those used for the travel along the circular path. Here, the MR travel speed is 50 mm/s. Fig. 15 shows the obtained dependence of xl and θl on yl. It was found that the MR stably traveled along the straight line up to a yl value of 26,896 mm by using the described motion control method. When yl exceeded that magnitude, the position of the line laser could not be detected by the method described in Section 4.2 because the MR was located too far from the LPU. The absolute values of xl and θl were < 25 mm and 5 degrees, and their standard deviations were 2.0 mm and 0.5 degrees, respectively.

5. Experiment

5.1. Laboratory studies

In order to evaluate the performance of the developed marking robotic system and to integrate the elements described in Sections 4.1, 4.2 and 4.3, a basic test was conducted in our laboratory parking area (its layout is shown in Fig. 16). Two benchmarks were placed at a distance of 4 m from each other, and one of them was selected as the origin. The line connecting these two benchmarks was defined as the x-axis. In the 4 m × 10 m test area, 50 positions corresponding to grid intersections with intervals of 1 m were designated for marking. The MR was able to move both along circular and straight paths between two adjacent positions. When the MR reached a designated position, it drew a cross-mark on the floor, after which the entire cycle was repeated. The MR marked from position A0 to position J0 in ascending order. The moving path from position A0 to position B0 is shown as an example in Fig. 16 (here, the red and green lines indicate straight and circular paths, respectively). After marking all positions, the coordinates of the centers of all cross marks were measured by using the 3D-MI.

Fig. 17 shows the frequency distributions of xl and θl in the MR guidance phase, obtained by measuring the xl and θl values every 0.17 s. The resulting numbers are grouped near zero. Their average values were 0.9 mm and 0.03 degrees, and the corresponding standard deviations were 26.8 mm and 1.2 degrees, respectively. These results show that the MR stably tracked the line laser while controlling its orientation so that the laser line was focused on the screen center perpendicularly to its surface.

Fig. 18a and b show the frequency distributions of Δx and Δy, which represent the deviations in the x- and y-coordinates between the designated and marked positions, respectively. The results of the marking accuracy measurements are listed in Table 3.


Fig. 13. Tracking performance (time dependences of xl and θl) evaluated during traveling along the circular path at rotation-stage speeds of 0.3, 0.6, and 0.9 deg/s.

6. Discussion

6.1. Marking accuracy

The experimental data obtained at the construction site revealed


that
Fig. 14. Experimental setup for traveling along the straight path.
1. Deviation larger than 10 mm was observed, and
2. The average deviation in the x-coordinate was larger than the value
maximum deviations were 1.5 mm and 4.2 mm, respectively, while 94% of the measured values had deviations < 3 mm. No significant correlation between the distance from the LPU to the MR and the deviation was observed. The relatively large deviations are likely caused by the local unevenness of the floor surface. It took 101 min to mark all 50 designated points, and the average time required for marking one point was 122 s. Thus, we confirmed that accurate and automatic marking was achieved by the proposed system.

5.2. Testing at a construction site

After the proposed marking system demonstrated satisfactory performance in the laboratory test, it was tested at a construction site (the experimental layout used is shown in Fig. 19). The setting of the coordinate system is similar to that described in the previous section. Inside the 6 m × 6.5 m test area, 182 positions representing grid intersections with intervals of 0.5 m were designated for marking. These intersections are the positions at which construction workers install the pedestal bases of free access floors. The marks were labeled from position A0 to position N0 in ascending order. Fig. 20a and b show the frequency distributions of Δx and Δy, respectively, and the results of the marking accuracy measurements and the time required for marking are listed in Table 4. The average and maximum deviations were 2.3 mm and 10.2 mm, respectively, and 77% of the measured values had deviations < 3 mm. It took 5 h to mark all 182 designated points, and the average time required for marking one point was 98 s, which was smaller than the value obtained in the previous experiment due to the smaller distance between the adjacent designated positions.

… measured in the laboratory.

The first problem was caused by the detection failure of the line laser. As shown in Fig. 19, the construction site contained many windows. The sunlight passing through these windows interfered with the detection of the line laser position, which led to miscalculations of the robot position and orientation.

The second problem was mainly caused by the alignment error of the marking pen, which should be located at the robot center. Further accuracy improvement can be achieved by correcting the pen movement distance based on this alignment error. Another reason for the deviation is the local unevenness of the floor at the construction site. At this construction site, there was some local unevenness that could be confirmed visually. At such positions, the deviations were larger than at other positions. Previous studies reported that the inclination of the robot caused by the floor slope or unevenness affected the marking accuracy, and that the marking error was proportional to the height of a measuring laser [7]. Hence, this error can be reduced by lowering the height of the screen.

6.2. Marking time

The average time required for marking one point was shorter than that obtained in previous work [9]. According to the information provided by construction workers, two people can mark a free access floor at a speed of around 1000 points per hour, which is 27 times greater than that of the proposed system. Note that marking free access floors is relatively easy because pedestals are installed in a regular grid pattern.
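The 27-fold difference quoted above is a direct unit conversion between the robot's average cycle of 98 s per point and the workers' reported rate of about 1000 points per hour. A minimal sketch of the arithmetic (the function name is ours, purely for illustration):

```python
def points_per_hour(seconds_per_point):
    """Convert a per-point marking time into an hourly marking rate."""
    return 3600.0 / seconds_per_point

robot_rate = points_per_hour(98)       # proposed system: 98 s per point
worker_rate = 1000.0                   # two workers, as reported above
ratio = worker_rate / robot_rate       # roughly the 27x quoted in the text
```

At 98 s per point, the robot marks about 37 points per hour, so the manual rate is indeed about 27 times higher.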

Fig. 15. Tracking performance evaluated during traveling along the straight-line path: (a) relationship between xl and yl (σ = 2.0 mm); (b) relationship between θl and yl (σ = 0.5 deg), with yl ranging from 0 to 30,000 mm.

T. Tsuruta, et al. Automation in Construction 107 (2019) 102912

Fig. 16. Experimental layout used during laboratory testing.

Fig. 17. The performance characteristics of the line laser tracking process: (a) frequency distribution of xl (average 0.9 mm, σ = 26.8 mm); (b) frequency distribution of θl (average 0.03 deg, σ = 1.2 deg).

Table 3
Results of the marking accuracy measurements conducted during laboratory testing.

Parameter                            Δx     Δy     √(Δx² + Δy²)
Average deviation (mm)               0.7    −0.6   1.5
Standard deviation (mm)              1.1    0.9    0.9
Maximum deviation (mm)               3.9    −2.6   4.2
Fraction of deviations < 3 mm (%)    96     100    94

Although it is difficult to achieve the speed of the construction workers, our goal is to be able to mark 100 points per hour. In the case of dry partition walls, the marking pattern is more complicated than that used for free access floors. Therefore, we expect that marking such walls with the automated robot system would be more effective.

6.3. Comparison to the results of previous studies

The accuracy, the time required to mark a single position, and other parameters of the proposed system are compared with those of the previously reported marking systems [5,7–11] in Table 5. These systems can be roughly divided into two types. The first type includes the systems using only inner sensors (such as the laser range finder), and the other type utilizes external measuring instruments (such as 3D ones). The system described in Ref. [5] belongs to the first category. Although it moves and draws extremely fast, its accuracy is > 10 times worse than those of the other systems because its localization procedure mainly utilizes inner sensors, which are not suitable for marking at construction sites. In the second category, only the system developed in this study uses a reflectorless 3D measuring instrument. Although this instrument cannot track and collimate a prism target, it is less expensive than an instrument that can. In the system reported in Ref. [11], an operator has to move the robot to the marking points using a remote controller. The time required for marking a single position reported in Refs. [7–9] is 2 min (excluding the time of moving from the former position to the next one). Meanwhile, the system developed in this work is fully automated and can mark a single position in 98 s including the time for moving to the next position, while the size and weight of the proposed system are smaller than those of the systems

Fig. 18. Frequency distributions of Δx and Δy obtained during laboratory testing: (a) Δx (average 0.7 mm); (b) Δy (average −0.6 mm).
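The quantities summarized in Table 3 (and the distributions of Fig. 18) reduce to a short computation over the per-point deviations Δx and Δy. The sketch below shows this computation on made-up illustrative values, not the measured data:

```python
import math

def accuracy_stats(dx, dy, threshold_mm=3.0):
    """Summarize marking deviations: mean and maximum radial deviation
    sqrt(dx^2 + dy^2), and the fraction of points below a threshold."""
    r = [math.hypot(x, y) for x, y in zip(dx, dy)]
    mean_r = sum(r) / len(r)
    max_r = max(r)
    frac_below = sum(1 for v in r if v < threshold_mm) / len(r)
    return mean_r, max_r, frac_below

# Illustrative per-point deviations in millimetres (not the paper's data).
dx = [0.7, -1.2, 2.0, 0.3]
dy = [-0.6, 0.8, -2.5, 0.1]
mean_r, max_r, frac = accuracy_stats(dx, dy)
```

Applying the same reduction to the 50 laboratory points or the 182 construction-site points yields the averages, maxima, and "fraction < 3 mm" entries reported in the tables.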


Fig. 19. Experimental setup used during testing at the construction site.
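The 182 designated positions in the Fig. 19 layout follow from the grid geometry of Section 5.2: a 0.5 m pitch over the 6 m × 6.5 m area gives 13 × 14 intersections, endpoints included. A minimal sketch of enumerating such a grid (the function name and the origin at one corner of the area are our own assumptions, not part of the reported system):

```python
def grid_points(width_m, depth_m, pitch_m):
    """Enumerate grid intersections (candidate pedestal-base positions)
    over a rectangular area, with both endpoints included."""
    nx = int(round(width_m / pitch_m)) + 1   # intersections along x
    ny = int(round(depth_m / pitch_m)) + 1   # intersections along y
    return [(i * pitch_m, j * pitch_m) for j in range(ny) for i in range(nx)]

# 6 m x 6.5 m test area with a 0.5 m pitch: 13 x 14 = 182 points.
points = grid_points(6.0, 6.5, 0.5)
```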

described previously. Thus, except for the accuracy, the developed robot is superior to the other reported systems.

Table 4
Results of the marking accuracy measurements conducted at the construction site.

Parameter                            Δx     Δy     √(Δx² + Δy²)
Average deviation (mm)               1.7    0      2.3
Standard deviation (mm)              1.9    1.4    1.6
Maximum deviation (mm)               9.5    4.5    10.2
Fraction of deviations < 3 mm (%)    87     97     77

7. Conclusions and future research

From the results obtained in this work, the following conclusions can be drawn.

1. Automated marking of the 6 m × 6.5 m area of the free access floor at a construction site was successfully performed.
2. The average deviation between the marks drawn by the MR and the designated positions was 2.3 mm, and 77% of all deviations were below 3 mm.
3. The average marking time was 98 s, which was shorter than that of the previously reported system using a 3D measuring instrument.

However, there are two major limitations in this study that should be addressed in future research. When the proposed marking system was tested outside the 6 m × 6.5 m area, the MR guidance procedure failed due to sunlight interference (see Section 6.1). At the same time, the laser spot of the measuring laser could still be detected. These results suggest that the MR guidance method using the line laser is significantly affected by sunlight, which represents a major problem for automated marking during daytime. To solve this problem, a new MR guidance method without a line laser must be developed.

When the MR of the proposed system was occluded by an obstacle (such as a column or a wall), its position could not be determined by the 3D-MI unit. Thus, the developed system can be used effectively only in non-occluded areas, which is also a limitation of the systems described in previous works [7–11]. A possible solution to this problem is the simultaneous operation of multiple systems. Because the proposed robotic marking system contains an inexpensive reflectorless 3D measuring instrument, its operation cost may be significantly lower than those of the other systems.

The experiment conducted at the construction site revealed the

Fig. 20. Frequency distributions of Δx and Δy obtained during the test at the construction site: (a) Δx (average 1.7 mm); (b) Δy (average 0 mm).


Table 5
Performance parameters of the compared marking robotic systems.

Parameter                                      Proposed system      System reported in [5]   Systems reported in [7–9]   System reported in [11]
Accuracy                                       2.3 mm on average    28 mm on average         Less than 2 mm              Less than 1 mm
Time required for marking a single position    98 s^a               33 s                     More than 2 min^b           Unknown
Use of 3D measuring instrument                 Yes^c                No                       Yes^d                       Yes^d
Automated or not                               Automated            Automated                Automated                   Not automated
Size                                           500 × 500 × 400 mm   Unknown                  700 × 700 × 800 mm          1500 × 1000 × 400 mm
Weight                                         17 kg                Unknown                  Unknown                     56 kg

a: The distance between the adjacent marking positions is 50 cm.
b: Excluding the movement to the marking position.
c: A 3D measuring instrument that cannot collimate and track a prism.
d: A 3D measuring instrument that can collimate and track a prism.

limitations of the developed system. Currently, we are developing a new robotic system to address the first limitation. In the future, we want to use the robot guidance technology developed in this study not only for marking work but also for installing building materials.

References

[1] S. Sakamoto, H. Kishimoto, K. Tanaka, Y. Maeda, 3D measuring and marking system for building equipment: developing and evaluating prototypes, Proceedings of the 26th International Symposium on Automation and Robotics in Construction, Austin, USA, 2009, pp. 131–138, https://doi.org/10.22260/ISARC2009/0001.
[2] S. Sakamoto, N. Kano, T. Igarashi, H. Kishimoto, H. Fujii, Y. Oosawa, K. Minami, K. Ishida, Laser marking system based on 3D CAD model, Proceedings of the 28th International Symposium on Automation and Robotics in Construction, Seoul, Korea, 2011, pp. 64–69, https://doi.org/10.22260/ISARC2011/0009.
[3] S. Sakamoto, N. Kano, T. Igarashi, H. Tomita, Laser positioning system using RFID-tags, Proceedings of the 29th International Symposium on Automation and Robotics in Construction, Eindhoven, Netherlands, 2012, https://doi.org/10.22260/ISARC2012/0049.
[4] K. Tanaka, M. Kajitani, C. Kanamori, H. Itoh, Y. Abe, Y. Tanaka, Development of marking robot working at building sites, Proceedings of the 12th International Symposium on Automation and Robotics in Construction, Warsaw, Poland, 1995, pp. 235–242, https://doi.org/10.22260/ISARC1995/0030.
[5] P. Jensfelt, G. Gullstrand, E. Förell, A mobile robot system for automatic floor marking, J. Field Rob. 23 (6–7) (2006) 441–459, https://doi.org/10.1002/rob.20125.
[6] Z.Z. Abidin, S.B.A. Hamid, A.A.A. Aziz, A.A. Malek, Development of a vision system for a floor marking mobile robot, Proceedings of the 5th International Conference on Computer Graphics, Imaging and Visualization, Penang, Malaysia, 2008, pp. 88–92, https://doi.org/10.1109/CGIV.2008.69.
[7] F. Inoue, S. Doi, E. Omoto, Development of high accuracy position making system applying mark robot in construction site, Society of Instrument and Control Engineers Annual Conference 2011, Tokyo, Japan, 2011, pp. 2413–2414. INSPEC Accession Number: 12354330. [Online]. Available: https://ieeexplore.ieee.org/document/6060381 (accessed 28 May 2019).
[8] F. Inoue, E. Omoto, Development of high accuracy position marking system in construction site applying automated mark robot, Society of Instrument and Control Engineers Annual Conference 2012, Akita, Japan, 2012, pp. 819–823. INSPEC Accession Number: 13056258. [Online]. Available: https://ieeexplore.ieee.org/document/6318554 (accessed 28 May 2019).
[9] E. Ohmoto, F. Inoue, S. Doi, Marking system applying automated robot for construction site (in Japanese), Reports of the Technical Research Institute, Obayashi-Gumi Ltd., No. 76, 2012, pp. 1–7. [Online]. Available: http://warp.da.ndl.go.jp/info:ndljp/pid/7860137/www.obayashi.co.jp/technology/shoho/076/2012_076_33.pdf (accessed 4 April 2019).
[10] DPR Construction, Laybot shows promise in speed and accuracy, 2013. [Online]. Available: https://www.dpr.com/media/news/2013 (accessed 22 February 2018).
[11] T. Kitahara, K. Satou, J. Onodera, Marking robot in cooperation with three-dimensional measuring instruments, Proceedings of the 35th International Symposium on Automation and Robotics in Construction, Berlin, Germany, 2018, pp. 292–299, https://doi.org/10.22260/ISARC2018/0042.
[12] M. Lindholm, S. Pieska, L. Koskela, 3-D measurements in construction robotization, Proceedings of the 7th International Symposium on Automation and Robotics in Construction, Bristol, United Kingdom, 1990, pp. 126–133, https://doi.org/10.22260/ISARC1990/0017.
[13] Y.J. Beliveau, J.E. Fithian, M.P. Deisenroth, Autonomous vehicle navigation with real-time 3D laser based positioning for construction, Autom. Constr. 5 (1996) 261–272, https://doi.org/10.1016/S0926-5805(96)00140-9.
[14] Y. Fukase, H. Kanamori, S. Kimura, Self-localization system for robots using random dot floor patterns, Proceedings of the 30th International Symposium on Automation and Robotics in Construction, Montréal, Canada, 2013, pp. 304–312, https://doi.org/10.22260/ISARC2013/0033.
[15] Y. Fukase, H. Kanamori, H. Taga, Application of self-location system using a floor of random dot pattern to an automatic guided vehicle, Proceedings of the 33rd International Symposium on Automation and Robotics in Construction, Auburn, USA, 2016, pp. 42–49, https://doi.org/10.22260/ISARC2016/0006.
[16] T. Bailey, H. Durrant-Whyte, Simultaneous localization and mapping (SLAM): part II, IEEE Robot. Autom. Mag. 13 (3) (2006) 108–117, https://doi.org/10.1109/MRA.2006.1678144.
[17] M.G. Dissanayake, P. Newman, S. Clark, H.F. Durrant-Whyte, M. Csorba, A solution to the simultaneous localization and map building (SLAM) problem, IEEE Trans. Robot. Autom. 17 (3) (2001) 229–241, https://doi.org/10.1109/70.938381.
[18] S. Thrun, Simultaneous localization and mapping, in: Robotics and Cognitive Approaches to Spatial Mapping, Springer, 2007, pp. 13–41, https://doi.org/10.1007/978-3-540-75388-9_3.
[19] D. Fox, S. Thrun, W. Burgard, F. Dellaert, Particle filters for mobile robot localization, in: Sequential Monte Carlo Methods in Practice, Springer, 2001, pp. 401–428, https://doi.org/10.1007/978-1-4757-3437-9_19.
[20] H. Peel, S. Luo, A.G. Cohn, R. Fuentes, Localisation of a mobile robot for bridge bearing inspection, Autom. Constr. 94 (2018) 244–256, https://doi.org/10.1016/j.autcon.2018.07.003.
[21] X. Zhang, M. Li, J.H. Lim, Y. Weng, Y.W.D. Tay, H. Pham, Q.-C. Pham, Large-scale 3D printing by a team of mobile robots, Autom. Constr. 95 (2018) 98–106, https://doi.org/10.1016/j.autcon.2018.08.004.
[22] G. Campion, G. Bastin, B.D. Andrea-Novel, Structural properties and classification of kinematic and dynamic models of wheeled mobile robots, IEEE Trans. Robot. Autom. 12 (1) (1996) 47–62, https://doi.org/10.1109/70.481750.
[23] H.-C. Huang, T.-F. Wu, C.-H. Yu, H.-S. Hsu, Intelligent fuzzy motion control of three-wheeled omnidirectional mobile robots for trajectory tracking and stabilization, Proceedings of the 2012 International Conference on Fuzzy Theory and Its Applications, Taichung, Taiwan, 2012, pp. 107–112, https://doi.org/10.1109/iFUZZY.2012.6409684.
[24] ROS.org homepage. [Online]. Available: http://wiki.ros.org/ja (accessed 1 April 2019).
[25] Z. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 22 (11) (2000) 1330–1334, https://doi.org/10.1109/34.888718.
[26] Scicos homepage. [Online]. Available: http://www.scicos.org/ (accessed 1 April 2019).