
Article

Remote Indoor Construction Progress Monitoring Using Extended Reality

Ahmed Khairadeen Ali 1,2, One Jae Lee 2, Doyeop Lee 1 and Chansik Park 1,*

1 School of Architecture and Building Sciences, Chung-Ang University, Seoul 06974, Republic of Korea
2 Haenglim Architecture and Engineering Company, Republic of Korea; o.lee@haenglim.com
* Correspondence: cpark@cau.ac.kr

Version February 6, 2021 submitted to Journal Not Specified

Abstract: Construction progress monitoring has undergone recent expansion through the adoption of vision and laser technologies. However, inspectors still need to visit the job site in person or wait for a time gap while data captured from the construction site are processed for inspection. Recent inspection methods lack automation and real-time data exchange; consequently, they require inspection manpower at each job site and entail health risks from physical interaction between workers and inspectors, loss of energy, data loss, and time consumption. To address these issues, a near real-time construction work inspection system called iVR is proposed; this system integrates 3D scanning, extended reality, and visual programming to visualize interactive onsite inspection of indoor activities and provide numeric data. iVR comprises five modules: iVR-location finder (finding the laser scanner location on the construction site), iVR-scan (capturing point cloud data of indoor job-site activity), iVR-prepare (processing and converting 3D scan data into a 3D model), iVR-inspect (conducting immersive virtual reality inspection in the construction office), and iVR-feedback (visualizing inspection feedback at the job site using augmented reality). An experimental lab test was conducted to verify the applicability of the iVR process; it successfully exchanged the required information between the construction job site and office in a timely manner. This system is expected to assist engineers and workers in quality assessment, progress assessment, and decision-making, realizing a productive and practical communication platform, unlike conventional monitoring, data capturing, processing, and storage methods, which involve storage, compatibility, and time-consumption issues.

Keywords: virtual reality; point cloud data; building information modeling; work inspection; progress monitoring; quality assurance inspection

1. Introduction

The construction industry has devoted significant effort toward sustainable construction by utilizing automation and vision technologies [1,2]. In particular, construction progress monitoring has been leveraged to increase accuracy and time efficiency [3,4]. This includes measuring progress through site assessments and comparisons with the project plan; in these methods, the quality of progress data depends completely on the expertise of the inspector and the quality of the measurements [5].

Generally, manual visual observations and conventional progress tracking can be used to provide feedback on progress measurements [3], productivity rates [5], tracking of
devices and materials [6], and safety planning [7]. Work inspection engineers typically adhere to guidelines provided by construction quality control institutes such as the International Organization for Standardization (ISO). One of the primary aims of inspection is to evaluate dimensional errors (dimension and position) and surface defects (cracks and spalling) in activities involving concrete columns and HVAC pipes [8]. However, conventional methods for construction work inspection are time-consuming, require inspection manpower [9], consume energy, and are economically inefficient and prone to errors [10]. Therefore, a more sustainable and automated approach to construction progress monitoring needs to be developed, one that operates at a lower cost with less manpower and creates a healthier construction environment by reducing the physical interaction between construction workers and inspectors.
In South Korea, 194,947 new construction projects were started in 2019, according to the Korean Statistical Information Service. A considerable share of these are small construction projects. Inspecting and continuously monitoring this large number of projects consumes substantial manpower, time, and data management effort, and the travel between sites causes pollution, energy and economic losses, and higher risk [1,9]. Therefore, remote monitoring of multiple activities at the same time using extended reality (XR) technologies might reduce the number of inspectors needed [11].
Building information modeling (BIM) technology, which is routinely used in the construction industry, is a potential solution for achieving automation in work inspection. BIM is a rich source of 3D geometry-related information, such as mechanical, electrical, plumbing, architectural, and structural data; BIM also facilitates knowledge-sharing across building life cycles and resolves interoperability issues [2,11]. The advancement of contemporary technologies in project delivery has led to the development of faster and more efficient digital models [11]. These modern technologies employ digital platforms on computers, mobile devices, sensors, broadband network connections, and application platforms [10], which facilitate the sustainable growth of activities and relationships between the management office and job site during the construction stage.
Although researchers are developing techniques that employ augmented reality (AR) for monitoring construction progress, existing practices rely on manual and time-consuming inspections to assess the progress at most construction sites. These approaches are prone to factual errors [12]. Moreover, even after implementing BIM software, the data are not updated in real-time between the construction office and the construction site. This can lead to a lack of the creativity anticipated to arise from digitization and the use of 3D technologies [3].
It is challenging for engineers at construction sites to manage complicated BIM models and recognize the necessary attributes contained in digital models. As BIM models are maintained on servers or separate computers, they cannot be synchronously modified at the job site. Thus, information transfer from the design office to the engineering office at the construction site can be delayed considerably [13]. This delay can be critical in some projects, such as fast-tracked projects where the design is developed simultaneously with the construction. Slower data exchange can result in the slowdown or reworking of the project [5]. Therefore, developing a method for near real-time monitoring of construction projects that bridges the gap between indoor job-site activities and the construction office represents an urgent need in the construction industry [13]. Indoor activities are conducted inside an enclosed space and are limited by the conditions of the place in which they are held, the number of workers participating in them, and the objects being constructed, among other factors.

This article proposes one platform that can host multiple monitoring tasks and transfer data between them smoothly. The platform comprises a closed loop of communication between construction job sites and construction offices during the inspection process. This study aims to minimize the gap between job sites and
construction offices by developing a system that can realize work inspection and quality assessment of data in near real-time for indoor activities. The developed system prototype can 1) capture point cloud data of constricted indoor activities, 2) convert the point cloud into a 3D model, 3) monitor progress in an office using virtual reality (VR), and 4) enable workers at the job site to visualize the inspector's report in AR. Moreover, two case studies of column and HVAC pipe quality assessment were successfully implemented to test the applicability of the iVR process, and the results of the system were registered. The scope of this research is limited to targeted indoor activities and excludes other monitoring aspects such as workers, surroundings, and temporary facilities.

2. Literature Review

To demonstrate the integration of 3D scanning and XR for onsite progress monitoring, this study considered various technical challenges of 3D laser scanning and XR, including interdependency, spatial site layout collisions, analyses and management, digital-to-physical linking, project control, and design visualization during production.
In construction processes, the term 'sustainability' (connoting 'efficiency') is considered a key aspect of integrating building automation systems into construction activities. These include progress monitoring to reduce the workforce, increase efficiency, and create a healthier and safer environment by minimizing physical interactions between inspectors and workers at the job site [1,4]. In more detail, point clouds are applied in different phases with various targets, including planning and design, manufacturing and construction, and operation and maintenance. In the planning and design stage, a point cloud is used to reconstruct 3D models of the site and existing buildings [14,15]. In the manufacturing and construction phase, a point cloud is applied for automating construction equipment [13], digital reproduction [4,16], identification of safety hazards [17], monitoring of construction progress [18], and inspection of construction quality [17]. In the operation and maintenance phase, a point cloud is applied for reconstructing 3D architectural structures [15], building performance analyses [19], facility management for 3D model reconstruction [12], and inspection of building quality [18,19].
As-built point clouds can be acquired using laser scanning or image-based photogrammetry methods. The generated point clouds are co-registered with the model by using an adapted iterative closest point (ICP) algorithm, and the as-planned model is transformed into a point cloud by simulating the points using the recognized positions of the laser scanner [44]. Recently, researchers have developed systems that can detect different component categories using supervised classification based on features obtained from the as-built point cloud, after which an object can be detected if the classification category is identical to that in the model [3,20,21].
Utilizing cameras as the acquisition tool results in lower geometric accuracy compared to that of laser scanning point clouds. A few recent studies have detected spatial differences between time-frame images [16,22], updated the project schedule using images [7], and compared as-built to as-planned BIM models. The Kinect 3D scanner employs technology similar to that used in a mid-range 3D scanner (i.e., a camera and an infrared camera to calculate the depth field of an object and its surroundings). The two "cameras" used in the Kinect enable it to scan nearly every chosen item with decent 3D accuracy. The depth camera has been employed for the surveillance and assessment of construction sites; tracking of materials, equipment, and labor [23]; safety monitoring [24]; and reconstruction [2,18]. Recently, advanced technologies such as robotics and drones have been proposed on construction sites to aid or replace workers; these need real-time remote monitoring technology that can be integrated into their own systems, such as facade pick-and-place using robot arms [25], bricklaying using drones [26], humanoid robots [27], and integration of the site terrain with a BIM model [28].

The construction industry has explored reconstruction architectures using active scanners, 3D laser scanners, and passive cameras, although such equipment has not been implemented in real-world applications thus far. Moreover, state-of-the-art methods necessitate high computational resources to analyze 3D points and varying dynamic scenes. However, the incorporation of low-cost Kinect scene scanners can enable the rapid replication of 3D models with improved size and resolution. Several simultaneous localization and mapping systems have used sparse maps to locate and focus on real-time tracking [24]. Moreover, other methodologies have been used for reconstructing simple point-based representations [15]. Kinect cameras exceed such point-based representations by reconstructing surfaces that approach real-world geometry with greater reliability [2,29].
The use of XR in construction progress monitoring for immersive simulations improves the comprehension of dynamic construction processes and promotes the assessment of different project conditions with minimal cost and intervention [30]. Building professionals and researchers perceive the value of simulations and have been exploring the use of VR for interactive visualization. Potential experiments in this area include construction defect management [31], information delivery in the excavation stage [32,33], and dynamic visualization using GPS [34,35]. Cloud-based computing is another field that has enabled real-time coordination and empowered project teams to extend BIM from the design to the planning stage. BIM-based platforms have recently facilitated real-time collaboration in cloud computing. However, cloud-based BIM data exchange protocols, resource management, and connections to construction sites still lack real-time, user-friendly tools and sufficient advancement in the principles of construction scheduling automation.
Although laser scanning data have been widely utilized for dimensional and surface engineering in various civil applications and project quality checking, its integration with visual algorithms and XR technologies is important to explore in order to encourage remote monitoring without the need to visit a job site in person. Therefore, this paper proposes an accurate and reliable laser scanning-based technique for near real-time assessment of dimensional and surface quality that reduces the gap between construction offices and job sites by integrating Kinect 3D scanning and XR.

3. Methodology

The proposed framework incorporates knowledge transfer from the job site to the construction office and vice versa. Its core feature is the integration of a monitoring device with XR technology in one platform that synchronizes information between the construction office and the job site in near real-time. The following research questions are addressed: 1) How can the inspector monitor indoor activity remotely from the construction site using XR? 2) How can one platform be implemented that captures activity information in 3D, dispatches it to the construction office, and generates an inspection report and sends it back to the construction site in near real-time? 3) Can VR and AR be utilized for activity inspections in an integrated platform?
The research methodology is divided into three sections: 1) the inspection checklist, which explains the method used to build the inspection agenda in VR; 2) the laser scanner and XR-based quality inspection procedure (iVR), which addresses the system architecture of the proposed methodology; and 3) data storage and data compatibility, which discusses innovative methods of storing data and exchanging them between the job site and the construction office. The proposed system, iVR, comprises the following five modules: 1) iVR-location finder, which optimizes the laser scanner location in the physical model; 2) iVR-scan, which captures a point cloud of the target activity using 3D laser scanners; 3) iVR-prepare, which converts the point cloud into 3D mesh geometry; 4) iVR-inspect, which lets the site engineer generate a quality check and feedback report using VR; and 5) iVR-feedback, which visualizes inspection feedback at the job site using AR.

Table 1. The inspection checklist for indoor geometry.

Quality category | Feature    | Attributes (tolerance)
Geometry         | Dimensions | Length, width, height, thickness (±3 mm)
Geometry         | Position   | Horizontal and vertical length (±6 mm)
Defect           | Spalling   | Number, location, volume
Defect           | Cracks     | Depth, width (±0.3 mm), location
Defect           | Flatness   | Number, size (±6 mm), location

Generally, iVR-location finder helps the inspector find the best laser scanner position for monitoring the targeted activity on the construction site and preview how the point cloud data will appear before the monitoring activity starts. iVR-location finder then passes the position coordinates to iVR-scan so that the inspector can position the laser scanner correctly at the physical site following the digital position optimization. Next, iVR-scan starts capturing point cloud data for the target activity and passes it to iVR-prepare, which converts the received data into 3D mesh geometry and sends it from the construction site to iVR-inspect in the construction office. After that, the inspector checks the progress and quality control in VR mode and sends the inspection report back to the construction site. Finally, the worker views the inspection report in AR hosted inside the iVR-feedback module.

3.1. Inspection Checklist

Before performing progress monitoring, it is essential to identify the inspection checklist, which includes the attributes of activity elements that need to be inspected and their required degree of accuracy. The construction specifications for a project play a major role in determining the inspection checklist; they detail the quality requirements for each component of the construction project and can be translated into the inspection checklist. In this study, the inspection checklist is determined based on the tolerance manual for object placement and the casting tolerance guide for concrete columns provided by the American Concrete Institute. Table 1 provides the checklist, which comprises two sections: geometry and defects. Two characteristics are covered in the geometry category: the dimensions and position of the concrete components. Each dimensional feature, referred to as an "attribute," has a checklist entry. For instance, height, width, and thickness are attributes of the dimensions feature. The tolerance of each feature differs with the form of the product design, size, and duration. Table 1 also presents the tolerances for an exemplary concrete column and HVAC pipe. Each dimensional attribute should be carefully checked to ensure that its value lies within the predefined tolerance; otherwise, the dimensional errors in each predetermined variable will aggregate in the prefixed segment. There are three defect characteristics: spalling, cracks, and flatness.
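For illustration, the checklist in Table 1 can be encoded as a small machine-checkable structure. The sketch below is a hypothetical Python rendering (the names and layout are ours, not part of iVR) using the tolerances quoted above; only the tolerance-bearing entries are shown.

# Hypothetical encoding of Table 1 for automated tolerance checks.
CHECKLIST = {
    ("geometry", "dimensions"): {"attributes": ["length", "width", "height", "thickness"],
                                 "tolerance_mm": 3.0},
    ("geometry", "position"): {"attributes": ["horizontal_length", "vertical_length"],
                               "tolerance_mm": 6.0},
    ("defect", "cracks"): {"attributes": ["depth", "width"], "tolerance_mm": 0.3},
    ("defect", "flatness"): {"attributes": ["size"], "tolerance_mm": 6.0},
}

def within_tolerance(category, feature, measured_mm, planned_mm):
    """Return True if the measured value deviates from plan by no more than the tolerance."""
    tolerance = CHECKLIST[(category, feature)]["tolerance_mm"]
    return abs(measured_mm - planned_mm) <= tolerance

# Example: a column width scanned at 401.8 mm against a planned 400 mm passes the ±3 mm check.
assert within_tolerance("geometry", "dimensions", 401.8, 400.0)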

3.2. The Extended Reality-Based Progress Monitoring System (iVR)

Figure 1. System architecture for real-time information flow between the construction offices and sites (near real-time monitoring of a single activity).

After determining the checklist and required tolerances for inspection, the resulting inspection protocol must be conducted. The iVR system comprises five modules: (1) finding the best scan camera location (iVR-location finder); (2) capturing the site model using 3D laser scanners (iVR-scan); (3) converting the point cloud into a 3D mesh (iVR-prepare); (4) quality checking by the site engineer and generating a feedback report using VR (iVR-inspect); and (5) visualizing the inspection feedback at the job site using AR (iVR-feedback). In more detail, 3D laser scanners are used to capture point clouds of the
activities, which are then converted into 3D models and sent to the BIM cloud of the construction office by the iVR-prepare algorithm. Subsequently, iVR-inspect enables engineers to trace workflow, compare current project progress with blueprints, measure objects, and add text/design notes to 3D models to leverage the quality of project site management. Finally, iVR-feedback sends inspection reports to workers at the job site, who can visualize them in AR mode integrated with graphical algorithms, as illustrated in Figure 1. The iVR system has predefined settings that need to be configured before starting the monitoring system: matching the scanner position in the AR and BIM models, determining the field of view, setting the camera resolution quality, color coding activity objects, setting the BIM model level of detail, and creating the work breakdown structure. The point cloud density also needs to be set before the monitoring process to prevent geometry data overload during data transfer between the construction site and office.
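As a sketch of how such a density cap might be applied, the snippet below thins a scan with a voxel filter using the open-source Open3D library. This is an illustrative stand-in, not the Grasshopper components the system actually uses, and the file name is hypothetical.

import open3d as o3d

def downsample_for_transfer(path, voxel_size_m=0.01):
    """Load a scan and keep at most one point per voxel of the given size."""
    pcd = o3d.io.read_point_cloud(path)  # e.g., a .ply file exported from the scanner
    return pcd.voxel_down_sample(voxel_size=voxel_size_m)

# A 1 cm voxel keeps indoor geometry legible while bounding the payload sent to the office.
light_pcd = downsample_for_transfer("scan.ply", voxel_size_m=0.01)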

3.2.1. The iVR-location finder Module

In addition to the predefined settings list, a tool called iVR-location finder was developed to help the inspector determine the camera location before the monitoring activity. It consists of a list of options that help the inspector control the location of the scanning camera for monitoring the targeted activity on the construction site during the planning stage of the proposed iVR system. These options are: choosing a laser scanner type and brand from a drop-down list, selecting the date of the target activity from the tool's calendar (Figure 2(1)), and setting the laser scanner field of view (FOV), the FOV window sphere's visible and blocked angles, and the laser scanner and camera view in the BIM environment. Once the inspector chooses the laser camera type and date, iVR-location finder generates an optimized FOV location. The optimized location is found by a) dividing the working area Ad into equal isocurves, where Ad is the daily construction activity working space in closed boundaries and d is the day selected from the workspace breakdown structure schedule; b) calculating the arithmetic mean point of each isocurve; c) connecting the arithmetic mean points with an interpolated curve to generate the activity area center-line (CL); and d) dividing CL into segments of length k * CR, where CR is the camera radius, to generate double the number (Ni + 1) of candidate cameras needed for Ad. In this research, k
was set to 2.01 (double, with 0.01 added) to ensure that the camera FOV covers all of Ad. Thus, the number of cameras needed for Ad is

Ni = Roundup(CL / (k * CR))    (1)

where CR is the radius of the camera field of view and Ni is the number of cameras needed for Ad. After that, the camera points (CP) are culled by True/False logic to divide Ni by two and obtain the optimized CP; finally, the FOV, CR, and CP are drawn in the BIM model, as shown in Figure 2(3). All iVR-location finder process data, consisting of FOV, Ni, Li, CR, and CP, are overlaid in the BIM model. The physical location of the camera is manually synchronized with the digital model to ensure an accurate overlay of the as-built data with the as-planned data of the BIM model in the iVR system.

Figure 2. iVR-location finder module data process used to optimize the camera location in the predefined settings of the iVR system.
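A minimal sketch of Equation (1), under one reading of the procedure above, is given below in Python. plan_cameras is a hypothetical name, and the True/False culling is modeled as keeping every other candidate along the center-line.

import math

def plan_cameras(cl_length_m, cr_m, k=2.01):
    """Space candidate scanner points along the center-line CL, then cull alternates."""
    ni = math.ceil(cl_length_m / (k * cr_m))  # Equation (1): Ni = Roundup(CL / (k * CR))
    step = cl_length_m / max(ni, 1)
    candidates = [i * step + step / 2 for i in range(ni)]  # midpoints of the Ni segments
    return [p for i, p in enumerate(candidates) if i % 2 == 0]  # True/False culling halves Ni

# Example: a 20 m center-line with a roughly 4.5 m scan radius (Kinect V2 range).
print(plan_cameras(20.0, 4.5))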

3.2.2. The iVR-scan Module

In this phase, a laser scanner is placed at the job site to monitor a specific activity and send live scanning data to the construction office. The location, direction, and type of laser scanner, as well as the software and hardware required to accomplish successful scanning, are the primary focus of this phase, as illustrated in Figure 3. The coordinates and 3D model of the construction site need to be recorded and registered in the BIM environment in advance. Subsequently, the inspector defines a clearly visible location for the laser scanner via the iVR-location finder module.

The scanning duration depends on the following variables: 1) the importance of the activity and the number of times it needs to be monitored by the project engineer; 2) the daily working hours, via the start and end times of daily work; 3) the activity location, i.e., whether the activity is conducted indoors or outdoors; and 4) the scale of the target object. Therefore, the inspector determines the scanning duration during the planning stage of the activity, especially in the design-build delivery method, where activity implementation occurs at a faster rate than in other delivery methods. Laser scanners can be divided into two types of hardware. The first type is stand-alone laser scanners, which are mobile and can store and transfer data to the computer after the completion of the scanning process. The second is live scanning devices
that need to be connected to a computer during the scanning process to transfer data in real-time. In this study, we used the second type of scanner to reduce the gap between scanning and data processing. However, such scanners are difficult to implement on construction sites, as they require a power source and a large amount of space; they also have limited mobility compared with portable scanners. Before choosing the monitoring system, the usability, time efficiency, accuracy, level of automation, required preparation, training requirements, cost, and mobility of the progress monitoring systems in the literature were evaluated and analyzed. Subsequently, we chose the Kinect V2 as the 3D laser scanner because it offers live scanning (LIDAR and photogrammetry to produce Kinect 3D scans). This component combines cameras and sensors for target observation and distance recognition. The process of scanning the target object is as follows: 1) set up a monitoring camera on the construction site and direct it toward the target object following the Li of iVR-location finder; and 2) connect the Kinect V2 to the Grasshopper graphical algorithm editor incorporated in the commercially available Rhinoceros software on the PC. Three point cloud libraries are used to manage the point cloud data and convert them into points, colors, and GPS coordinates to run the produced model in real-time, as depicted in Figure 3. The point cloud registered in iVR-scan is then passed to iVR-prepare to be converted into 3D mesh geometry. iVR-scan sets a timer to release live frames of the point cloud every 100 ms and passes them to the next module to control data processing and computation.

Figure 3. The iVR-scan module system architecture and data transfer process from site to office.
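The 100 ms release cycle can be pictured with the small timing loop below. capture_frame and send_to_prepare are hypothetical stand-ins for the Kinect capture and hand-off components, so this is a sketch of the scheduling only.

import time

FRAME_INTERVAL_S = 0.100  # the 100 ms release interval described above

def scan_loop(capture_frame, send_to_prepare, duration_s):
    """Release one live point cloud frame per tick until the scan window closes."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        t0 = time.monotonic()
        send_to_prepare(capture_frame())
        # Sleep off the remainder of the tick so processing time does not drift the clock.
        time.sleep(max(0.0, FRAME_INTERVAL_S - (time.monotonic() - t0)))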

3.2.3. The iVR-prepare Module

iVR-prepare receives point cloud data and RGB-D images from iVR-scan every 100 ms and processes them. It filters and converts the point cloud data into a 3D mesh model and sends it to iVR-inspect using a cloud-based data transfer system. The inspector sets the target object boundary, and iVR-prepare crops the targeted object from the scanned scene using color coding to reduce the amount of data and the computational time required for the subsequent steps. iVR-prepare erases all geometries located outside the targeted bounding box set beforehand by the inspector and keeps only the point cloud data located inside it. The module
then creates a uniform voxel grid structure to initiate the creation of a 3D mesh object from the point cloud. iVR-prepare employs mesh-marching mathematical logic and the ball-pivoting algorithm (BPA) to convert the point cloud into a 3D mesh object, and it employs a bounding box surrounding the produced 3D mesh to reduce the amount of data and computational time. Last, the module sends the final 3D mesh geometry from the construction site PC to the office PC using Speckle Cloud, as depicted in Figure 4.

Figure 4. The system architecture of the iVR-prepare module. (a) Point cloud data are received from iVR-scan at the job site. (b) A bounding box is used to cover the target activity. (c) The target activity point cloud is cropped to remove the rest of the scene data. (d) Voxel shapes are drawn around the point cloud to help detect neighboring points. (e) 3D mesh geometry is generated from the point cloud using ball pivoting. (f) The 3D mesh geometry and bounding box are merged.
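The crop step can be sketched as an axis-aligned bounding-box filter. The snippet below uses Open3D as an illustrative stand-in for the Grasshopper cropping components; the function name and box coordinates are ours.

import numpy as np
import open3d as o3d

def crop_to_activity(pcd, min_xyz, max_xyz):
    """Keep only the points inside the inspector's target-activity box."""
    box = o3d.geometry.AxisAlignedBoundingBox(np.asarray(min_xyz, dtype=float),
                                              np.asarray(max_xyz, dtype=float))
    return pcd.crop(box)  # everything outside the box is discarded before meshing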
We used BPA to reconstruct a mesh from the point cloud. This logic employs a sphere that rolls across the "surface" of the point cloud ∆p. To cover two points inside its boundary, the sphere's scale is slightly larger than the distance between them. The sphere begins at two points and rolls to include additional points, generating the outer shell of the mesh. The sphere rolls and pivots along the edge of the triangle formed by the two lines and then settles in a new location. A new triangle is created from the two previous vertices, and a new point is attached to the mesh, thereby forming an expanded triangle. As the sphere continues to roll and rotate, additional triangles are shaped and connected to the mesh until the shape of the mesh is complete, as shown in Figure 5.

In a 2D visualization of the BPA, a sphere settles among pairs of points, and the two points are connected each time the sphere settles to fill in the mesh. In 3D cases, the sphere connects three points, as shown in Figure 5.
The sphere radius, θ, is obtained empirically based on the size and scale of the input point cloud:

θ = 1.10 * (average distance between points in ∆p / 2)    (2)

The average distance among neighboring triple points, ∆p, is divided by 2 to obtain the radius of the sphere and then increased by multiplying it by 1.05–1.25 to ensure that θ includes ∆p within its boundaries. All points need to be obtained within a distance of 2θ from the starting point p. We used a voxel cube ν data structure with a side length of 2θ to control the variance of 2θ between neighboring points. The coordinates of a given position p within voxel ν comprise the triplet

([(Px − νx) / 2θ], [(Py − νy) / 2θ], [(Pz − νz) / 2θ])    (3)

To populate voxel ν on ∆p, the above equation was developed to fit the visual algorithm in iVR-prepare. For a point in ∆p, the x coordinate Px is subtracted from the voxel corner x coordinate νx and divided by 2θ (the same is done for Py and Pz). The generated voxel of the outer shell of the mesh is then relaxed to produce a smooth surface, as depicted in Figure 5.

Figure 5. Illustration of BPA and generation of closed-mesh objects from the point cloud.
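Tying Equations (2) and (3) together, the sketch below derives the ball radius from the mean neighbor spacing, computes a point's voxel triplet, and runs an off-the-shelf ball-pivoting reconstruction. Open3D is used here as a stand-in for the visual algorithm in iVR-prepare; the 1.10 factor follows Equation (2).

import numpy as np
import open3d as o3d

def ball_radius(pcd):
    """Equation (2): theta = 1.10 * (mean neighbor distance / 2)."""
    d = np.asarray(pcd.compute_nearest_neighbor_distance())
    return 1.10 * (d.mean() / 2.0)

def voxel_index(p, v_origin, theta):
    """Equation (3): integer triplet locating point p in a voxel grid of side 2*theta."""
    return tuple(np.floor((np.asarray(p) - np.asarray(v_origin)) / (2.0 * theta)).astype(int))

def mesh_from_cloud(pcd):
    """Reconstruct a triangle mesh from the point cloud by ball pivoting."""
    pcd.estimate_normals()  # BPA needs oriented normals
    r = ball_radius(pcd)
    return o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(
        pcd, o3d.utility.DoubleVector([r, 2.0 * r]))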
iVR-prepare conducts an empirical analysis of the mesh produced from the point cloud and compares it with the BIM model to calculate the progress rate and quantify the achievement during that activity, as shown in Figure 4(d and f). In addition, iVR-prepare stores the progress data in a report and attaches it to iVR-feedback for AR visualization. The developed visual algorithm provides an option for the inspector to download the progress data to the local machine in Excel format.
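One simple way to quantify such a progress rate, offered here as an assumption rather than the paper's exact formula, is the ratio of the as-built mesh volume to the planned element volume, sketched with the trimesh library:

import trimesh

def progress_rate(as_built_path, planned_volume_m3):
    """Estimate progress as built volume over planned volume (capped at 100%)."""
    built = trimesh.load(as_built_path)
    if not built.is_watertight:
        built = built.convex_hull  # volume is only meaningful for a closed mesh; the hull overestimates
    return min(1.0, built.volume / planned_volume_m3)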
iVR-prepare is limited to indoor construction activities that contain one or multiple objects and can be monitored within the FOV of a Kinect camera. It converts the captured point cloud of the targeted object into a light 3D mesh and sends it to the construction office. Speckle Cloud is used to transfer the data from the job site to the construction office, and the Grasshopper and Rhinoceros algorithms then synchronize and update the point cloud for each period.

3.2.4. The iVR-inspect Module

After the 3D mesh is received from the job site via iVR-prepare using the Speckle Cloud data receiver, it is aligned with the BIM model. The Kinect camera and the 3D models at the construction site and office are aligned as per the predefined settings. The project engineer connects the VR headset to the Rhinoceros environment and begins inspecting the progress data received from the job site in VR.

The iVR-inspect procedure is as follows. 1) Align the 3D mesh built in iVR-prepare with the BIM model. 2) Connect a VR headset to the immersive reality and set up the quality, scale, and colors of the different models. 3) Monitor the progress, building accuracy, and quality. 4) Use the VR inspection
tools provided by Mindesk in VR mode, which include writing comments, highlighting objects, taking measurements, comparing the BIM model to the progress model received from iVR-prepare, and providing illustrations as feedback. 5) Send the monitoring report, which includes comments, drawings, and geometry, to iVR-feedback. The procedure is illustrated in Figure 6.
This study employed the VR inspection method and BIM instead of conventional site visits and point cloud screen checking to give the project engineer the advantages of immersive reality, thereby improving the inspection rate, speed, and accuracy, as well as the feeling of being present at the job site even when sitting in a construction office, as illustrated in Figure 6. We used the Mindesk package for Grasshopper to control the iVR-inspect module. It enables parametric modeling in VR space, so 3D CAD models immersed in VR can be built, edited, and navigated without exporting the Rhinoceros project. The current tools used in iVR-inspect include 3D navigation, multi-scale view, geometric editing, non-uniform rational basis spline (NURBS) editing, surface modeling, curve modeling, text tagging, and note drawing. Mindesk can function with HTC Vive™, Oculus Rift™, and Windows Mixed Reality™ devices and can also incorporate AR into the same platform; however, only its VR-related aspects were used in this study.

After performing the progress and quality checks, the project engineer provides illustrations and adds comments. Afterward, iVR-inspect sends the multilayer data, including the BIM model, overlaid model, comments and illustrations, and accomplishment report, to iVR-feedback, as depicted in Figure 6.

Figure 6. The system architecture of the iVR-inspect module, illustrating work progress inspection using virtual reality in the construction office.

3.2.5. The iVR-feedback Module

iVR-feedback focuses on visualizing the engineer's reports at the job site using mixed reality (MR), utilizing AR to synchronize the BIM model at the construction office with the worker's cell phone in real-time. The workflow of iVR-feedback includes the following steps. 1) The worker's cell phone is registered with the construction office's Fologram tool to receive live data. 2) The engineer sends the BIM model, comments, illustrations, accomplishment report, and overlaid model via the Fologram tool to the worker's cell phone at the job site. 3) The worker at the job site visualizes the received data in MR using the cell phone. The procedure is illustrated in Figure 7.

Figure 7. The system architecture of the iVR-feedback module.

An immersive holographic instruction set is generated using parametric models representing the architecture. Fologram, which runs inside Rhinoceros and Grasshopper, synchronizes the geometry with that shown on MR devices linked via a local Wi-Fi network. If a user makes changes to a pattern in Rhinoceros and Grasshopper, these modifications are observed and transmitted to all of the linked MR apps. Using familiar and efficient desktop CAD tools, users can view digital models in a physical setting and at scale while making improvements to those models. The placement of the 3D model is controlled by placing a QR code on the Rhinoceros canvas in the construction office and the corresponding code at the job site. Once workers receive the data, they can visualize the QA inspector's comments and the overlaid model in near real-time. This system requires that the worker's phone be registered with the construction office's Fologram tool, and the devices at both the construction office and the job site need to be connected to the same Wi-Fi network. As a result, engineers and workers at the job site can function synchronously, receiving and sending near real-time data between the construction site and office using XR and 3D scanners associated with visual programming algorithms, as shown in Figure 7.

3.3. Data Management

To address the application interoperability issue primarily induced by discrepancies in application formats, we used the Rhinoceros platform and the (.3dm) format as a unified data format between the differing software and hardware platforms. The entire hardware platform was directly connected to the Rhinoceros modeling platform and then synchronized using Speckle cloud-based data-sharing. At the job site, the Kinect V2 was connected to Grasshopper and Rhinoceros, which managed and stored its point cloud data and resolution parameters. Subsequently, all of the captured point cloud data are sent to the construction office's Grasshopper and Rhinoceros platform using the Speckle Cloud tool, which maintains synchronized and updated models on both computers. After that, the inspector assesses progress and associates the report with the model. Finally, the feedback data are sent back to the worker's cell phone at the job site using AR synced with the office BIM model.
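Schematically, the site-to-office hop can be thought of as serializing the mesh and pushing it to a shared stream that the office PC polls. The snippet below is a generic HTTP stand-in for the Speckle connector; the endpoint and payload fields are hypothetical.

import json
import requests

def push_mesh(vertices, faces, stream_url):
    """Serialize a mesh and post it to the shared stream the office PC listens to."""
    payload = {"format": ".3dm-mesh", "vertices": vertices, "faces": faces}
    response = requests.post(stream_url, data=json.dumps(payload),
                             headers={"Content-Type": "application/json"}, timeout=10)
    response.raise_for_status()  # surface transfer failures instead of losing frames silently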

Figure 8. iVR-location finder’s user interface and implementation in the column finishing construction
case study. (a) The user interface of iVR-location finder, including the scanner location finder
parameters, and (b) possible results.

4. Case Studies

To validate the feasibility of the proposed iVR system, a laboratory test was conducted to simulate a specific work inspection scenario between the construction site and office. Two case studies were built to test iVR: the first was a column finishing monitoring activity, and the second was an HVAC pipe installation monitoring activity.

4.1. Case Study 1: Column Finishing Activity Monitoring

The first case study consisted of a 4 × 4 m grid of concrete columns 0.4 m high and 0.4 m wide. The concrete columns were designed in a BIM environment and represented by Styrofoam boxes in the laboratory to resemble a job site. Before starting the work inspection experiment, iVR-location finder was used to optimize the Kinect camera location. After that, all phases of iVR were implemented, and the time consumed, accuracy, and practicality of these junctures were recorded for further development of the system. In this phase, a laser scanner was placed within range of the target object (the concrete columns) using iVR-location finder, and the coordinates of its focal point were registered for synchronization with its corresponding location in the BIM environment. The Kinect V2 was chosen for this case study, and its parameters embedded in the iVR-location finder tool are outlined in Table 2. The Kinect V2 has a horizontal range of 0.3–4500 mm, so one laser scanner location could cover two columns in this case study (Figure 8); the laser scanner's field-of-view window sphere component in each direction was 292°.

Figure 9. Case study 1 setup. (a) The virtual BIM model of the column structure, consisting of 12 columns 0.4 m high and 0.4 m wide in a 4 × 4 m grid. (b) The laboratory test physical model, comprising 3D scanners, the targeted activity (one column), and the building boundary lines. (c) The iVR-location finder tool used to optimize the 3D scanner locations.

Table 2. Technical specifications of the Kinect sensor used in the case study.

Property                                           | Specification
Kinect working window (field of view)              | 58° horizontal, 45° vertical, 70° depth
Spatial x/y resolution at 2 m distance from sensor | ±6 mm
Depth z resolution at 2 m distance from sensor     | ±6 mm
Frame rate                                         | 1 frame per 15 minutes
Operational range                                  | 0.8 m – 3.3 m
Power consumption                                  | 2.25 W DC
Operating environment                              | Indoors (every lighting condition)

The first case study monitored the finishing work on columns. The column structure consists of 12 columns, 0.4 m wide and 0.4 m high, distributed 4 by 3 in a 4 × 4 m grid, as shown in Figure 9(a). One column was monitored by placing the Kinect camera in the physical position corresponding to the digital position generated using iVR-location finder, as shown in Figure 9(b). As part of the predefined settings, iVR-location finder was used to generate the camera position in the digital model. iVR-location finder required four 3D scanners at the job site to monitor the 6 targeted columns; the FOV of the camera was 297°, the blocked FOV was 63°, and the laser camera view was registered in the BIM environment, as illustrated in Figure 9(c).

The laboratory experimental work was divided into five stages corresponding to the five iVR system modules. The laboratory test commenced by targeting one column for 3D laser scanning, capturing the 3D geometry vertices, and registering the point cloud using the Kinect V2. The Qualla library was used to determine the resolution of the point clouds and to manage the computational data processing and the time required. In the next step, a cropping box was built to include only the targeted objects (concrete columns in this case study) to reduce the computational time and to exclude unnecessary scanned objects from the point cloud, as depicted in Figure 9.

Figure 10. Column lab test case study for work inspection using all phases of the iVR system.

iVR-prepare registered the point clouds in each interval and updated the existing point clouds in a loop. The interval for updating the existing point clouds was set to 15 min. As explained in the Methodology section, iVR-prepare converted the concrete column point clouds into a 3D mesh by using pixel-aligned volumetric avatars (PVA). Thereafter, the 3D mesh and point cloud were sent to the construction office using the Speckle Cloud 3D model synchronizing tool and overlaid with the BIM model through the Grasshopper and Rhinoceros software. In the construction office, the project engineer received the point cloud and 3D mesh data using the Speckle Cloud data receiver tool of Grasshopper and Rhinoceros. We used an Oculus Rift S [77] for iVR-inspect. The inspector began to visualize the 3D model through immersive VR, as well as taking measurements, writing notes, checking progress, assessing quality, acquiring snapshots of necessary details, and generating work inspection reports. Next, iVR-feedback sent the inspector's report back to the job site using the Fologram AR tool of Grasshopper and Rhinoceros. The workers at the job site received the work inspection report, which included the BIM model, comments, drawings, accomplishment rate, and overlaid model. It should be noted that external data, including the site location workspace, task independence, schedules, and construction methods, can also be attached to progress monitoring reports. In this case study, iVR-feedback successfully visualized the feedback data at the construction site, and the worker could view the inspector's comments and the overlaid model and follow the necessary updated information recommended by the inspector without any physical interaction, as illustrated in Figure 10.
Changes in MR are introduced into models by associating the observed movement behaviors parametrically with the framework generated on the device. In the event-based parametric modeling interface, buttons are created and placed in the physical space adjacent to the construction site; these buttons can increase or decrease the current course of tiles covering the columns without using the BIM environment, as shown in Figure 10. The columns in the design model were clustered into courses to allow the building stages to display each course separately as needed and to avoid unbuilt areas of the plan causing visual disturbances during development.
To accomplish one loop of data processing in iVR, the time needed, precision, and file size were monitored and quantified to check how the hidden layers of this system work. Before starting the case study, some parameters were fixed: the targeted activity at the job site was selected; a 3D scanner was installed onsite and connected to the site computer; the office and site computers were connected to the same Wi-Fi network; Speckle Cloud and VR were set up and ready to use; the worker's cell phone ID was registered on the office computer; and the QR codes in the BIM model and at the job site were generated, placed, and synchronized to one benchmark.

Table 3 provides the details of the time needed to finish one loop of the iVR system. To measure the pure data computational time, location finding, the inspector's comments, and the worker feedback time were excluded because they vary depending on the person and the activity. The data processing time to finish one loop of iVR for one concrete column was nearly 17 minutes, and the total time to finish one loop, including the inspector's comments in VR, the worker visualizing in AR, and iVR-location finder, was nearly 40 minutes for this case study. The times reported in this table depend on the computational power of the device, internet speed, data size, and nature of the activity. Therefore, it is an effective practice to calculate how much time a loop in iVR will take before activity progress monitoring starts.

Table 3. Data exchange time gap and parameters to complete one loop of iVR in case study 1.

Module              | Data                                  | Time (s) | Precision | File size (MB) | Remarks
iVR-location finder | Camera location optimization          | 300      | 900       | 28   | Provides the location finder parameters and shows them on the BIM model.
iVR-scan            | Point cloud to job-site PC            | 3        | ±0.3 mm   | 7    | Scans the target column and sends the data (148,710 points) to Grasshopper.
iVR-scan            | Data downsizing                       | 90       | ±0.3 mm   | 7    | Crops the point cloud scene to cover only the target object.
iVR-prepare         | Convert point cloud to 3D mesh        | 420      | ±0.5 mm   | 2.5  | Uses BPA to convert the point cloud data to a single mesh (ball radius = 20 mm, resolution = 5, smooth = 1).
iVR-prepare         | Convert mesh to surface               | 140      | ±15 mm    | 0.49 | Uses mesh-marching to reduce mesh size by converting it into a hollow surface shell only.
iVR-prepare         | Sending data from site to office      | 80       | -         | 0.41 | Sends and receives data between the construction site and office using Speckle Cloud.
iVR-inspect         | VR hooking time                       | 110      | -         | 0    | The VR headset was assumed to be hooked up and ready to use; this is the time needed to open the VR window.
iVR-inspect         | Comments and notes (user-dependent)   | 420      | ±5 mm     | 2    | Comment, draw, and check quality using the Mindesk library in VR mode.
iVR-feedback        | Generate QR code                      | 10       | ±3 mm     | -    | Generates a Fologram QR code and places it on the BIM model benchmark.
iVR-feedback        | Sending data to smartphone            | 135      | ±3 mm     | 2    | Uses Fologram to send the inspector's report from the office to the site.
iVR-feedback        | Align model with QR code at job site  | 25       | ±3 mm     | -    | Aligns the received model with the job-site QR code.
iVR-feedback        | Visualize in AR mode                  | 630      | ±10 mm    | 2    | It took the worker 6 minutes to read comments, check the inspector's report, and understand what to do next.

Figure 11. The digital and physical model for case study 2. (a) The BIM model of the HVAC pipe installation activity and the space work breakdown structure divided into days 1 and 2. (b) The laboratory test physical model with a 3D scanner and the targeted activity (HVAC pipe installation). (c) The iVR-location finder tool used to optimize the 3D scanner locations.

4.2. Case Study 2: HVAC Pipe Installation Activity Monitoring

In this case study, an HVAC pipe installation activity was targeted to demonstrate the ability of the proposed system. Using iVR, inspectors can easily inspect HVAC pipes hanging from the ceiling and perceive them from a variety of angles and at different scales, which is difficult to do at a real job site. The Kinect camera location was generated using iVR-location finder and directed toward the targeted activity, as shown in Figure 11. Next, the camera started generating point clouds of the targeted HVAC pipes and surrounding objects, including the wall, ceiling, and partition window. iVR-prepare then converted the point cloud into a 3D model, as shown in Figure 11. After that, the inspector viewed the HVAC pipes from several angles that were difficult to view at the job site. iVR-inspect for the HVAC pipe activity allowed the inspector to view the 3D model generated from the site and compare it with the BIM model while sitting in the construction office using a VR tool. The inspector checked for defects; wrote comments; took screenshots; and checked measurements, coordinates, and views of the model at different scales and angles. Finally, iVR-feedback enabled the worker at the job site to view the inspector's report in AR mode. One loop of data processing for iVR in case 2 took 11 minutes to complete, as illustrated in Figure 12.

Figure 12. Case study 2 of the HVAC pipe work inspection using all phases of the iVR system.

5. Discussion

This research started with an inspection checklist and predefined settings that included synchronizing the physical and digital models; setting the FOV, resolution, and illumination of the laser scanner; setting the BIM model level of detail; and defining the work breakdown structure. In addition to the predefined settings, iVR-location finder aids the inspector in optimizing the laser scanner's position through a user interface for controlling scanner selection and work breakdown structure data and for visualizing the FOV of the laser scanner before the monitoring activity starts. The inspector positions the laser scanner in the exact position generated by iVR-location finder and synchronizes its FOV in both the physical and digital models. The iVR monitoring system starts when iVR-scan registers point cloud data every 100 ms and sends it
to iVR-prepare on the construction site. Next, iVR-prepare converts the point cloud data into a 3D mesh using mesh-marching and sends the generated geometry from the construction site to the construction office using Speckle Cloud. After that, iVR-inspect receives the 3D mesh geometry from the construction site, and the inspector views it in a virtual environment provided by the Mindesk plugin. The inspector compares the received as-built model from the job site with the as-planned model in the office, checks the quality, measures the work progress, writes comments, draws sketches, and takes snapshots of specific areas and details that need to be taken care of by the workers. He/she then packages this while in VR mode and sends it to iVR-feedback. Finally, the worker receives the inspection report on his or her smartphone, uses the AR of the iVR-feedback module to view it, and follows the instructions written in it. The relationship between the iVR modules and the data passing between them was tested in two case studies, the results of which successfully demonstrated the efficacy of iVR.
The results of this research include the recorded times for each step and the data transfer between
the modules in each case study. From a data-passing perspective, one loop of iVR for case 1 took
around 17 minutes, and around 40 minutes when including the inspector writing comments and
comparing the BIM and as-built models and the worker viewing the inspection report in AR mode.
iVR-location finder targeted one column, optimized the best laser scanner location, and passed the
location coordinates to iVR-scan, which captured the point cloud and passed it to iVR-prepare.
iVR-prepare converted the point cloud data into a voxel structure (a light, low-resolution model with
a rough surface and low thickness accuracy) and a mesh format (a high-precision model that takes a
long time to generate); a sketch of the two representations follows this paragraph. The transfer of 3D
mesh geometry from the construction site to the office is limited to small objects only and cannot
handle large geometries such as a complete building.
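As an illustration of the two representations (ours, not the authors' implementation), the sketch below builds a boolean voxel occupancy grid from a synthetic point cloud and then extracts a triangle mesh from it with scikit-image's marching cubes. The grid is the light, low-resolution model; the mesh is the slower, higher-precision one.

```python
import numpy as np
from skimage import measure  # scikit-image's marching cubes

def voxelize(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Quantize an (N, 3) point cloud into a boolean occupancy grid."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
    grid[tuple(idx.T)] = True
    return grid

points = np.random.rand(10_000, 3)        # stand-in for one Kinect scan
grid = voxelize(points, voxel_size=0.05)  # light, low-resolution voxel model
# Marching cubes turns the occupancy field into a triangle mesh; this is the
# slower, higher-precision representation.
verts, faces, normals, values = measure.marching_cubes(grid.astype(float), level=0.5)
print(f"{grid.sum()} occupied voxels -> {len(faces)} triangles")
```

The trade-off matches the one reported: the voxel grid is cheap to build and stream, while the surface extraction dominates the processing time.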
The results of the case studies on monitoring the progress of concrete column finishing and
HVAC pipe laying revealed that the proposed system can support advanced and comprehensive
activities at construction sites. The developed iVR system successfully registered monitoring data,
processed these data, and enabled the inspector to generate a report and send it back to the construction
site as feedback. In iVR-inspect, the inspector could employ VR to view the concrete columns
and HVAC pipes from different angles and obtain measurements that would not be possible using
conventional monitoring processes. Thus, we suggest that the use of iVR could improve
decision-making in the work inspection stage and result in better execution of construction projects in
a shorter time with limited human involvement. Moreover, this method reduces physical interactions,
which could help to maintain social distancing and prevent the spread of diseases such as COVID-19.
By exploiting the advantages of VR, the inspector can orbit around the target activity, obtain
measurements, magnify the structure (to 1/1 scale or closer), locate areas, and precisely observe existing
and future conditions. In addition, situations that are difficult to access in real life, such as the space
between a false ceiling and the ceiling beams, can be monitored using this system. However, digital
models may not present all the required information, so onsite verification needs to be performed
remotely. Once onsite dimensions are verified, they can be used to inform and guide project engineers
regarding future steps. Owing to the simple parametric setup of the Grasshopper environment, design
engineers can adjust the capacity, size, and location of each system quickly and effortlessly (see the
toy sketch after this paragraph). This process also enables the design engineer or the project manager
to access the model, facilitating collaboration between them and providing a global model that all
disciplines can observe and contribute to. This workflow can be extremely helpful for projects that
require the deployment of prefabricated components as soon as they arrive, such as double-supported
frame walls that occupy more space than normal.
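The following toy dataclass (illustrative names only; the actual definition lives in the Grasshopper graph) shows why a parametric setup makes such adjustments one-line changes: each system is a function of a few parameters, so resizing or relocating it means changing a parameter, not remodeling geometry.

```python
from dataclasses import dataclass

@dataclass
class DuctRun:
    origin: tuple[float, float, float]  # location in the model
    diameter: float                     # size
    flow_rate: float                    # capacity

    def moved(self, dx: float, dy: float, dz: float) -> "DuctRun":
        """Return a copy of this run shifted by (dx, dy, dz)."""
        x, y, z = self.origin
        return DuctRun((x + dx, y + dy, z + dz), self.diameter, self.flow_rate)

duct = DuctRun(origin=(0.0, 0.0, 2.7), diameter=0.3, flow_rate=1.2)
duct = duct.moved(0.5, 0.0, 0.0)  # engineer shifts the run half a metre
print(duct)
```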
This system contributes to the automation and sustainability of construction monitoring by
reducing manpower, inspection costs, and logistics, since the inspector remains in the office while
remotely inspecting indoor activities without needing to examine the site in person. The innovative
methods, algorithms, and technologies developed in this study distinguish it from previous research.
In general, the iVR system integrates point cloud capturing devices with XR (VR and AR) technology
for remote indoor construction activity monitoring in one platform. The most important contributions
of this study are as follows. First, a 3D laser scanner is directly connected to the BIM environment and
captures point cloud data inside the iVR system platform, which needs no intermediate platform
to handle or send point cloud data to other hardware or software. As many researchers have noted,
most current best-practice work inspection platforms require an intermediate platform to convert the
scanned data into point clouds or massing geometry.
Second, the iVR system eliminates file format compatibility issues, as all of the hardware
and software are connected directly to the Grasshopper graphical algorithm editor hosted by the
commercially available Rhinoceros 3D software. In contrast, the other progress monitoring methods
mentioned in the literature review involve different scanning devices with various formats (XYZ,
PLY, PTS, E57, LAS, and LAZ) and BIM formats (.3dm, .dwg, .obj, .ifc, .rvt, and .nwd), which can
result in incompatibility issues. This can even cause distortion and loss of data and be very
time-consuming owing to the use of different versions of data processing.
Third, this system contributes to creating a healthy monitoring environment by minimizing
physical interaction between workers and inspectors on the job site, enabling remote construction
monitoring from the construction office without a physical presence at the construction site. The
time gap between live data capture at the construction site and its availability at the construction
office is around 17 minutes. Although this might make our system impractical for time-critical
activities, such as installing sensors and CCTV cameras or pouring fast-hardening materials onsite,
which need live monitoring, this time gap is sufficient for less urgent indoor activities.
Fourth, in iVR, data storage is managed and controlled within the BIM environment. The Kinect
devices, VR headsets, and Fologram are attached to the BIM environment to import and export
data, resulting in a lightweight and rapid system. However, in other state-of-the-art
progress-monitoring approaches, data are stored in the scanner manufacturer's cloud or on
portable devices; this is impractical because the user needs to connect all stored point cloud data to
another data cloud or BIM software to monitor activity progress. Moreover, commercially developed
construction progress tools require building the model by personally visiting the job site, capturing
data manually, and then processing it, which can take more than 24 hours before the data are ready
for inspection, whereas the iVR system updates the as-built model automatically in near real-time
(every 17 minutes), as reported in Table 4.

Table 4. Comparison of the iVR system with recent advanced, commercially developed construction
progress monitoring tools.

| Tool | As-built 3D model capture | Data acquisition method | Data processing before inspection | Sends data between job site and office via cloud | As-built model update | VR/AR inspection | Compares (overlaps) as-planned and as-built models |
|------|---------------------------|-------------------------|-----------------------------------|--------------------------------------------------|-----------------------|------------------|----------------------------------------------------|
| FITT360 | No | Personally visit site and capture in person | No | No | Personally visit site and capture again in person | No | No |
| 3i | Yes | Personally visit site and capture in person | Yes | No | Personally visit site and capture again in person | No | Yes |
| cubix | Yes | Personally visit site and capture in person | Yes | No | Personally visit site and capture again in person | No | Yes |
| Teelabs | Yes | Personally visit site and capture in person | Yes | No | Personally visit site and capture again in person | No | Yes |
| iVR (ours) | Yes | Remote, automatic | Automatic data processing | Yes | Near real-time (every 17 min) | Yes | Yes |

There are some limitations in this research. For instance, all data need to be readily available
to feed into the system prior to using iVR. Next, in the iVR-inspect module, data overload
occasionally occurs when using VR mode for inspection, which causes the system to crash. Therefore,
we advise reducing the BIM model detail before overlaying it with the as-built model inside the
VR environment to prevent data overload; a sketch of one way to do this follows.
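One way to apply this advice, assuming the Open3D library and an illustrative file name (neither is part of the authors' toolchain as described), is quadric decimation of the exported BIM mesh before it enters the VR scene:

```python
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("bim_export.obj")  # illustrative file name
print("before:", len(mesh.triangles), "triangles")

# Quadric decimation preserves the overall shape while cutting the triangle
# count, which reduces the VR data load that caused our crashes.
light = mesh.simplify_quadric_decimation(target_number_of_triangles=20_000)
light.compute_vertex_normals()  # recompute shading normals after decimation

o3d.io.write_triangle_mesh("bim_export_light.obj", light)
print("after:", len(light.triangles), "triangles")
```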
In addition, the point cloud capturing device of the iVR-scan module has limitations in scanning
highly reflective items (such as mirrors and highly polished surfaces), transparent items (such as glass
and liquids), and items that refract and reflect light (such as number plates and reflective panels on
clothing). Moreover, objects that typically do not reflect light may do so under certain conditions
(such as the paint coating of cars and stainless steel). Hence, additional scans are required to obtain
sufficient information about such objects. Furthermore, the iVR-scan module works well for indoor
data acquisition but does not perform well outdoors owing to strong sunlight illumination.

Therefore, the Kinect devices in iVR-scan could easily be replaced by other data capturing devices
for outdoor monitoring, because iVR-scan was built using visual programming, which makes it easy
to host other capture devices with minor changes to the code, as the sketch below suggests.
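A hypothetical Python analogue of that swap-in design is a small capture-device interface: only the device node changes, while the rest of the pipeline stays untouched. The class and method names here are our illustration, not the actual Grasshopper components.

```python
from abc import ABC, abstractmethod

class CaptureDevice(ABC):
    @abstractmethod
    def grab_points(self) -> list[tuple[float, float, float]]:
        """Return one point cloud frame as (x, y, z) tuples."""

class KinectDevice(CaptureDevice):        # indoor default in iVR-scan
    def grab_points(self):
        return [(0.0, 0.0, 0.0)]          # stub frame

class OutdoorLidarDevice(CaptureDevice):  # hypothetical outdoor replacement
    def grab_points(self):
        return [(1.0, 2.0, 3.0)]          # stub frame

def scan(device: CaptureDevice):
    return device.grab_points()           # rest of the pipeline is unchanged

scan(OutdoorLidarDevice())  # swapping devices needs no other code changes
```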
Moreover, each iVR module has certain input and output requirements that must be satisfied for
it to perform correctly. Therefore, when some data are unavailable or corrupted, the system will not
work properly, and the inspector will need to debug it. Finally, in this research, iVR-prepare targets
indoor activities with small objects that fit within the FOV of one Kinect camera and converts the
captured point cloud to a 3D mesh. Therefore, the point cloud density and geometry quality need
to be optimized and set prior to the monitoring stage to prevent data overload that might cause
failure in the data transfer between the construction site and the office (see the sketch after this
paragraph). Thus, although iVR-prepare successfully transfers the geometry data of indoor activities
such as column finishing work and HVAC pipes from the construction site to the office, it is currently
unable to handle large geometry data, such as a full building or multiple workspaces at the same
time; large job sites therefore need to be divided, each part with its own monitoring station, to
prevent data overload.
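A minimal numpy sketch of such a density cap (our illustration, not the authors' code) is voxel-grid downsampling, which keeps one representative point per cell so the streamed geometry stays small:

```python
import numpy as np

def downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Keep the first point falling in each voxel of an (N, 3) cloud."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first)]

cloud = np.random.rand(100_000, 3)  # stand-in for one Kinect frame
small = downsample(cloud, voxel_size=0.02)
print(f"{len(cloud)} -> {len(small)} points")
```

The voxel size is the knob to set before monitoring begins: larger voxels mean lighter transfers at the cost of geometric detail.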

6. Conclusion
Although there have been significant developments in 3D scanning and photogrammetry
technologies for construction work inspection, existing progress tracking methods still require manual
intervention or data processing, which significantly increases the time required for monitoring and
keeps it reliant on conventional methods. To address this issue, we developed and tested iVR, a near
real-time construction project monitoring tool. Based on the findings, the significant benefits of iVR
are as follows.

• The developed tool bridges the gap between the construction office and the site by capturing
only the necessary data and exchanging it, thereby decreasing computational complexity and
time consumption.
• Immersive VR and AR help the inspector and workers visualize the activity from different
angles, which cannot be achieved via conventional methods, and provide the necessary tools
for drawing, commenting, attaching data, and communicating faster without requiring physical
interaction.
• Using iVR, inspectors can monitor various activities quickly without leaving the construction
office. This can have valuable implications for the nature of work inspection, reduce
human resources, increase the automation of monitoring activities, and increase quality and
production, which will reduce monitoring costs.
• Another contribution of the proposed approach is that it facilitates social distancing and reduces
physical interaction between the workers and the site engineer; this should help curb the spread
of COVID-19 and create a healthier and safer environment at the construction site.

In summary, the significant potential of iVR for work inspection at the construction stage was
ascertained and confirmed using two laboratory case studies. In the future, this tool can be developed
as a plugin tab for commercial software applications, thereby enhancing the entire inspection process.
However, the architecture of the present methodology is limited to the geometry of the activity, such as
built-in work inspection. Subsequent research should therefore aim to expand the proposed
model to include progress tracking (photogrammetry and real 3D scanners), material tracking (GPS
and RFID), worker tracking (RFID), and equipment tracking (GPS and distributed sensors). Although
iVR-scan only scans a job-site target activity, it could be extended to focus on object quality assessments,
human activities, and human recognition and skeletonizing to identify human figures and track the
skeletal images of people moving within the Kinect FOV. In the future, this system will be implemented
and tested on actual construction sites to elucidate its potential advantages and challenges. Moreover,
we anticipate that iVR can become a communication platform between the QA/QC engineer, the owner,
and the general contractor during the construction stage of the project by utilizing XR integrated with

3D laser scanning to increase automation and efficiency as the key toward sustainability in construction.
Furthermore, in iVR-prepare, the generated reports could be integrated with the targeted activity
reports of the BIM schedule to provide feedback for self-updating the 4D BIM, thereby enhancing
automation in QA inspection.

Author Contributions: Conceptualization, A.K. and C.P.; methodology, O.L.; software, A.K.; validation, C.P. and
D.L.; formal analysis, C.P.; investigation, O.L.; resources, O.L.; data curation, C.P.; writing–original draft
preparation, A.K.; writing–review and editing, C.P.; visualization, A.K.; supervision, C.P.; project administration,
O.L.; funding acquisition, O.L.
Acknowledgments: This work is supported by the Korea Agency for Infrastructure Technology Advancement
(KAIA) grant funded by the Ministry of Land, Infrastructure, and Transport (Grant 20SMIP-A158708-01). This
research was also supported by Haenglim Architecture and Engineering Company. The first author would like to
thank Chung-Ang University and the Korean Government Scholarship program.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Forcael, E.; Ferrari, I.; Opazo-Vega, A.; Pulido-Arcas, J.A. Construction 4.0: A Literature Review. Sustainability 2020, 12. doi:10.3390/su12229755.
2. Alizadehsalehi, S.; Yitmen, I. A concept for automated construction progress monitoring: technologies adoption for benchmarking project performance control. Arabian Journal for Science and Engineering 2019, 44, 4993–5008.
3. Braun, A.; Tuttas, S.; Borrmann, A.; Stilla, U. Improving progress monitoring by fusing point clouds, semantic data and computer vision. Automation in Construction 2020, 116, 103210.
4. Kim, S.; Kim, S.; Lee, D.E. Sustainable Application of Hybrid Point Cloud and BIM Method for Tracking Construction Progress. Sustainability 2020, 12, 4106.
5. Shen, W.; Shen, Q.; Sun, Q. Building Information Modeling-based user activity simulation and evaluation method for improving designer–user communications. Automation in Construction 2012, 21, 148–160.
6. Srour, I.M.; Abdul-Malak, M.A.U.; Yassine, A.A.; Ramadan, M. A methodology for scheduling overlapped design activities based on dependency information. Automation in Construction 2013, 29, 1–11.
7. Chi, S.; Caldas, C.H. Image-based safety assessment: automated spatial safety risk identification of earthmoving and surface mining activities. Journal of Construction Engineering and Management 2012, 138, 341–351.
8. Parhi, D.R.; Choudhury, S. Analysis of smart crack detection methodologies in various structures. Journal of Engineering and Technology Research 2011, 3, 139–147.
9. Mahami, H.; Nasirzadeh, F.; Hosseininaveh Ahmadabadian, A.; Nahavandi, S. Automated progress controlling and monitoring using daily site images and building information modelling. Buildings 2019, 9, 70.
10. Scott, S.; Assadi, S. A survey of the site records kept by construction supervisors. Construction Management & Economics 1999, 17, 375–382.
11. Pučko, Z.; Šuman, N.; Rebolj, D. Automated continuous construction progress monitoring using multiple workplace real time 3D scans. Advanced Engineering Informatics 2018, 38, 27–40.
12. Azhar, S.; Ahmad, I.; Sein, M.K. Action research as a proactive research method for construction engineering and management. Journal of Construction Engineering and Management 2010, 136, 87–98.
13. Assaad, R.; El-adaway, I.H.; El Hakea, A.H.; Parker, M.J.; Henderson, T.I.; Salvo, C.R.; Ahmed, M.O. Contractual Perspective for BIM Utilization in US Construction Projects. Journal of Construction Engineering and Management 2020, 146, 04020128.
14. Albert, A.; Hallowell, M.R.; Kleiner, B.; Chen, A.; Golparvar-Fard, M. Enhancing construction hazard recognition with high-fidelity augmented virtuality. Journal of Construction Engineering and Management 2014, 140, 04014024.
15. Porwal, A.; Hewage, K.N. Building Information Modeling (BIM) partnering framework for public construction projects. Automation in Construction 2013, 31, 204–214.
16. Xu, J.; Ding, L.; Love, P.E. Digital reproduction of historical building ornamental components: From 3D scanning to 3D printing. Automation in Construction 2017, 76, 85–96.
17. Jin, R.; Zuo, J.; Hong, J. Scientometric review of articles published in ASCE's journal of construction engineering and management from 2000 to 2018. Journal of Construction Engineering and Management 2019, 145, 06019001.
18. Ali, A.K.; Lee, O.J.; Park, C. Near Real-Time Monitoring of Construction Progress: Integration of Extended Reality and Kinect V2. Proceedings of the 37th International Symposium on Automation and Robotics in Construction (ISARC); International Association for Automation and Robotics in Construction (IAARC): Kitakyushu, Japan, 2020; pp. 24–31. doi:10.22260/ISARC2020/0005.
19. Riveiro, B.; Lourenço, P.B.; Oliveira, D.V.; González-Jorge, H.; Arias, P. Automatic morphologic analysis of quasi-periodic masonry walls from LiDAR. Computer-Aided Civil and Infrastructure Engineering 2016, 31, 305–319.
20. Freimuth, H.; König, M. A framework for automated acquisition and processing of As-built data with autonomous unmanned aerial vehicles. Sensors 2019, 19, 4513.
21. Liu, J.; Li, D.; Feng, L.; Liu, P.; Wu, W. Towards Automatic Segmentation and Recognition of Multiple Precast Concrete Elements in Outdoor Laser Scan Data. Remote Sensing 2019, 11, 1383.
22. Kim, D.; Kwon, S.; Cho, C.S.; García de Soto, B.; Moon, D. Automatic Space Analysis Using Laser Scanning and a 3D Grid: To Industrial Plant Facilities. Sustainability 2020, 12, 9087.
23. Weerasinghe, I.T.; Ruwanpura, J.Y.; Boyd, J.E.; Habib, A.F. Application of Microsoft Kinect sensor for tracking construction workers. Construction Research Congress 2012: Construction Challenges in a Flat World, 2012, pp. 858–867.
24. Ray, S.J.; Teizer, J. Real-time construction worker posture analysis for ergonomics training. Advanced Engineering Informatics 2012, 26, 439–455.
25. Ali, A.K.; Lee, O.J.; Song, H. Robot-based facade spatial assembly optimization. Journal of Building Engineering 2020, 101556.
26. Li, Y.; Liu, C. Applications of multirotor drone technologies in construction management. International Journal of Construction Management 2019, 19, 401–412.
27. Kaneko, K.; Kaminaga, H.; Sakaguchi, T.; Kajita, S.; Morisawa, M.; Kumagai, I.; Kanehiro, F. Humanoid Robot HRP-5P: An electrically actuated humanoid robot with high-power and wide-range joints. IEEE Robotics and Automation Letters 2019, 4, 1431–1438.
28. Rizo-Maestre, C.; González-Avilés, Á.; Galiano-Garrigós, A.; Andújar-Montoya, M.D.; Puchol-García, J.A. UAV + BIM: Incorporation of Photogrammetric Techniques in Architectural Projects with Building Information Modeling versus Classical Work Processes. Remote Sensing 2020, 12, 2329.
29. Bellés, C.; Pla, F. A Kinect-Based System for 3D Reconstruction of Sewer Manholes. Computer-Aided Civil and Infrastructure Engineering 2015, 30, 906–917.
30. Cheng, L.K.; Chieng, M.H.; Chieng, W.H. Measuring virtual experience in a three-dimensional virtual reality interactive simulator environment: a structural equation modeling approach. Virtual Reality 2014, 18, 173–188.
31. Kamat, V.R.; El-Tawil, S. Evaluation of augmented reality for rapid assessment of earthquake-induced building damage. Journal of Computing in Civil Engineering 2007, 21, 303–310.
32. Shirazi, A.; Behzadan, A.H. Design and assessment of a mobile augmented reality-based information delivery tool for construction and civil engineering curriculum. Journal of Professional Issues in Engineering Education and Practice 2015, 141, 04014012.
33. Jo, B.W.; Lee, Y.S.; Kim, J.H.; Kim, D.K.; Choi, P.H. Proximity warning and excavator control system for prevention of collision accidents. Sustainability 2017, 9, 1488.
34. Cai, H.; Andoh, A.R.; Su, X.; Li, S. A boundary condition based algorithm for locating construction site objects using RFID and GPS. Advanced Engineering Informatics 2014, 28, 455–468.
35. Yang, K.; Aria, S.; Ahn, C.R.; Stentz, T.L. Automated detection of near-miss fall incidents in iron workers using inertial measurement units. Construction Research Congress 2014: Construction in a Global Network, 2014, pp. 935–944.

© 2021 by the authors. Submitted to Journal Not Specified for possible open access publication
under the terms and conditions of the Creative Commons Attribution (CC BY) license
(http://creativecommons.org/licenses/by/4.0/).
