Exercise
Process drone data to model a construction project
Section 3 Exercise 1
August 13, 2021
Imagery in Action
Time to complete
55 minutes
Software requirements
ArcGIS Drone2Map license
Introduction
ArcGIS is capable of using imagery from many different sources, including unmanned aerial
vehicles (UAVs), or drones. With these sensors, high-resolution imagery can be captured
and quickly added to your GIS to provide an updated visual look at your study area or for use
in more advanced analytical studies. Several Esri products can help manage your drone data
in your organization, including Site Scan for ArcGIS, ArcGIS Ortho Maker, and ArcGIS
Drone2Map. Site Scan provides the ability to program your drone's flight plan and then create
imagery products in a cloud-based workflow. Ortho Maker is an ArcGIS Enterprise solution
through ArcGIS Image Server. Drone2Map provides a desktop-based workflow that can be
used to create new imagery products based on drone data that you create. Additionally,
ArcGIS Pro's ortho mapping capability can be used to create orthorectified imagery.
Depending on the origin of the input imagery and the organization's requirements, the
choice of application can vary. For this exercise, you will focus on Drone2Map as the app
for your organization.
Exercise scenario
In this exercise, you have been given a fresh set of imagery that was collected from a drone
for a recent residential development. You are a GIS analyst with the company developing the
neighborhood. You have been asked to use this new data to create imagery products that will
be used to show the city the progress on the new development and for the architects to
inspect the progress on the neighborhood. You will use Drone2Map to create the imagery
products and use one of the options to create the imagery products for further analysis and
visualization.
b Browse to ArcGIS Drone2Map Help: Turn Your Drone into an Enterprise Productivity Tool
(https://doc.arcgis.com/en/drone2map) and sign in with your course credentials.
ArcGIS Drone2Map requires 1.8 gigabytes of disk space to download. If the default download
location does not have enough space, you can change the location by following the steps in
this How to Change the File Download Location in Your Browser article (https://links.esri.com/
ChangeDownloadLocation | www.lifewire.com/change-the-file-download-location-4046428).
e Follow the installation instructions, accept the Master Agreement, and then accept the
rest of the defaults.
f When you are finished, close the private or incognito web browser.
The size of this dataset is 333 MB. Be sure that your computer has enough space
to download the data.
c When you are finished, close the web browser and File Explorer, if necessary.
c In the middle of the Drone2Map Start page, click Next, as shown in the following
graphic.
Note: You may need to expand the ArcGIS Drone2Map window to see the Next button.
e For the project location, click Browse, and then browse to C:\EsriMOOC, click the Projects
folder to select it, and then click OK.
g Browse to C:\EsriMOOC\Data, select the SubdivisionDrone folder, and then click OK.
h Click Create.
In Drone2Map, you will see 2D and 3D maps added to the display. In the 2D map, the flight
line pattern is visible based on the arrangement of the input images.
i Review the Contents pane on the left, which shows the layers that are currently added to
the map.
1. What do the blue dots represent?
_______________________________________________________________________________
j In the map, in the third column from the left, click the top blue dot for image DJI_0425, as
shown in the following graphic.
The Image Viewer will appear in a new window showing the individual image for your review.
You can scroll through the images added to the project. The purpose of the Image Viewer is
to allow for review of the input images before the imagery products are created. Some
images may not be satisfactory for your project, so you can remove those images from the
processing here. You can also mark up images with notes based on your review.
a On the Home tab, in the Processing group, click Options to open the options for the
project.
In the Options dialog box, you can choose the products that you want to create and the
parameters to use during the processing.
b On the 2D Products tab, confirm that the Create Orthomosaic, Create Digital Surface
Model, and Create Digital Terrain Model boxes are checked.
c On the 3D Products tab, for Create Point Clouds, check the LAS box.
The LAS point cloud is built from keypoints: locations in the project area that appear in
several overlapping input images. Matching these keypoints across images produces a point
cloud that can be used to model different elevation imagery products.
d For Create Textured Meshes, under Multi LOD Mesh, check the SLPK box.
The textured mesh option will create an object that can be viewed in three dimensions. The
mesh can be used to model what the project area looks like as if you were on the ground
looking around at the features.
Your 3D Products tab should look like the following graphic.
f For Matching Image Pairs, confirm that Aerial Grid Or Corridor is selected.
This option is based on the input data. When the drone collects the data, different flight plans
may be used. For this set of input rasters, this option is correct.
Your Initial tab should look like the following graphic.
h Click OK.
For more information about the specific selections, see ArcGIS Drone2Map Help: Processing
options.
a On the Home tab, in the Control group, click the Control down arrow and choose Import
Control.
b In the Import Control window, choose the Import From Drone2Map Control Export
option, and then click OK.
c In the Import Control dialog box, for Drone2Map Control Export, click Browse, and then
browse to C:\EsriMOOC\Data, select the GCP_Subdivision.zip file, and click OK.
The new ground control points (GCPs) will appear in the map as green plus symbols.
GCPs help improve the accuracy of the collected drone data: as the quality of the ground
control improves, so does the positional accuracy of the created imagery products.
In this step, you indicated the products that you want to create, added GCPs, and configured
the parameters to use during the creation.
a On the Home tab, in the Processing group, click Start to create the imagery products.
Note: The progress for the project will be indicated at the bottom of the Manage pane on the
right. Depending on your system, the processing times can be lengthy.
The 2D map includes the project data about the images and the flight lines as before, but
now it also includes the 2D products and 3D products created.
b In the Contents pane, turn off the visibility of the Project Data and 3D Products group
layers.
The 2D imagery products are displayed in two group layers: the Imagery Products group
layer, which contains the Orthomosaic layer, and the DEM Products group layer, which
contains the Digital Surface Model (DSM) and Digital Terrain Model (DTM) layers.
a On the Home tab, in the Processing group, click Report to open the Processing Report.
The Processing Report includes different statistics about the production process and the
products generated.
Ground sampling distance (GSD) is the distance between the center points of adjacent
pixels as measured on the ground. The GSD is the size of each pixel on the ground in the
orthomosaic and determines the spatial resolution of the output.
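The GSD follows from the flight altitude and the camera geometry. The sketch below uses the standard photogrammetric formula with hypothetical drone parameters, not the values from this project's flight:

```python
# Standard photogrammetric GSD formula (all parameter values below are
# hypothetical, not taken from this project's drone):
#   GSD = (altitude x sensor width) / (focal length x image width)

def ground_sampling_distance(altitude_m, sensor_width_mm,
                             focal_length_mm, image_width_px):
    """Return the ground sampling distance in centimeters per pixel."""
    gsd_m = (altitude_m * (sensor_width_mm / 1000.0)) / (
        (focal_length_mm / 1000.0) * image_width_px
    )
    return gsd_m * 100.0  # meters -> centimeters

# Example: a hypothetical drone flying at 60 m with a 6.3 mm sensor,
# 4.5 mm focal length, and 4,000-pixel-wide images
print(round(ground_sampling_distance(60, 6.3, 4.5, 4000), 2), "cm/pixel")  # 2.1 cm/pixel
```

Flying lower or using a longer focal length shrinks the GSD, which is why drone imagery can reach resolutions that traditional aerial platforms cannot.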
2. In the Summary section, what is the average GSD reported?
_______________________________________________________________________________
When considering how well the processing project performed, one measure is whether all
the input images were used to create the imagery products.
3. In the Quality Check section, how many images were calibrated in the project?
_______________________________________________________________________________
c In the Contents pane, in the Imagery Products group layer, double-click the Orthomosaic
layer to open its properties.
g Scroll down to the Overlap section of the report, and then consider the green area
depicted in the graphic.
The graphic indicates the coverage area in the project relative to the number of overlapping
images. The green areas have the most overlapping images, which improves the quality of the
output in those areas. As you review the output, consider the areas in the center of the
imagery product more accurate than the areas with less overlap.
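Overlap is planned before the flight. As a rough illustration, forward overlap along a flight line can be estimated from the image footprint and the exposure spacing; the values below are hypothetical, not this flight's parameters:

```python
# Forward overlap along a flight line (hypothetical values):
#   overlap % = (1 - spacing / footprint) * 100
# where `footprint` is the ground distance covered by one image along the
# flight line and `spacing` is the distance between consecutive exposures.

def forward_overlap_pct(footprint_m, spacing_m):
    """Return the forward overlap between consecutive images, in percent."""
    return (1.0 - spacing_m / footprint_m) * 100.0

# An image covering 90 m along-track, with an exposure every 22.5 m
print(forward_overlap_pct(90, 22.5))  # 75.0
```

Higher overlap means each ground location appears in more images, which is exactly what the green areas of the report's overlap graphic represent.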
a In the Contents pane, right-click Orthomosaic and choose Zoom To Source Resolution.
b Pan the display to the right until you see the cul-de-sac shown in the following graphic.
Drone imagery is collected at a lower altitude than traditional aerial or satellite imagery,
so it can produce high-resolution products.
5. In the bottom-left corner of the map, what is the reported scale?
_______________________________________________________________________________
At this scale, you can see many different features very well. This orthomosaic will work well
for construction managers to visually inspect the area for defects or other errors. For
example, the total area covered by a newly created driveway can be measured and provided
to the homeowner.
d Measure the area of the driveway indicated in the following graphic by clicking the outline
of the driveway.
Note: If you need to pan the map, hold the C key to temporarily activate the Explore tool.
f In the Mensuration Results pane, select the measurement, click the Delete button, and
then click Yes to delete the results.
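Conceptually, an area measurement on a projected map reduces to computing the area of the polygon you clicked. A minimal sketch of that computation using the shoelace formula, with hypothetical vertex coordinates in meters rather than Drone2Map's actual implementation:

```python
def polygon_area(vertices):
    """Planar polygon area via the shoelace formula.

    vertices: list of (x, y) tuples in map units (e.g., meters), in order
    around the outline; the polygon is closed automatically.
    """
    n = len(vertices)
    total = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap back to the first vertex
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# Hypothetical driveway outline: a 6 m x 9 m rectangle
print(polygon_area([(0, 0), (6, 0), (6, 9), (0, 9)]))  # 54.0
```

The formula works for any simple (non-self-intersecting) outline, which is why clicking an irregular driveway boundary still yields a valid area.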
a In the Contents pane, turn on the visibility of the 3D Products group to see the LAS Point
Cloud layer.
Each LAS point has been classified, allowing you to create multiple elevation models and to
use the LAS point cloud for other lidar-related products in ArcGIS, including a LAS
dataset. The classification is performed during the creation process to indicate what each
individual point represents.
For more information about LAS datasets, see ArcGIS Pro Help: What is a LAS dataset?
c In the Contents pane, turn off the visibility of the Orthomosaic layer and the DEM
Products group layer.
d On the Home tab, in the Layers group, click Basemap and choose Dark Gray Canvas.
The point cloud is based on the LAS file and is classified according to a preset classification
scheme.
e In the Contents pane, open the LAS Point Cloud layer properties.
f In the Layer Properties dialog box, click the LAS Filter tab.
You can see the classification codes, return values, and classification flags on this tab. Each
point in the LAS Point Cloud layer is classified during the creation process. These
classifications allow you to filter the point cloud to show portions of the data depending on
your desired output. The classification codes denote what the point represents.
Note: These codes are generated by the tool and may need to be manually refined further if
errors are noticed.
8. What are the classification codes reported in the dialog box?
_______________________________________________________________________________
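LAS classification codes follow the ASPRS LAS specification. As a rough illustration of how a classification filter works, the sketch below uses plain Python with a few common ASPRS codes and hypothetical point values; a real workflow would read the LAS file with a lidar library rather than hand-built tuples:

```python
# A few common ASPRS LAS classification codes (the full list is defined in
# the ASPRS LAS specification; the codes used for this project are the
# ones reported in the LAS Filter tab):
ASPRS_CLASSES = {
    1: "Unassigned",
    2: "Ground",
    3: "Low Vegetation",
    5: "High Vegetation",
    6: "Building",
}

def filter_by_class(points, keep_codes):
    """Keep only the points whose classification code is in keep_codes.

    points: list of (x, y, z, class_code) tuples.
    """
    return [p for p in points if p[3] in keep_codes]

# Hypothetical sample: two Ground points and one Building point
points = [(0.0, 0.0, 101.2, 2), (1.0, 0.0, 101.3, 2), (2.0, 0.0, 106.8, 6)]
ground_only = filter_by_class(points, {2})  # mimics the Ground LAS filter
print(len(ground_only))  # 2
```

This is the same idea as the LAS Filter tab: the points are never deleted, only hidden or excluded based on their codes.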
h In the Contents pane, turn on the visibility of the DEM Products group layer and turn off
the visibility of the 3D Products group layer.
The DSM is created from the LAS point cloud and indicates the surface elevation throughout
the raster. The buildings, trees, and other surface features are visible in this visualization. The
values in this raster can be used for measuring the height of features.
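Because the DSM records surface elevation and the DTM records ground elevation, subtracting the two yields a normalized DSM (nDSM) of feature heights. A minimal sketch with hypothetical elevation values, using plain Python lists standing in for rasters:

```python
# Hypothetical 3x3 elevation rasters in meters (illustrative values, not
# from this project). The DSM includes a building roof; the DTM is the
# bare-earth ground surface.
dsm = [[101.0, 101.1, 101.2],
       [101.0, 106.5, 106.6],
       [101.1, 106.5, 101.3]]
dtm = [[101.0, 101.1, 101.2],
       [101.0, 101.1, 101.2],
       [101.1, 101.2, 101.3]]

# Normalized DSM (nDSM): per-cell feature height above the ground
ndsm = [[s - t for s, t in zip(srow, trow)]
        for srow, trow in zip(dsm, dtm)]

tallest = max(max(row) for row in ndsm)
print(round(tallest, 2))  # 5.4
```

In ArcGIS, the same subtraction would typically be done with raster algebra on the DSM and DTM layers; the sketch only shows the arithmetic.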
i Turn off the visibility of the DEM Products group layer and turn on the visibility of the 3D
Products group layer.
j Right-click LAS Point Cloud, point to LAS Filter, and choose Ground.
Enabling the Ground option filters out the LAS points that are not classified as Ground. This
version of the point cloud was used to create the digital terrain model (DTM) to visualize the
"bare earth" in the project.
l Turn on the DEM Products group layer and turn off the visibility of the Digital Surface
Model.
The DTM visualizes the ground elevation, or "bare earth" visualization, of the drone data. This
visualization models what the area would look like if the features were removed. The
buildings, trees, and other surface features are removed and only the ground is displayed.
When the LAS point cloud is created, each point is classified according to a classification
scheme that indicates whether the LAS point is the ground or another feature.
This visualization is an approximation of the "bare earth" and should be treated
only as an estimate.
a At the top of the map view, click the 3D Map tab to view the data in three dimensions.
b In the Contents pane, in the 3D Products group layer, right-click the Mesh layer and
choose Zoom To Layer.
d Rotate the scene until the flight lines are aligned, as shown in the following graphic.
Note: The 3D mesh results can be improved by increasing the LOD Texture Quality and
adding more oblique input data to improve the sides of the buildings.
With the flight lines visible, you can see the position of the drone during the collection of the
imagery.
f If you would like to further engineer your data, continue exploring the output from ArcGIS
Drone2Map in the following stretch goal.
To continue engineering your data and exploring the output from ArcGIS Drone2Map,
perform the following tasks:
• What is the average height of the tallest model home in the scene?
• Which house has the largest fenced-in yard?
• When the shadows are enabled, which houses have shaded backyards on August 25,
2021, at 4:00 PM, Central Time (UTC-6 CST - US and Canada), which is the time zone
that this Texas neighborhood is in?
• Which house has the driveway with the most area, and what is the measurement?
For more information about Scene Viewer, see ArcGIS Online Help: Get started with Scene
Viewer and Scene navigation.
Use the Lesson Forum to post your questions, observations, and screen capture examples to
identify which homes best represent the answers. Be sure to include the #stretch hashtag in
the posting title.
3. In the Quality Check section, how many images were calibrated in the project?
There were 40 images calibrated, which are all the images in the folder.