Image Analysis for ArcGIS: Geographic Imaging by ERDAS, December 2010
Table of Contents
Table of Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
List of Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xi
List of Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii
Foreword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Introducing Image Analysis for ArcGIS . . . . . . . . . . . . . . . . . . . . . . . . . 3
Performing Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Updating Databases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4
Categorizing Land Cover and Characterizing Sites . . . . . . . . . . . . . . . . . . . . . . . . . .5
Identifying and Summarizing Natural Hazard Damage . . . . . . . . . . . . . . . . . . . . . . .6
Identifying and Monitoring Urban Growth and Changes . . . . . . . . . . . . . . . . . . . . . .7
Extracting Features Automatically . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8
Assessing Vegetation Stress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Learning More . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Finding Answers to Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Getting Help on Your Computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Contacting ERDAS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Contacting ESRI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
ERDAS Education Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
ESRI Education Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Quick-Start Tutorial . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Start Image Analysis for ArcGIS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Add the Image Analysis for ArcGIS Extension . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Add Toolbars . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Exercise 2: Using Histogram Stretch . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Add a Theme of Moscow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Apply a Histogram Equalization in the View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Apply a Histogram Equalization to modify a file . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
Apply an Invert Stretch to the Moscow Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Exercise 3: Identifying Similar Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Add and Draw a Theme Depicting an Oil Spill . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Create a Shapefile . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Draw the Polygon with the Seed Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Exercise 4: Finding Changed Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
List of Figures
List of Tables
Foreword
An image of the Earth's surface is a wealth of information. Images capture a permanent record of buildings, roads, rivers, trees, schools, mountains, and other features located on the Earth's surface. But images go beyond simply recording features. Images serve the following purposes:
They chronicle our Earth and everything associated with it; they record a specific place at a specific point in time. They are snapshots of our changing cities, rivers, and mountains.
On behalf of the Image Analysis for ArcGIS and Stereo Analyst for
ArcGIS product teams, we wish you all the best in working with these
products and hope you are successful in your GIS and mapping
endeavors.
Introducing Image Analysis for ArcGIS
Performing Tasks
Image Analysis for ArcGIS lets you perform many tasks, including:
Updating Databases
Identifying and Summarizing Natural Hazard Damage
When viewing a forest hit by a hurricane, you can use the mapping tools of Image Analysis for ArcGIS to show where the damage occurred. With other tools, you can show the condition of the vegetation, how much stress it suffered, and how much damage it sustained in the hurricane.
Below, Landsat images taken before Hurricane Hugo in 1987 and after
Hurricane Hugo in 1989, in conjunction with a shapefile that identifies
the forest boundary, are used for comparison. Within the shapefile, you
can see detailed tree stand inventory and management information.
Figure 3: Before and After Hurricane Hugo, and Shapefile
Identifying and Monitoring Urban Growth and Changes
Cities grow over time, and images give a good sense of how they grow
and how to preserve remaining land by managing that growth. You can
use Image Analysis for ArcGIS to reveal patterns of urban growth over
time.
Here, Landsat data spanning 21 years was analyzed for urban growth.
The yellow urban areas from 1994 represent how much the city has
grown beyond the red urban areas from 1973. The final view shows the
differences in extent of urban land use and land cover between 1973
and 1994. Those differences are represented as classes.
Figure 4: Urban Areas Represented in Red
Extracting Features Automatically
Suppose you are responsible for mapping the extent of an oil spill as
part of a rapid response effort. You can use synthetic aperture radar
(SAR) data and Image Analysis for ArcGIS tools to identify and map the
extent of such environmental hazards.
The following image shows an oil spill off the northern coast of Spain.
The first image shows the spill, and the second image shows a polygon
grown in the oil spill using the Seed tool. The second image gives you
an example of how you can isolate the exact extent of a particular
pattern using Image Analysis for ArcGIS.
Figure 5: Oil Spill and Polygon Grown in Spill
Assessing Vegetation Stress
Learning More
If you are just learning about GIS, you may want to read the following
books about ArcCatalog and ArcMap: Using ArcCatalog and Using
ArcMap. Knowing about these applications can make your use of Image
Analysis for ArcGIS much easier.
See the quick-start tutorial in Quick-Start Tutorial on page 11 if you
are ready to learn about how Image Analysis for ArcGIS works. In this
tutorial, you learn how to adjust the appearance of an image, identify similar areas of an image, align an image to a feature theme, find areas of change, and mosaic images. The tutorial is written so that you can do the exercises using your computer and the example data supplied with Image Analysis for ArcGIS. If you'd rather, you can just read the tutorial to learn about the functionality of Image Analysis for ArcGIS.
Finding Answers to Questions
You can get a lot of information about the features of Image Analysis for
ArcGIS by accessing the online help. To browse the online help
contents, select Image Analysis desktop Help from the Image Analysis
dropdown list. From this point you can use the table of contents, index,
or search feature to locate the information you need. If you need online
help for ArcGIS, click Help on the ArcMap toolbar and select ArcGIS
Desktop Help.
Contacting ERDAS
You can contact ERDAS for technical support, if needed, at 770-776-3650. Customers outside the United States should contact their local distributor. Visit ERDAS on the Web at www.erdas.com.
Contacting ESRI
ERDAS Education Solutions
ESRI Education Solutions
Quick-Start Tutorial
Now that you know a little about the Image Analysis for ArcGIS extension and its potential applications, the following exercises give you hands-on experience in using many of the extension's tools.
In Image Analysis for ArcGIS, you can quickly identify areas with similar
characteristics. This is useful in cases such as environmental disasters,
burn areas, or oil spills. Once an area is defined, it can also be quickly
saved into a shapefile, eliminating the need for manual digitizing.
This tutorial shows you how to use some of the Image Analysis for
ArcGIS tools and gives you a good introduction to using Image Analysis
for ArcGIS for your own GIS needs.
Exercise 1: Getting Started
In this exercise, you learn how to start Image Analysis for ArcGIS and
activate the toolbar associated with it. You gain access to all the
important Image Analysis for ArcGIS features through its toolbar and
menu list. After completing this exercise, you'll be able to locate any Image Analysis for ArcGIS tool you need for preparation, enhancement, analysis, or geocorrection.
This exercise assumes you have successfully installed Image Analysis
for ArcGIS on your computer. You must use a single or dual monitor
workstation that is configured for use with ArcMap and Image Analysis
for ArcGIS.
If you have not installed Image Analysis for ArcGIS, refer to the
installation guide packaged on the CD and install it now.
To add the Image Analysis for ArcGIS extension, follow these steps:
1. When the ArcMap dialog opens, keep the option to create a new empty
map and then click OK to open ArcMap.
2. Select Extensions from the Tools menu to open the Extensions dialog.
3. Check the Image Analysis check box to add the extension to ArcMap.
Add Toolbars
The Image Analysis toolbar is your gateway to many of the tools and
features you can use with the extension. Use it to choose different
analysis types, select a geocorrection type, or set links in an image,
among other things.
To add the Image Analysis toolbar, follow this step:
1. Click the Customize menu, point to Toolbars, and then select Image
Analysis to add the Image Analysis toolbar.
Exercise 2: Using Histogram Stretch
Apply a Histogram Equalization in the View
Apply a Histogram Equalization to modify a file
You can apply the changes you made to a copy of the file permanently
using Image Analysis for ArcGIS.
To apply a histogram equalization to modify a file, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Radiometric
Enhancement, and then select Histogram Equalization to open the
Histogram Equalization dialog.
2. Click the Histograms button in the Stretch box to view the histograms.
3. Click OK to return to the Symbology tab.
4. Check the Invert check box.
5. Click Apply and OK to display the inverted image.
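The two stretches used in this exercise can be sketched in a few lines. The following is an illustrative Python sketch, not the extension's implementation: histogram equalization pushes each pixel through a lookup table built from the scaled cumulative histogram, and an invert stretch flips values around the top of the range.

```python
def equalize(band, levels=256):
    """Histogram equalization: remap values by scaled cumulative frequency."""
    hist = [0] * levels
    for v in band:
        hist[v] += 1
    total = len(band)
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running)
    # Lookup table: each input value maps to its scaled cumulative count.
    lut = [round((levels - 1) * c / total) for c in cdf]
    return [lut[v] for v in band]

def invert(band, levels=256):
    """Invert stretch: dark pixels become bright and vice versa."""
    return [(levels - 1) - v for v in band]

# A tiny one-band "image" with most values bunched at the dark end.
band = [10, 10, 12, 12, 13, 200, 240, 255]
print(equalize(band))  # dark values spread out across the full range
print(invert(band))
```

With real imagery each band is a large grid rather than a list, but the per-pixel mapping is the same idea.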
Exercise 3: Identifying Similar Areas
With Image Analysis for ArcGIS, you can quickly identify areas with
similar characteristics in images. This is useful for identifying
environmental disasters or burn areas.
Once an area is defined, you can save it into a shapefile. This lets you avoid the need for manual digitizing. To define the area, use the Seed tool to point to an area of interest such as a dark area on an image depicting an oil spill. The Seed tool returns a graphic polygon outlining areas with similar characteristics.
To add and draw an Image Analysis for ArcGIS theme depicting an oil spill, follow these steps:
1. If you are starting immediately after the previous exercise, clear your
view by clicking the New Map File button on the ArcMap toolbar. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension.
Create a Shapefile
In this exercise, you use the Seed tool. The Seed tool grows a polygon
graphic in the image that encompasses all similar and contiguous
areas. However, you must first create a shapefile in ArcCatalog and
start editing to activate the Seed tool. After going through these steps,
click inside the area you want to highlight, in this case an oil spill, and
create a polygon. The polygon lets you see how much area the oil spill
covers.
To create a shapefile using the Seed tool, follow these steps:
1. Click the Zoom In button on the Tools toolbar, and then drag a rectangle
around the black area in the image to see the spill more clearly.
2. Click the ArcCatalog button. You can store the shapefile you create in
the example data directory or navigate to a different directory.
3. Select the directory in the ArcCatalog table of contents, click File, point to New, and then select Shapefile to open the Create New Shapefile dialog.
4. Type a name for the new shapefile, oilspill, in the Name field.
5. Select Polygon from the Feature Type dropdown list.
8. Click the Import button to open the Browse for Dataset dialog, which
contains the example data directory.
9. Select radar_oilspill.img, and then click Add to return to the Spatial
Reference Properties dialog.
10. Click Apply and OK to return to the Create New Shapefile dialog.
11. Click OK to return to the ArcCatalog window.
12. Select the oilspill shapefile, and then drag it into the table of contents in
the ArcMap window.
13. Close ArcCatalog.
Draw the Polygon with the Seed Tool
To draw the polygon with the Seed tool, follow these steps:
1. Select Seed Tool Properties from the Image Analysis dropdown list to
open the Seed Tool Properties dialog.
Note: The Seed tool takes a few moments to produce the polygon.
This is a polygon of an oil spill grown by the Seed tool.
Note: If you don't automatically see the formed polygon in the image, click the refresh button at the bottom of the ArcMap window.
You can see how the tool identifies the extent of the spill. An emergency
team could be informed of the extent of this disaster to effectively plan
a cleanup of the oil.
Exercise 4: Finding Changed Areas
The Image Analysis for ArcGIS extension lets you see changes over
time. You can perform this type of analysis on either continuous data
using image difference or thematic data using thematic change.
In the following example, you work with two continuous data images of the north metropolitan Atlanta, Georgia area: one from 1987 and one from 1992. Continuous data images are those obtained from remote sensors like Landsat and SPOT. This kind of data measures reflectance characteristics of the Earth's surface, analogous to exposed film capturing an image. You can use image difference to identify areas that have been cleared of vegetation to construct a large regional shopping mall.
3. Hold down the Ctrl key, and then select both atl_spotp_87.img and
atl_spotp_92.img.
4. Click Add to display the images in your view.
With images active in the view, you can calculate the difference
between them.
In this exercise, you learn how to use image difference, which is useful
for analyzing images of the same area to identify any changes in land
cover features over time. Image difference performs a subtraction of
one theme from another. This change is highlighted in green and red
masks that depict increasing and decreasing values.
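Conceptually, the dialog's settings map onto a simple per-cell calculation. The sketch below is illustrative (the As Percent thresholding rule shown here is an assumption, not ERDAS's published algorithm): it subtracts the before theme from the after theme and masks cells whose change exceeds a chosen percentage of the before value.

```python
def image_difference(before, after, percent=10.0):
    """Return (difference, highlight) grids. Highlight cells are 'increase',
    'decrease', or None, mirroring the green/red masks in the view."""
    diff, highlight = [], []
    for b_row, a_row in zip(before, after):
        d_row, h_row = [], []
        for b, a in zip(b_row, a_row):
            d = a - b
            d_row.append(d)
            # Illustrative percent rule: flag change beyond percent of before.
            threshold = abs(b) * percent / 100.0
            if d > threshold:
                h_row.append("increase")   # shown as a green mask
            elif d < -threshold:
                h_row.append("decrease")   # shown as a red mask
            else:
                h_row.append(None)
        diff.append(d_row)
        highlight.append(h_row)
    return diff, highlight

before = [[100, 100], [50, 80]]
after  = [[100, 130], [40, 82]]
diff, mask = image_difference(before, after, percent=10.0)
```

The before/after layer and As Percent choices in the dialog correspond to the function's inputs; the two output grids correspond to the difference image and the highlight image added to the view.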
To compute the difference due to development, follow these steps:
1. Click the Image Analysis dropdown arrow, point to Utilities, and then
select Image Difference to open the Image Difference dialog.
2. Click the browse button for the Before Theme field and navigate to the
file named atl_spotp_87.img.
3. Select Layer_1 from the Before Layer dropdown list.
4. Click the browse button for the After Image field and navigate to the file
named atl_spotp_92.img.
5. Select Layer_1 from the After Layer dropdown list.
6. Click the As Percent button in the Highlight Changes box.
12. Uncheck the Difference Image check box in the ArcMap table of
contents to disable it and display the highlight image.
You can now clear the view and either go to the next portion of this
exercise, Exercise 4: Finding Changed Areas on page 27, or end the
session by closing ArcMap. If you want to shut down ArcMap with
Image Analysis for ArcGIS, click the File menu and select Exit. Click No
when asked to save changes.
1. If you are starting immediately after the previous exercise, clear your
view by clicking the New Map File button on your ArcMap toolbar. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension.
2. Click the Add Data button.
3. Hold down the Ctrl key, and select both tm_oct87.img and
tm_oct89.img.
4. Click Add to display the images in your view.
This image shows an area damaged by Hurricane Hugo.
Before you calculate thematic change, you must first categorize the
before and after themes. You can access the categorize function
through unsupervised classification, which is an option available on the
Image Analysis dropdown list. You use the thematic themes created
from those classifications to complete the thematic change calculation.
To create three classes of land cover, follow these steps:
1. Select tm_oct87.img from the Layers dropdown list on the Image
Analysis toolbar.
3. Click the browse button for the Input Image field and navigate to the
directory with the tm_oct87.img file.
4. Type 3 in the Desired Number of Classes field.
5. Navigate to the directory where you want to store the output image, type
the file name (for this example, use unsupervised_class_87), and
then click Save.
6. Click OK to close the Unsupervised Classification dialog.
Note: Using unsupervised classification to categorize continuous
images into thematic classes is useful when you are unfamiliar with the
data that makes up your image. When you designate the number of
classes you want the data divided into, Image Analysis for ArcGIS
performs a calculation assigning pixels to classes depending on their
values. By using unsupervised classification, you are better able to
quantify areas of different land cover in your image. You can then
assign the classes names like water, forest, and bare soil.
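The note above can be illustrated with a toy clusterer. Image Analysis for ArcGIS uses its own unsupervised algorithm (ERDAS products are commonly ISODATA-style); this simplified one-band k-means sketch only demonstrates the idea of assigning each pixel to one of a requested number of classes by value.

```python
def unsupervised_classify(pixels, n_classes, iterations=10):
    """Cluster one-band pixel values into n_classes; return a class id per pixel."""
    lo, hi = min(pixels), max(pixels)
    # Spread the initial class means evenly across the data range.
    means = [lo + (hi - lo) * (i + 0.5) / n_classes for i in range(n_classes)]
    labels = [0] * len(pixels)
    for _ in range(iterations):
        # Assign each pixel to the nearest class mean.
        labels = [min(range(n_classes), key=lambda c: abs(v - means[c]))
                  for v in pixels]
        # Recompute each mean from its members (keep the old mean if empty).
        for c in range(n_classes):
            members = [v for v, l in zip(pixels, labels) if l == c]
            if members:
                means[c] = sum(members) / len(members)
    return labels

# Water-like, forest-like, and soil-like values fall into three classes.
pixels = [5, 8, 6, 100, 102, 98, 200, 210, 205]
labels = unsupervised_classify(pixels, 3)
```

After classification, the numeric class ids are what you later rename to water, forest, and bare soil on the Symbology tab.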
7. Uncheck the tm_oct87.img check box in the ArcMap table of contents
so the original theme is not drawn in the view. This step also makes the
remaining themes draw faster.
To give the classes names and assign colors to represent them, follow
these steps:
1. Double-click unsupervised_class_87.img in the ArcMap table of
contents to open the Layer Properties dialog.
2. Click the Symbology tab.
After modifying the class names and colors using the Unique Values page on the Symbology tab of the Layer Properties dialog, you can permanently save these changes. Using recode with the From View option, the class names and colors are saved to a thematic image file.
2. Click the browse button for the Input Image field and select one of the
classified images.
3. Select From View from the Map Pixel Value through Field dropdown
list.
4. Type the name of the output image in the Output Image field, or click
the browse button, navigate to your working directory, name the output
image, and then click Save.
5. Click OK to close the Recode dialog.
Now use the same steps to perform a recode on the other classified
image of the Hugo area so that both images have your class names and
colors permanently saved.
You can see the amount of destruction in red. The red shows what was
forest and is now bare soil.
7. Click Apply and OK when you return to the Symbology tab to close the Layer Properties dialog.
The yellow outline clearly shows the devastation within the paper company's property boundaries.
Next, you use the summarize areas function to give area calculations of
loss inside the polygon you created.
To summarize the area, follow these steps:
1. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Summarize Areas to open the Summarize Areas dialog.
2. Select a file from the Zone Theme dropdown list, or navigate to the
directory where it is stored.
3. Select an attribute from the Zone Attribute dropdown list.
4. Select a file from the Class Theme dropdown list, or navigate to the
directory where it is stored.
5. Click the browse button for the Summarize Results Table field to specify
a name for the new summarize areas table that is created.
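The resulting table can be thought of as a zone-by-class area cross-tabulation. The sketch below is illustrative, assuming two aligned rasters (a rasterized zone theme and a class theme) and a known cell area; the function name is hypothetical, not part of the extension.

```python
def summarize_areas(zone_grid, class_grid, cell_area=1.0):
    """Return {zone: {class_value: total_area}} from two aligned grids."""
    table = {}
    for z_row, c_row in zip(zone_grid, class_grid):
        for zone, cls in zip(z_row, c_row):
            # Accumulate one cell's area into its zone/class bucket.
            table.setdefault(zone, {}).setdefault(cls, 0.0)
            table[zone][cls] += cell_area
    return table

zones   = [[1, 1, 2], [1, 2, 2]]          # two zone polygons, rasterized
classes = [["forest", "soil", "soil"],    # recoded land-cover classes
           ["forest", "forest", "soil"]]
summary = summarize_areas(zones, classes, cell_area=900.0)  # 30 m cells
```

Each inner dictionary corresponds to one row group of the summarize areas table: the area of every class falling inside that zone.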
Exercise 5: Mosaicking Images
Image Analysis for ArcGIS lets you mosaic multiple images. When you
mosaic images, you join them together to form one image that covers
the entire area. In the following exercise, you mosaic two air photos with
the same resolution.
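A mosaic can be pictured as pasting each input into one output grid that covers the combined extent. This sketch is illustrative, assuming the inputs are already aligned to a common grid; in the overlap, the later image simply wins, standing in for the extension's real overlap handling.

```python
def mosaic(images, nodata=None):
    """images: list of (row_offset, col_offset, grid). Returns one grid
    covering the union extent, padded with nodata where nothing falls."""
    rows = max(r + len(g) for r, _, g in images)
    cols = max(c + len(g[0]) for _, c, g in images)
    out = [[nodata] * cols for _ in range(rows)]
    for r0, c0, grid in images:
        for i, row in enumerate(grid):
            for j, v in enumerate(row):
                out[r0 + i][c0 + j] = v   # later images overwrite earlier
    return out

left  = (0, 0, [[1, 1], [1, 1]])
right = (0, 1, [[2, 2], [2, 2]])  # overlaps the last column of `left`
print(mosaic([left, right]))
```

With the two air photos, the offsets come from their georeferencing rather than being supplied by hand.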
1. Clear your view by clicking the New Map File button on the ArcMap
toolbar if you are starting immediately after the previous exercise. You
do not need to save the changes. If you are beginning here, start
ArcMap and load the Image Analysis for ArcGIS extension with a new
map.
2. Click the Add Data button.
3. Hold down the Ctrl key and select both Airphoto1.img and
Airphoto2.img.
4. Click Add to display the images in your view.
2. Click the Pan button on the Tools toolbar, and then maneuver the images in the view.
3. Click the Full Extent button so that both images display in their entirety in the view.
Exercise 6: Orthorectifying Images
The Image Analysis for ArcGIS extension has a feature called GeoCorrection properties, which rectifies images. One of the tools that makes up GeoCorrection properties is the camera model.
In this exercise, you orthorectify images using the camera model in GeoCorrection properties.
3. Hold down the Ctrl key and select both ps_napp.img and
ps_streets.shp.
4. Click Add to display the images in your view.
5. Right-click ps_napp.img in the ArcMap table of contents and select
Zoom to Layer.
The image is drawn in the view. You can see the fiducial markings
around the edges and at the top.
This procedure defines the coordinate system for the data frame in Image Analysis for ArcGIS.
To select the coordinate system for the image, follow these steps:
1. Right-click the image in your view and select Data Frame Properties
to open the Data Frame Properties dialog.
2. Click the Coordinate System tab.
The four fiducial coordinates are:
1: -106.000, 106.000
2: 105.999, 105.9942
3: 105.998, -105.999
4: -106.008, -105.999
Place Fiducials
5. Zoom in until you see the fiducial, and then click the crosshair.
Image Analysis for ArcGIS takes you to each of the four points where
you can click the crosshair in the fiducial marker.
6. Right-click the image in the ArcMap table of contents and select Zoom
to Layer after you finish placing fiducials.
You see that both the image and the shapefile display in the view.
7. View the root mean square error (RMSE) on the Fiducials tab by
reopening the Camera Properties dialog. The RMSE should be less
than 1.
After placing fiducials, both the image and the shapefile are shown in
the view for rectification.
Place Links
After placing the third link, your image should look something like this:
4. Zoom to the lower-left corner of the image, and place a link according
to the previous image.
Your image should warp and become aligned with the streets shapefile.
You can use the Zoom tool to draw a rectangle around the aligned area
and zoom in to see it more clearly.
Now take a look at the Total RMSE field on the Links tab on the Camera Properties dialog. Your RMSE should be less than 1. If the error is higher than 1, you might need to redo the point selection. Remove the point first by clicking it, and then pressing the Delete key.
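The Total RMSE figure summarizes the link residuals. As a hedged sketch of the idea (not the extension's internal code): each link contributes an error vector between the transformed point and its reference location, and the total is the root of the mean squared error distances.

```python
import math

def total_rmse(residuals):
    """residuals: list of (dx, dy) errors per link, in map units."""
    # Square each link's error distance, average, then take the root.
    squares = [dx * dx + dy * dy for dx, dy in residuals]
    return math.sqrt(sum(squares) / len(squares))

# Three well-placed links stay under 1; one poor link pushes RMSE past 1.
good = [(0.2, 0.1), (0.3, 0.2), (0.1, 0.4)]
bad = good + [(1.5, 1.8)]
print(total_rmse(good) < 1.0, total_rmse(bad) > 1.0)
```

This is why removing and replacing a single badly placed link can bring the total back under the threshold.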
5. Select Save As from the Image Analysis dropdown list to save the
image.
What's Next?
Image Info: Gives you the ability to apply a NoData value and recalculate statistics.
IN THIS CHAPTER
Seed Tool
Image Info
Options Dialog
Geoprocessing Tools
Seed Tool
You use the Seed tool by clicking it on the Image Analysis toolbar, and
then clicking an image after generating a shapefile. The defaults usually
produce a good result. However, if you want more control over the
parameters of the Seed tool, you can use the Seed Tool Properties
dialog by selecting it from the Image Analysis dropdown list.
Figure 7: Seed Tool Properties Dialog Box
Seed Radius
When you use the simple click method, the Seed tool is controlled by
the seed radius. You can change the number of pixels of the seed
radius using the Seed Radius field in the Seed Tool Properties dialog.
The Image Analysis for ArcGIS default seed radius is 5 pixels.
The seed radius determines how selective the Seed tool is when
selecting contiguous pixels. A larger seed radius includes more pixels
to calculate the range of pixel values used to grow the polygon, and
typically produces a larger polygon. A smaller seed radius uses fewer
pixels to determine the range. Setting the seed radius to 0.5 or less restricts the polygon to growing over pixels with the exact same value as the pixel you click in the image. This is useful for thematic images, in which a contiguous area has a single pixel value instead of a range of values like continuous data.
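The behavior described above can be sketched as a region-growing pass. This is illustrative only (ERDAS does not publish the Seed tool's internals, and the range rule here is an assumption): pixels within the seed radius of the click set the accepted value range, and the selection then floods over contiguous pixels inside that range.

```python
def seed_grow(grid, row, col, radius=1):
    """Return the set of contiguous cells selected by a click at (row, col)."""
    rows, cols = len(grid), len(grid[0])
    # 1. Values inside the seed radius define the accepted range; a larger
    #    radius gathers more pixels, widening the range (larger polygon).
    nearby = [grid[r][c]
              for r in range(max(0, row - radius), min(rows, row + radius + 1))
              for c in range(max(0, col - radius), min(cols, col + radius + 1))]
    lo, hi = min(nearby), max(nearby)
    # 2. Flood-fill from the click over 4-connected cells within [lo, hi].
    selected, stack = set(), [(row, col)]
    while stack:
        r, c = stack.pop()
        if (r, c) in selected or not (0 <= r < rows and 0 <= c < cols):
            continue
        if not lo <= grid[r][c] <= hi:
            continue
        selected.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return selected

# Gradually varying "spill" values next to bright water (90s).
grid = [[10, 12, 14, 90],
        [11, 13, 15, 90]]
print(seed_grow(grid, 0, 1, radius=0))          # exact-value match only
print(sorted(seed_grow(grid, 0, 1, radius=1)))  # wider range, larger polygon
```

Note how radius 0 behaves like the 0.5-or-less case in the text: only cells with the exact clicked value join the polygon.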
Island Polygons
The other option in the Seed Tool Properties dialog is Include Island Polygons. Leave this option checked if you want to include small, non-contiguous polygons. You can turn it off for single-feature mapping where you want to see a more refined boundary.
To activate the Seed tool and generate a polygon in your image, follow
these steps:
1. Click the ArcCatalog button on the Standard toolbar to open
ArcCatalog, and then make sure your working directory displays in the
window.
2. Click File, point to New, and then select Shapefile to open the Create
New Shapefile dialog.
7. Click either the Select, Import, or New button and enter the coordinate
system for the new shapefile to use. Clicking Import lets you import the
coordinates of the image you are creating the shapefile for.
To change the seed radius and include island polygons, follow these
steps:
1. Select Seed Tool Properties from the Image Analysis dropdown list to
open the Seed Tool Properties dialog.
Image Info
When analyzing images, you often have pixel values you need to alter
or manipulate to perceive different parts of the image better. The Image
Info feature of Image Analysis for ArcGIS lets you choose a NoData
value for your image so that a pixel value that is unimportant in your
image can be designated as such and is excluded from processes like
statistics calculations.
You can find the Image Info dialog on the Image Analysis dropdown list.
When you open this dialog, the images in your view display in the Layer
Selection dropdown list. You can use the Image Info dialog as follows:
NoData Value
Type a value in the NoData Value field to set the NoData pixels in
your image.
You can close and refresh the ArcMap display to see the NoData
value applied by clicking the refresh button at the bottom of the
ArcMap window.
The NoData Value section of the Image Info dialog lets you label certain areas of your image as NoData when the pixel values in that area are not important to your statistics or image. To do so, assign a value that no other pixel in the image has to the pixels you want to classify as NoData. Using 0 is not always recommended because 0 can be a valid value in your image. Look at the Minimum and Maximum values in the Statistics box and choose a NoData value that does not fall between these minimum and maximum values. You can type N/A or leave the area blank so that you have no NoData assigned if you don't want to use this option.
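The effect of a NoData value on recalculated statistics can be sketched simply: pixels equal to the NoData value are left out of the minimum, maximum, and mean. This is an illustrative sketch, not the extension's internal code.

```python
def band_stats(pixels, nodata=None):
    """Return (minimum, maximum, mean), ignoring NoData pixels."""
    # Drop every pixel that matches the NoData value before computing stats.
    valid = [v for v in pixels if v != nodata]
    return min(valid), max(valid), sum(valid) / len(valid)

band = [0, 0, 0, 50, 60, 70]        # 0 marks a collar of background pixels
print(band_stats(band))             # background drags the statistics down
print(band_stats(band, nodata=0))   # statistics over the real data only
```

This is the same effect you see after clicking Recalc Stats with a NoData value set.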
3. Click the Layer Selection dropdown arrow to make sure the correct
image is displayed.
4. Click either the All Bands or Current Band button.
5. Click the Statistics dropdown arrow to make sure the layer you want to
recalculate is selected if you clicked the Current Band button.
6. Type a value in the NoData Value field, or type N/A if you don't want to assign a pixel to the NoData value.
7. Make sure the correct representation type is chosen for your image.
8. Click the Recalc Stats button to recalculate the statistics using the
NoData value.
9. Click Apply and OK to close the Image Info dialog.
10. Click the refresh button at the bottom of the ArcMap window to refresh
the display.
Options Dialog
You can access the Options dialog using the Image Analysis dropdown
list. This dialog lets you set an analysis mask as well as the extent, cell
size, preferences, and raster for a single operation or future operations.
These options default to values appropriate for each process, but you can
change them if necessary. You can use the Options dialog with any
Image Analysis feature, but it is particularly useful with the Data
Preparation features that are covered in Using Data Preparation on
page 71.
Note: You can specify any of the settings in the Options dialog from the
Environment Settings dialog, which can be displayed by clicking the
Environments button at the bottom of any process dialog. You can also
access this dialog by clicking Tools/Options/Geoprocessing tab, and
then clicking the Environments button.
The Options dialog has five tabs: General Tab, Extent Tab, Cell Size
Tab, Preferences Tab, and Raster Tab.
General Tab
On the General tab, your output directory displays and the analysis
mask defaults to none. However, if you click the Analysis Mask
dropdown arrow, you can set it to any file.
Figure 8: General Tab
You can store your output images and shapefiles in one working
directory by navigating to that directory or typing the directory name in
the Working Directory field. This allows your working directory to
automatically come up every time you click the browse button for an
output image. The Analysis Coordinate System option lets you choose
which coordinate system to save the image with: the one for the input,
the one for the active data frame, or the one you specify.
Extent Tab
The Extent tab lets you control how much of a theme you want to use
during processing. You can do this by setting the analysis extent.
Figure 9: Extent Tab
All of the settings on the Extent tab become active when you select any
of the following extents from the Analysis Extent dropdown list:
Same as Layer Lets you set the extent of processing to the same
extent of another layer in your table of contents. You can also click
the browse button to select a dataset to use as the analysis extent.
If you click this button, you can navigate to the directory where your
data is stored and select a file.
As Specified Below Lets you fill in the information for the extent.
When you select an extent that activates the rest of the Extent tab, the
fields are Top, Right, Bottom, and Left. If you are familiar with the data
and want to enter exact coordinates, you can do so in these fields.
Same as Display and As Specified Below also activate the Snap Extent
To dropdown list, allowing you to select an image to snap the analysis
mask to.
Use Function Defaults Lets you use the general processing defaults
for the specific function, or any settings you set in the
Environment Settings dialog.
Cell Size Tab
The third tab on the Options dialog is Cell Size. This is for the output cell
size of images you produce using Image Analysis for ArcGIS.
Figure 10: Cell Size Tab
The first field on the Cell Size tab is the Analysis Cell Size dropdown list.
The options in this list are as follows:
As Specified Below Lets you enter the cell size you want, and
Image Analysis for ArcGIS adjusts the output accordingly.
Same as Layer "...." Indicates a layer in the view, and the Cell
Size field reflects the current cell size of that layer.
The cell size displays in either meters or feet. You can change the cell
size by selecting Data Frame Properties from the View menu in ArcMap
to open the Data Frame Properties dialog. Click the General tab, and
then select either Meters or Feet from the Map dropdown list in the Units
box.
You should not manually update the Number of Rows and Number of
Columns fields on the Cell Size tab because they automatically update
as analysis properties are changed.
Preferences Tab
Nearest Neighbor resampling

Advantages:
Suitable for use before classification.
Easiest of the three methods to compute, and the fastest to use.
Appropriate for thematic files, which can have data file values based on a
qualitative (nominal or ordinal) or quantitative (interval or ratio) system.
The averaging performed with bilinear interpolation and cubic convolution is
not suited to a qualitative class value system.

Disadvantages:
Can drop data values, while duplicating other values.
Raster Tab
There are several raster formats with differences between the support
offered by ESRI and by ERDAS. These differences can be attributed to
a variant of that format, how data is stored, or the amount of data that
can be read to improve the display accuracy of that file. In some cases,
ERDAS lets you use ERDAS libraries to support a certain format. This
ensures the same level of format support as in the past.
Note: It is recommended that you leave the supported formatting
options enabled because disabling a format results in that format's
handling being delivered by ESRI.
The following formats are supported:
SOCET SET
IMAGINE Image
TIFF
NITF
The following steps take you through the settings you can change in the
Options dialog. You can display the Options dialog by selecting Options
from the Image Analysis dropdown list.
2. Click the browse button for the Working Directory field and navigate to
your working directory.
3. Select an analysis mask from the Analysis Mask dropdown list, or
navigate to a directory and select one.
2. Type coordinates in the Top, Left, Right, and Bottom fields if you
selected As Specified Below from the Analysis Extent dropdown list.
3. Select an image from the Snap Extent To dropdown list, or navigate to
the directory where it is stored if you activated this option by selecting
As Specified below or Same as Display from the Analysis Extent
dropdown list.
4. Click the Cell Size tab to change cell sizes, or click Apply and OK to
close the Options dialog.
2. Type a cell size in the Cell Size field if you selected As Specified Below
from the Analysis Cell Size dropdown list.
3. Change the number in the Number of Rows field if necessary.
4. Change the number in the Number of Columns field if necessary.
5. Click the Preferences tab to change preferences, or click Apply and
OK to close the Options dialog.
Using the Preferences Tab
The Preferences tab on the Options dialog has only one option that lets
you resample using either Nearest Neighbor, Bilinear Interpolation, or
Cubic Convolution using the Resample Using dropdown list. Bilinear
Interpolation is the default, and it is recommended that you leave the
preference to this default except for thematic data processing, in which
case you should change it to Nearest Neighbor.
Using the Raster Tab
The Raster tab on the Options dialog has several raster formatting
options that are already enabled. It is recommended that you leave the
supported formatting options enabled because disabling a format
results in that format's handling being delivered by ESRI.
Geoprocessing Tools
In Image Analysis for ArcGIS, all functions, with the exception of Single
Frame GeoCorrection and the Seed tool, are provided as
geoprocessing tools. This lets you take full advantage of the ArcGIS
geoprocessing environment. Benefits include:
If you are new to geoprocessing, we suggest that you read the ArcGIS
Desktop online help topics for geoprocessing.
Specifying Geoprocessing Options

Updating Existing Geoprocessing Models
Subset Image
Mosaic Images
Reproject Image
The create new image function makes it easy to create a new image file.
It lets you define the size and content of the file. It also lets you specify
whether the new image type is thematic or continuous:
The table below summarizes the values appropriate for the various data
types.
Table 4: Data Type Values

Data Type          Minimum Value    Maximum Value
Unsigned 1 bit     0                1
Unsigned 2 bit     0                3
Unsigned 4 bit     0                15
Unsigned 8 bit     0                255
Signed 8 bit       -128             127
Unsigned 16 bit    0                65,535
Signed 16 bit      -32,768          32,767
Unsigned 32 bit    0                4 billion
Signed 32 bit      -2 billion       2 billion
Float Single       -3.4e38          3.4e38
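The integer ranges in the table follow directly from the bit width; a quick sketch:

```python
def value_range(bits, signed):
    """(minimum, maximum) for an integer data type of the given bit width."""
    if signed:
        return -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return 0, 2 ** bits - 1

print(value_range(8, signed=False))    # (0, 255)
print(value_range(16, signed=True))    # (-32768, 32767)
print(value_range(32, signed=True))    # roughly -2 billion to +2 billion
```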
Initial Value Lets you specify the value given to every cell in the
new file.
2. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
3. Click either the Thematic or Continuous button as the output image
type.
4. Type the number of columns and rows, if different from the default
number of 512, in the Columns and Rows fields.
5. Select a data type from the Data Type dropdown list.
6. Type the number of layers that you want in the Number of Layers field.
7. Type the initial pixel value in the Initial Value field.
8. Click OK to create the image and close the Create New Image dialog.
Subset Image
The subset image function lets you copy a portion (a subset) of an input
data file into an output data file. This may be necessary if you have an
image file that is much larger than the area you need to study.
Subsetting an image has the advantage of eliminating extraneous data
and speeding up processing by reducing the file size. This is important
when dealing with multiband data.
You can use subset image to subset an image either spatially or
spectrally:
If you want to specify a particular area to subset, click the Zoom In tool
and draw a rectangle over the area. Next, display the Options dialog
and select Same As Display from the Analysis Extent dropdown list on
the Extent tab.
Figure 13: Extent Tab
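Conceptually, a spatial subset crops rows and columns while a spectral subset selects bands. A minimal numpy sketch, assuming a bands x rows x columns array layout:

```python
import numpy as np

# A hypothetical image cube: 4 bands of 100 rows x 100 columns.
image = np.zeros((4, 100, 100))

# Spatial subset: keep a rectangular window of rows and columns.
spatial = image[:, 20:60, 30:80]            # all bands, a 40 x 50 window

# Spectral subset: keep only bands "1, 3, 4" as typed in the dialog
# (dialog band numbers are 1-based, array indices are 0-based).
bands = [b - 1 for b in (1, 3, 4)]
spectral = image[bands, :, :]

print(spatial.shape)    # (4, 40, 50)
print(spectral.shape)   # (3, 100, 100)
```

Either way, the output is a smaller file than the input, which is what speeds up later processing.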
The illustrations that follow reflect images using the Spatial Subsetting
option.
Figure 16: Pentagon Image before Spatial Subsetting
Subsetting an Image Spectrally
3. Click the browse button for the Input Image field and navigate to the
directory where your image is located.
4. Type the number of bands you want present in your output in the Select
Desired Band Numbers field, using a comma for separation.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Subset Image dialog.
Subsetting an Image Spatially
2. Click the Zoom In tool, and draw a rectangle over the area you want to
subset.
Mosaic Images
You can mosaic images with different cell sizes or resolutions. The
output cell size defaults to the maximum cell size. For example, if you
mosaic two images, one with a 4 m resolution and one with a 5 m
resolution, the output mosaicked image has a 5 m resolution. You can
set the cell size to whatever cell size you like using the Options dialog
or the Environment Settings dialog. However, data cannot be created to
compensate if you specify a higher resolution than your input; the
output retains the coarseness of the original file.
The Extent tab on the Options dialog defaults to Union of Inputs for
mosaicking images. If, for some reason, you want to use a different
extent, you can change it in the Options dialog and check the Use
Extent from Analysis Options check box in the Mosaic Images dialog. It
is recommended that you leave it at the default of Union of Inputs.
For mosaicking images, you should resample using the Nearest
Neighbor option on the Preferences tab. This ensures that the
mosaicked pixels do not differ in their appearance from the original
image. Other resampling methods use averages to compute pixel
values and can produce an edge effect.
With the Mosaic tool you are given a choice of how to handle image
overlaps by using the Order Displayed, Maximum Value, Minimum
Value, or Average Value settings:
Order Displayed Replaces each pixel in the overlap area with the
pixel value of the image that is on top in the view.
Maximum Value Replaces each pixel in the overlap area with the
greater value of corresponding pixels in the overlapping images.
Minimum Value Replaces each pixel in the overlap area with the
lesser value of the corresponding pixels in the overlapping images.
Average Value Replaces each pixel in the overlap area with the
average of the values of the corresponding pixels in the overlapping
images.
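The four overlap rules can be sketched for a single overlap location:

```python
def resolve_overlap(pixels, method):
    """Resolve one overlap pixel from the corresponding input pixels.

    `pixels` is ordered top-of-view first, mirroring the Order Displayed rule.
    """
    if method == "order_displayed":
        return pixels[0]
    if method == "maximum":
        return max(pixels)
    if method == "minimum":
        return min(pixels)
    if method == "average":
        return sum(pixels) / len(pixels)
    raise ValueError(method)

overlap = [120, 90, 150]   # the same location in three overlapping images
print(resolve_overlap(overlap, "order_displayed"))  # 120
print(resolve_overlap(overlap, "maximum"))          # 150
print(resolve_overlap(overlap, "average"))          # 120.0
```

Note that Average Value blends the inputs, which is why it can soften seams but also blur detail that Order Displayed would preserve.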
Mosaicking Images
3. Arrange the images in the order that you want them in the mosaic using
the arrows to the right of the box below the Input Images field.
4. Select the method you want to use from the Handle Image Overlaps By
dropdown list.
5. Check the Automatically Crop Images By check box to automatically
crop images, and then type the percent by which to crop the images in
the Percent field.
6. Click a button for the color balancing in the Color Balance By box.
7. Check the Use Extent from Analysis Options check box if you want
to use the extent you set in the Options dialog.
8. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
9. Click OK to mosaic the images and close the Mosaic Images dialog.
For more information on mosaicking images, see Quick-Start
Tutorial on page 11.
Reproject Image
Use Reproject Image to reproject raster image data from one map
projection to another. The new projection is specified in the Reproject
Image dialog.
Optionally, you can specify an output coordinate system using the
Spatial Reference Properties dialog. You can display this dialog by
clicking the button adjacent to the Output Coordinate System field.
However, if you do not want to specify an output coordinate system,
leave the field blank.
Reprojecting an Image
3. Select the file you want to use from the Input Image dropdown list, or
click the browse button and navigate to the directory where it is stored.
4. Click the button for the Output Coordinate System field to open the
Spatial Reference Properties dialog.
5. Specify your coordinate system settings, and then click OK to return to
the Reproject Image dialog.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Reproject Image dialog.
Zero Spatial Frequency A flat image in which every pixel has the
same value.
Spatial enhancement provides convolution filtering, non-directional edge,
focal analysis, and resolution merge functions to enhance your images.
This chapter explains these features and how to apply them to your data.
It is organized in the order in which the spatial enhancement tools
appear, so you can skip ahead to the tool you need.
IN THIS CHAPTER
Convolution
Non-Directional Edge
Focal Analysis
Resolution Merge
Convolution
Applying Convolution Filtering
Convolution Example
You can understand how one pixel is convolved by imagining that the
convolution kernel is overlaid on the data file values of the image (in one
band) so that the pixel being convolved is in the center of the window.
Data (the 3 x 3 window under the kernel):

8  6  6
2  8  6
2  2  8

Kernel:

-1  -1  -1
-1  16  -1
-1  -1  -1
Compute the output value for this pixel by multiplying each value in the
convolution kernel by the image pixel value that corresponds to it.
These products are summed, and the total is divided by the sum of the
values in the kernel, as shown in this equation:
integer [((-1 x 8) + (-1 x 6) + (-1 x 6) +
          (-1 x 2) + (16 x 8) + (-1 x 6) +
          (-1 x 2) + (-1 x 2) + (-1 x 8)) /
         (-1 + -1 + -1 + -1 + 16 + -1 + -1 + -1 + -1)]
= int [(128 - 40) / (16 - 8)]
= int (88 / 8) = int (11) = 11
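The same arithmetic can be checked in a few lines, using the kernel of -1s around a center 16 and the data values from this example:

```python
kernel = [[-1, -1, -1],
          [-1, 16, -1],
          [-1, -1, -1]]
window = [[8, 6, 6],      # the 3 x 3 data values under the kernel
          [2, 8, 6],
          [2, 2, 8]]

# Multiply each coefficient by its corresponding pixel, sum the products,
# then divide by the sum of the kernel coefficients.
products = sum(kernel[i][j] * window[i][j] for i in range(3) for j in range(3))
divisor = sum(sum(row) for row in kernel)

print(products, divisor, int(products / divisor))   # 88 8 11
```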
When the 2 x 2 set of pixels near the center of the image is
convolved, the output values are 11.
Convolution Formula
The following formula is used to derive an output data file value for the
pixel being convolved (in the center):
V = [ sum (i=1 to q) sum (j=1 to q) f_ij x d_ij ] / F

Where:
f_ij = The coefficient of a convolution kernel at position i,j (in the
kernel)
d_ij = The data value of the pixel that corresponds to f_ij
q = The dimension of the kernel, assuming a square kernel
(if q = 3, the kernel is 3 x 3)
F = Either the sum of the coefficients of the kernel, or 1 if the
sum of coefficients is 0
V = The output pixel value
Zero sum kernels are kernels in which the sum of all coefficients in the
kernel equals 0. When a zero sum kernel is used, the sum of the
coefficients is not used in the convolution equation, as shown above. In
this case, no division is performed (F = 1), because division by 0 is not
defined.
This generally causes the output values to be:
Zero in areas where all input values are equal (no edges)
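A compact sketch of the whole formula, including the zero sum rule (F = 1):

```python
import numpy as np

def convolve_pixel(window, kernel):
    """V = (sum of f_ij * d_ij) / F for one window; F = 1 for zero sum kernels."""
    f = np.asarray(kernel, dtype=float)
    d = np.asarray(window, dtype=float)
    divisor = f.sum()
    if divisor == 0:        # zero sum kernel: no division is performed
        divisor = 1.0
    return int(np.sum(f * d) / divisor)

zero_sum = [[-1, -1, -1],
            [-1,  8, -1],
            [-1, -1, -1]]
flat = [[5, 5, 5],
        [5, 5, 5],
        [5, 5, 5]]
print(convolve_pixel(flat, zero_sum))   # 0: equal inputs produce no edge
```

On a window where all input values are equal, the zero sum kernel yields 0, exactly as the text describes.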
High-Frequency Kernels

-1  -1  -1
-1  16  -1
-1  -1  -1

When this high-frequency kernel is used on a set of pixels in which a
relatively low value is surrounded by higher values...

BEFORE                  AFTER
204  200  197           204  200  197
201  106  209           201   10  209
198  200  210           198  200  210

...the low value gets lower. Inversely, when the high-frequency kernel is
used on a set of pixels in which a relatively high value is surrounded by
lower values...

BEFORE                  AFTER
64   60   57            64   60   57
61  125   69            61  188   69
58   60   70            58   60   70

...the high value gets higher.
Low-Frequency Kernels
This kernel averages the values of the pixels, causing them to be more
homogeneous. The resulting image looks either more smooth or more
blurred.
Figure 18: Convolution with High-Pass Filtering
Applying Convolution
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Select a kernel to use from the Kernel dropdown list.
4. Click either the Reflection or Background Fill button to specify the
way to handle edges in the image. See Using Convolution for more
information.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
Non-Directional Edge
Sobel =

Horizontal          Vertical
 1   2   1          -1   0   1
 0   0   0          -2   0   2
-1  -2  -1          -1   0   1

Prewitt =

Horizontal          Vertical
 1   1   1          -1   0   1
 0   0   0          -1   0   1
-1  -1  -1          -1   0   1
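Assuming the standard textbook Sobel coefficients and the common root-sum-of-squares combination of the horizontal and vertical responses (the dialog's exact combination is not spelled out here), one pixel's edge strength can be sketched as:

```python
import math

# Standard Sobel coefficients (an assumption: reproduced from their usual
# textbook definitions, not from the product itself).
SOBEL_H = [[ 1,  2,  1],
           [ 0,  0,  0],
           [-1, -2, -1]]
SOBEL_V = [[-1,  0,  1],
           [-2,  0,  2],
           [-1,  0,  1]]

def edge_magnitude(window, horizontal, vertical):
    """Combine the horizontal and vertical responses into one edge strength."""
    x = sum(horizontal[i][j] * window[i][j] for i in range(3) for j in range(3))
    y = sum(vertical[i][j] * window[i][j] for i in range(3) for j in range(3))
    return math.sqrt(x * x + y * y)

# A vertical edge: dark columns on the left, a bright column on the right.
window = [[10, 10, 90],
          [10, 10, 90],
          [10, 10, 90]]
print(edge_magnitude(window, SOBEL_H, SOBEL_V))   # 320.0
```

Because the two responses are squared, the result reports edge strength regardless of edge direction, which is what makes the filter non-directional.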
Applying Non-Directional Edge
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Click either the Sobel or Prewitt button to specify the filter to use.
4. Click either the Reflection or Background Fill button to specify the
way to handle edges in the image. For more information, see Using
Non-Directional Edge.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Non-Directional Edge dialog.
Using Non-Directional Edge
In step 4 in the previous section, Reflection fills in the area beyond the
edge of the image with a reflection of the values at the edge.
Background Fill uses zeros to fill in the kernel area beyond the edge of
the image.
Focal Analysis
The focal analysis function lets you perform one of several types of
analysis on class values in an image file using a process similar to
convolution filtering.
This model (Median Filter) is useful for reducing noise such as random
spikes in data sets, dead sensor striping, and other impulse
imperfections in any type of image. It is also useful for enhancing
thematic images.
Focal analysis evaluates the region surrounding the pixel of interest
(center pixel). The operations you can perform on the pixel of interest
include:
Sum
Min
Max
These functions let you select the size of the surrounding region to
evaluate by selecting the window size.
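A minimal sketch of a focal (moving window) operation, here a 3 x 3 median. Edge padding is an assumption, since the dialog's edge behavior is not described here:

```python
import numpy as np

def focal_filter(image, size=3, func=np.median):
    """Apply a focal function over a square moving window."""
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")   # assumed edge handling
    out = np.empty_like(image, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            out[r, c] = func(padded[r:r + size, c:c + size])
    return out

# A single noise spike in otherwise uniform data disappears under the median.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0
print(focal_filter(img)[2, 2])   # 10.0
```

Swapping `np.median` for `np.sum`, `np.min`, or `np.max` gives the other focal functions listed above.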
Figure 21: Image before Focal Analysis
2. Select a file from the Input Image dropdown list, or navigate to the
directory where the file is stored.
3. Select the function to use from the Focal Function dropdown list.
4. Select a shape from the Neighborhood Shape dropdown list.
5. Select a matrix size from the Neighborhood Definition - Matrix Size
dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Focal Analysis dialog.
Using Focal Analysis
Focal analysis is similar to convolution in the process it uses. With focal
analysis, you can perform several different types of analysis on the pixel
values in an image file.
Resolution Merge
Brovey Transform
Applying Resolution Merge
2. Select a file from the High Resolution Image dropdown list, or navigate
to the directory where the file is stored.
3. Select a file from the Multi-Spectral Image dropdown list, or navigate to
the directory where the file is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
LUT Stretch
Histogram Equalization
Histogram Matching
Brightness Inversion
LUT Stretch
LUT stretch creates an output image that contains the data values as
modified by a lookup table (LUT). The output is three bands.
Contrast Stretch
Nonlinear Contrast Stretch

Piecewise Linear Contrast Stretch
There are variations of the contrast stretch you can use to change the
contrast of values over a specific range, or by a specific amount. By
manipulating the lookup tables as in the following illustration, you can
bring out the maximum contrast in the features of an image.
This figure shows how the contrast stretch manipulates the histogram
of the data, increasing contrast in some areas and decreasing it in
others. This is also a good example of a piecewise linear contrast
stretch, which is created by adding breakpoints to the histogram.
Figure 24: Contrast Stretch
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Set the output type to TIFF.
5. Click OK to close the LUT Stretch dialog.
Using LUT Stretch
LUT stretch provides a means of producing an output image that has
the stretch built into the pixel values to use with packages that have no
stretching capabilities.
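A lookup table is simply an array indexed by pixel value. As an illustration, assuming a linear stretch of a hypothetical 50-200 input range as the table's contents:

```python
import numpy as np

# A hypothetical lookup table: a linear stretch of the input range 50-200
# out to the full 0-255 display range, built once for all 256 input values.
lut = np.clip((np.arange(256) - 50) * 255.0 / (200 - 50), 0, 255).astype(np.uint8)

band = np.array([[50, 125, 200]], dtype=np.uint8)
stretched = lut[band]   # fancy indexing applies the table to every pixel
print(stretched[0])     # values 0, 127, 255
```

Writing `stretched` to disk is what "building the stretch into the pixel values" means: a viewer with no stretching capability still shows full contrast.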
Histogram Equalization
Figure: Original Histogram and After Equalization
Pixels at the tail of the original histogram are grouped together and
contrast is lost; pixels at the peak are spread apart and contrast is
gained (A = 24 in this example).
B_i = int [ ( sum (k=1 to i-1) H_k + H_i / 2 ) / A ]

Where:
A = Equalized number of pixels per bin (see above)
H_i = The number of pixels with the value i (histogram)
int = Integer function (truncating real numbers to integer)
B_i = Bin number for pixels with value i

Source: Modified from Gonzalez and Wintz 1977
The 10 bins are rescaled to the range 0 to M. In this example, M = 9
because the input values ranged from 0 to 9 so that the equalized
histogram can be compared to the original. The output histogram of this
equalized image looks like the following illustration:
Figure: Histogram after Equalization
Numbers inside the bars are input data file values; A = 24.
Effect on Contrast
By comparing the original histogram of the example data with the last
one, you can see that the enhanced image gains contrast in the peaks
of the original histogram. For example, the input range of 3 to 7 is
stretched to the range 1 to 8. However, data values at the tails of the
original histogram are grouped together. Input values 0 through 2 all
have the output value of 0. So, contrast among the tail pixels, which
usually make up the darkest and brightest regions of the input image, is
lost.
The resulting histogram is not exactly flat because pixels rarely group
together into bins with an equal number of pixels. Sets of pixels with the
same value are never split up to form equal bins.
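The binning can be sketched directly using B_i = int((sum of H_k for k < i, plus H_i / 2) / A), with a made-up histogram of 240 pixels over 10 values (so A = 24):

```python
def equalize_bins(histogram, num_bins):
    """Assign each input value an output bin number.

    A is the equalized number of pixels per bin: total pixels / num_bins.
    """
    a = sum(histogram) / num_bins
    bins = []
    cumulative = 0          # sum of H_k for values below the current one
    for h in histogram:
        bins.append(int((cumulative + h / 2) / a))
        cumulative += h
    return bins

# A made-up histogram: 240 pixels over values 0-9, so A = 24.
hist = [5, 5, 10, 15, 60, 15, 10, 40, 30, 50]
print(equalize_bins(hist, 10))   # [0, 0, 0, 1, 2, 4, 4, 5, 7, 8]
```

The sparse tail values 0 through 2 collapse into output bin 0, while the crowded values spread across the remaining bins, mirroring the contrast effects described above.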
Performing Histogram Equalization
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Type the number of bins in the Number of Bins field.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Histogram Equalization dialog.
Histogram Matching
Relative dark and light features in the image should be the same.
Figure: Histogram Matching
(A) the source histogram, (B) the histogram to match, and (C) the matched
result, each plotted as frequency against input values 0 to 255.
Performing Histogram Matching
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Select the file you want to use from the Match Image dropdown list, or
navigate to the directory where it is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Histogram Match dialog.
Using Histogram Matching
Histogram matching mathematically determines a lookup table that
converts the histogram of one image to resemble the histogram of
another, and is particularly useful for mosaicking images or change
detection.
Perform histogram matching when matching data of the same or adjacent
scenes that were gathered on different days and have differences due to
the angle of the sun or atmospheric effects.
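A standard CDF-matching sketch of the lookup-table idea; the dialog's exact table construction is not documented here:

```python
import numpy as np

def match_histogram(source, reference, levels=256):
    """Match `source`'s histogram to `reference`'s via their CDFs."""
    src_hist, _ = np.histogram(source, bins=levels, range=(0, levels))
    ref_hist, _ = np.histogram(reference, bins=levels, range=(0, levels))
    src_cdf = np.cumsum(src_hist) / source.size
    ref_cdf = np.cumsum(ref_hist) / reference.size
    # For each source level, pick the reference level with the nearest CDF.
    lut = np.searchsorted(ref_cdf, src_cdf).clip(0, levels - 1)
    return lut[source]

rng = np.random.default_rng(0)
dark = rng.integers(0, 100, (50, 50))    # a dark scene, values 0-99
bright = dark + 150                      # the same scene, 150 levels brighter
matched = match_histogram(dark, bright)
print(dark.mean() < matched.mean())      # True: the dark scene is lifted
```

After matching, relative dark and light features are preserved while the overall brightness distribution resembles the reference scene.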
Brightness Inversion

DN_out = 1.0             if 0.1 > DN_in
DN_out = 0.1 / DN_in     if 0.1 <= DN_in
Applying Brightness Inversion
2. Select the file you want to use from the Input Image dropdown list, or
navigate to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the Brightness Inversion dialog.
Extract new bands of data that are more interpretable to the eye
RGB to IHS
IHS to RGB
Vegetative Indices
RGB to IHS
Hue
Intensity
Saturation
The following algorithm was used in the Image Analysis for ArcGIS
RGB to IHS transform (Conrac 1980):

R = (M - r) / (M - m)
G = (M - g) / (M - m)
B = (M - b) / (M - m)

Where:
R, G, B = Are each in the range of 0 to 1.0
r, g, b = Are each in the range of 0 to 1.0
M = Largest value, r, g, or b
m = Least value, r, g, or b

At least one of the R, G, or B values is 0, corresponding to the color with
the largest value, and at least one of the R, G, or B values is 1,
corresponding to the color with the least value.

The equation for calculating intensity in the range of 0 to 1.0 is:

I = (M + m) / 2

The equations for calculating saturation in the range of 0 to 1.0 are:

If M = m, S = 0
If I <= 0.5, S = (M - m) / (M + m)
If I > 0.5, S = (M - m) / (2 - M - m)
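The intensity and saturation equations can be checked directly; hue is omitted here since its equation is not reproduced in this section:

```python
def intensity_saturation(r, g, b):
    """I and S for r, g, b values each in the range 0 to 1.0."""
    M, m = max(r, g, b), min(r, g, b)
    i = (M + m) / 2
    if M == m:                      # a gray: no dominant color
        s = 0.0
    elif i <= 0.5:
        s = (M - m) / (M + m)
    else:
        s = (M - m) / (2 - M - m)
    return i, s

print(intensity_saturation(1.0, 0.0, 0.0))   # pure red: (0.5, 1.0)
print(intensity_saturation(0.5, 0.5, 0.5))   # mid gray: (0.5, 0.0)
```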
2. Type the name of the input image in the Input Image field, or navigate
to the directory where it is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the RGB to IHS dialog.
Using RGB to IHS
Using RGB to IHS applies an algorithm that transforms RGB values to
IHS values.
IHS to RGB
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
4. Click OK to close the IHS to RGB dialog.
Using IHS to RGB
Using IHS to RGB applies an algorithm that transforms IHS values to
RGB values.
Vegetative Indices
Applications
Examples
IR/R (infrared/red)
SQRT (IR/R)
(IR - R) / (IR + R) + 0.5
Image Algebra
Sensor          IR Band    R Band
Landsat MSS     7          5
SPOT XS         3          2
Landsat TM      4          3
NOAA AVHRR      2          1
DNir - DNred
yields a simple, yet very useful, measure of the presence of vegetation.
Band ratios are also commonly used. These are derived from the
absorption spectra of the material of interest. The numerator is a
baseline of background absorption, and the denominator is an
absorption peak. NDVI is a combination of addition, subtraction, and
division.
NDVI = (IR - R) / (IR + R)
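A per-pixel NDVI sketch; the zero-division guard is an assumption, since the dialog's handling of pixels where IR + R = 0 is not documented here:

```python
import numpy as np

def ndvi(ir, red):
    """Per-pixel NDVI = (IR - R) / (IR + R)."""
    ir = np.asarray(ir, dtype=float)
    red = np.asarray(red, dtype=float)
    total = ir + red
    safe = np.where(total == 0, 1.0, total)          # avoid dividing by zero
    return np.where(total == 0, 0.0, (ir - red) / safe)

# Strong vegetation, bare ground, and an empty pixel.
print(ndvi([80.0, 30.0, 0.0], [20.0, 30.0, 0.0]))    # 0.6, 0.0, 0.0
```

High infrared reflectance relative to red drives the value toward 1, which is why dense, healthy vegetation stands out.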
Applying Vegetative Indices
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select an appropriate layer from the Near Infrared Band dropdown list.
4. Select an appropriate layer from the Visible Red Band dropdown list.
5. Select an appropriate index from the Desired Index dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Vegetative Indices dialog.
Color IR to Natural Color
This function lets you simulate natural colors from the bands of data
from an infrared image so that the output is a fair approximation of a
natural color image. You cannot apply this feature to images having
only one band of data (for example, grayscale images). It's for use when
you have data only from the near infrared, visible red, and visible green
segments of the spectrum.
When an image is displayed in natural color, the bands are arranged to
approximate the most natural representation of the image in the real
world. Vegetation becomes green in color, and water becomes dark in
color. Certain bands of data are assigned to the red, green, and blue
color guns of your computer monitor to create natural color.
Figure 29: Infrared Image of a Golf Course
Changing Color IR to Natural Color
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select the appropriate band from the Near Infrared Band dropdown list.
4. Select the appropriate band from the Visible Red Band dropdown list.
5. Select the appropriate band from the Visible Green Band dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Color IR to Natural Color dialog.
Neighborhood Analysis
Thematic Change
Summarize Areas
Recode
Information Versus Data
You can enter data into a GIS and produce information. The information
you want to derive determines the type of data you must enter. For
example, if you are looking for a suitable refuge for bald eagles, zip
code data is probably not needed, while land cover data might be
useful.
For this reason, the first step in any GIS project is usually an
assessment of the scope and goals of the study. Once the project is
defined, you can begin the process of building the database. Although
software and data are commercially available, you must create a
custom database for the particular project and study area. Design the
database to meet the needs and objectives of the organization.
A major step in successful GIS implementation is analysis. In the
analysis phase, data layers are combined and manipulated to create
new layers and to extract meaningful information from them.
Once the database (layers and attribute data) is assembled, the layers
are analyzed and new information is extracted. Some information is
extracted by looking at the layers and visually comparing them to other
layers. However, you can retrieve new information by combining and
comparing layers.
Neighborhood Analysis
Density Produces the number of pixels that have the same class
value as the center (analyzed) pixel. It also measures homogeneity
(sameness) based upon the analyzed pixel. This is often useful in
assessing vegetation crown closure.
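The Density measure can be sketched for a single window. Counting the center pixel itself is an assumption, since the text does not say whether it is included:

```python
import numpy as np

def density(window):
    """Count pixels in the window whose class matches the center pixel."""
    window = np.asarray(window)
    center = window[window.shape[0] // 2, window.shape[1] // 2]
    return int((window == center).sum())

# A mostly class-3 window with a few class-1 pixels mixed in.
window = [[3, 3, 1],
          [1, 3, 3],
          [3, 1, 3]]
print(density(window))   # 6: fairly homogeneous around the center pixel
```

Higher counts indicate more homogeneous neighborhoods, which is what makes the measure useful for crown closure.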
Performing Neighborhood Analysis
Sum Totals the class values. In a file where class values are
ranked, totaling lets you further rank pixels based on their proximity
to high-ranking pixels.
2. Click the browse button for the Input Image field and navigate to the
directory where the input image is stored.
3. Select the function you want to use from the Neighborhood Function
dropdown list.
4. Select Rectangle from the Neighborhood Shape dropdown list.
5. Select the size you want to use from the Neighborhood Definition -
Matrix Size dropdown list.
6. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
7. Click OK to close the Neighborhood Analysis dialog.
Thematic Change
Performing Thematic Change
2. Click the browse button for the Before Theme field and navigate to the
directory where the before theme image is stored.
3. Click the browse button for the After Theme field and navigate to the
directory where the after theme image is stored.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Thematic Change dialog.
Note: You must use the output from this exercise in the next exercise,
Summarize Areas on page 130. Please keep the image displayed in
your view if you are going on to that exercise.
Summarize Areas
Applying Summarize Areas
2. Select the vector theme you want to use from the Zone Theme
dropdown list, or navigate to the directory where it is stored.
3. Select the attribute you want to summarize from the Zone Attribute
dropdown list.
4. Select the class theme from the Class Theme dropdown list, or navigate
to the directory where it is stored. This is the thematic theme you
generated in Thematic Change on page 128.
5. Click the browse button for the Summarize Results Table field to specify
a name for the new summarize areas table that is created.
6. Click OK to close the Summarize Areas dialog.
When the process completes, the resulting table is added to ArcMap.
7. Click the Source tab in the ArcMap table of contents to see the new
table.
Recode
Combine classes
Recoding by symbology
The following is a thematic image of South Carolina soil types after the
recode. The changed and grouped class names are listed in the table
of contents.
Figure 36: Thematic Image after Recode by Class Name
You must first group the classified image in the ArcMap table of
contents and then perform the recode.
To recode by class name, follow these steps:
1. Click the Add Data button and add your image to the view.
10. Type the name of the output image in the Output Image field, or click
the browse button and navigate to the directory where you want the
output image stored and type a name.
11. Click OK to close the Recode dialog.
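Conceptually, a recode replaces each old class value in the thematic raster with a new one through a lookup table. The sketch below (not the dialog's actual implementation) shows the idea with hypothetical soil classes:

```python
import numpy as np

# A thematic raster whose pixel values are hypothetical soil class
# numbers 1 through 4.
soils = np.array([[1, 2, 2],
                  [3, 4, 1],
                  [4, 4, 2]])

# Recode table: group classes 2 and 3 into a single class 2, and
# renumber class 4 to 3. The index is the old value, the entry is
# the new value (index 0 is unused here).
lookup = np.array([0, 1, 2, 2, 3])

# NumPy fancy indexing applies the lookup to every pixel at once.
recoded = lookup[soils]
print(recoded)
```

Grouping classes in the ArcMap table of contents before the recode, as described above, is what builds this old-to-new mapping for you.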
Recoding by Symbology
6. Press the Ctrl key while clicking the first set of classes you want to
group together.
7. Right-click the selected classes and select Group Values.
8. Click in the Label column and type a new name for the class.
9. Follow steps 5 - 8 to group the rest of your classes.
10. Click Apply and OK to close the Layer Properties dialog.
11. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode to open the Recode dialog.
12. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
13. Click OK to close the Recode dialog.
Recoding with Previously Grouped Image
You may need to open an image that has been classified and grouped
in another program such as ERDAS IMAGINE. These images might
have more than one valid attribute column that you can use to perform
the recode.
To recode using a previously grouped image, follow these steps:
1. Click the Add Data button and add your image to the view.
2. Click the Image Analysis dropdown arrow, point to GIS Analysis, and
then select Recode to open the Recode dialog.
3. Click the browse button for the Input Image field to navigate to the
directory where your image is located and select the image.
4. From the Map Pixel Value Through Field dropdown list, select the attribute category (which must be an integer field) that you want to use to recode the image.
5. Type the name of the output file in the Output Image field, or click the
browse button and navigate to the directory where you want the output
image stored and type a name.
6. Click OK to close the Recode dialog.
The following images depict soil data that was previously grouped in
ERDAS IMAGINE.
Figure 37: Soil Data Image before Recode
Using Utilities
At the core of Image Analysis for ArcGIS is its ability to interpret and
manipulate your data. The Utilities section of Image Analysis for ArcGIS
provides a number of features for you to use in this capacity. These
features let you alter your images to see differences, set new
parameters, create images, or change the data type of your image.
IN THIS CHAPTER
Image Difference
Layer Stack
Rescale Image
Image Difference
3. Click the browse button for the Before Theme field and navigate to the
directory where the file is stored.
4. Select a layer from the Before Layer dropdown list.
5. Select the file you want to use from the After Theme dropdown list, or
navigate to the directory where it is stored.
6. Click the browse button for the After Layer field and navigate to the
directory where the layer you want to use is stored.
7. Click either the As Percent or As Value button in the Highlight
Changes box.
8. Type values in the Increases More Than and Decreases More Than
fields.
9. Click a color bar and select the color you want to represent the
increases and decreases.
10. Click the browse button for the Image Difference File field and navigate
to the directory where you want the output image stored.
11. Click the browse button for the Highlight Change File field and navigate
to the directory where you want it stored.
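The difference and highlight outputs that the dialog produces can be sketched conceptually as follows. This is an illustration of the idea, not the tool's implementation; the pixel values and thresholds are hypothetical:

```python
import numpy as np

def image_difference(before, after, increase_thresh, decrease_thresh,
                     as_percent=False):
    """Compute an After minus Before difference image and a highlight mask.

    Returns the difference raster and an int8 raster coded +1 where
    values increased more than increase_thresh, -1 where they decreased
    more than decrease_thresh, and 0 elsewhere. Thresholds are absolute
    values, or percentages of the Before value when as_percent is True.
    """
    before = before.astype(np.float64)
    after = after.astype(np.float64)
    diff = after - before
    change = diff
    if as_percent:
        # Percent change relative to the Before value (avoid divide by 0).
        change = np.where(before != 0, 100.0 * diff / before, 0.0)
    highlight = np.zeros(diff.shape, dtype=np.int8)
    highlight[change > increase_thresh] = 1
    highlight[change < -decrease_thresh] = -1
    return diff, highlight

before = np.array([[100, 100], [50, 200]])
after = np.array([[130, 95], [50, 120]])
diff, hl = image_difference(before, after, 20, 20)
print(hl)
```

The two color bars in the dialog simply assign display colors to the +1 and -1 codes of this highlight raster.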
Layer Stack
Layer stack lets you stack layers from different images in any order to
form a single theme. It is useful for combining different types of imagery
for analysis such as multispectral and radar data. For example, if you
stack three single-band grayscale images, you finish with one three-band image. In general, stacking images is most useful for combining grayscale single-band images into multiband images.
There are several applications of layer stack, such as change
visualization, combining and viewing multiple resolution data, and
viewing disparate data types. It is particularly useful if you receive a
multispectral dataset with each of the individual bands in separate files.
You can also use layer stack to analyze datasets taken during different
seasons when different sets show different stages for vegetation in an
area.
An example of a multispectral dataset with individual bands in separate
files is Landsat TM data. Layer stack quickly consolidates the bands of
data into one file.
The image on this page is an example of a layer stack output. The files
used are from the Amazon, and the red and blue bands are from one
image, while the green band is from the other. Bands 1 and 3 are taken
from the Amazon LBAND image, and the remaining layers taken from
Amazon TM.
Figure 41: Stacked Image
2. Click the browse button for the Input Images field and navigate to a
directory where the input images are stored.
3. Check the check boxes for the layers of the input images you want to
use.
4. Click the up and down arrows to the right of the Selected Layers box if
you want to reorder the layers.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Layer Stack dialog.
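Conceptually, the stack is just the bands arranged along a new axis, in the order you choose. A minimal NumPy sketch with three hypothetical single-band images:

```python
import numpy as np

# Three hypothetical single-band grayscale images of the same size,
# for example separate band files from one multispectral dataset.
band1 = np.array([[10, 20], [30, 40]])
band2 = np.array([[11, 21], [31, 41]])
band3 = np.array([[12, 22], [32, 42]])

# Stack them, in the order you choose, into one three-band image
# with shape (bands, rows, cols).
stacked = np.stack([band1, band2, band3], axis=0)
print(stacked.shape)
```

Reordering the list passed to np.stack corresponds to using the up and down arrows in the Selected Layers box.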
Rescale Image
The Rescale Image tool lets you rescale the pixel values of an input image into a new user-specified range for the output image. It also lets you change the data type of the output image. For example, you can rescale a 16-bit image with an original min-max of 0-65535 into an 8-bit image of 0-255 and vice versa. You can access the Rescale tool by selecting Utilities/Rescale Image from the Image Analysis dropdown list.
During conversion, the Rescale tool recomputes statistics for the input
image and scales the minimum and maximum values obtained into the
specified output values. A user-specified NoData value is also
assigned. If a NoData value is not required, N/A is specified for the
NoData value in the user interface (this is the same approach defined
for the ImageInfo tool).
You can use the Rescale tool by selecting an input image, and then
selecting the output data type. This populates the New Min and New
Max values to the minimum and maximum values appropriate for the
selected data type. Define the output file name and click OK.
Supported output data types are:
Unsigned 1 bit
Unsigned 2 bit
Unsigned 4 bit
Unsigned 8 bit
Signed 8 bit
Unsigned 16 bit
Signed 16 bit
Unsigned 32 bit
Signed 32 bit
Optionally, you can select a NoData value. However, if you do not want
to set a NoData value, leave it as N/A or blank in which case any
previously defined NoData value is not transferred to the output image.
A typical use for the Rescale tool is to rescale an unsigned 8-bit dataset
with valid values ranging from 0-255 into a range of 1-255, and at the
same time, set the NoData value to 0. This provides a means of
allocating a NoData value that does not interfere with valid data values.
One disadvantage of this technique for setting NoData is that pixel
values are altered, which might not be good for certain applications
such as classification.
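The linear scaling the tool performs can be sketched as follows. This is a conceptual illustration, not the tool's implementation; the sample image values are hypothetical:

```python
import numpy as np

def rescale(image, new_min, new_max, dtype):
    """Linearly rescale pixel values into [new_min, new_max].

    Mirrors the behavior described above: the minimum and maximum are
    computed from the input image, then scaled into the output range
    and cast to the requested data type.
    """
    old_min = float(image.min())
    old_max = float(image.max())
    scale = (new_max - new_min) / (old_max - old_min)
    out = (image.astype(np.float64) - old_min) * scale + new_min
    return np.round(out).astype(dtype)

# Rescale an unsigned 8-bit image from 0-255 into 1-255 so that 0 can
# be reserved as the NoData value, as in the typical use noted above.
img = np.array([[0, 128], [64, 255]], dtype=np.uint8)
scaled = rescale(img, 1, 255, np.uint8)
print(scaled)
```

Because every valid pixel is shifted into 1-255, the value 0 is freed up for NoData, at the cost of slightly altering the data values.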
3. Select an input image from the Input Image dropdown list, or navigate
to the directory where it is stored.
4. Select an output type from the Output Type dropdown list.
5. Type an output minimum and maximum in the Output Min and Output
Max fields.
6. Type a value in the NoData Value field if you want to specify a NoData
value.
7. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
8. Click OK to close the Rescale Image dialog.
Understanding Classification
Multispectral classification is the process of sorting pixels into a finite
number of individual classes, or categories of data, based on their data
file values. If a pixel satisfies a certain set of criteria, the pixel is
assigned to the class that corresponds to that criteria.
Depending on the type of information you want to extract from the
original data, classes can be associated with known features on the
ground or represent areas that look different to the computer. An
example of a classified image is a land cover map that shows
vegetation, bare land, pasture, urban, and so on.
This chapter covers the two ways to classify pixels into different
categories:
Unsupervised Classification
Supervised Classification
Classification Tips
Unsupervised Classification
Supervised Classification
The Classification Process
Training
Unsupervised Training
Supervised Training
Signatures
Decision Rule
After the signatures are defined, the pixels of the image are sorted into
classes based on the signatures by use of a classification decision rule.
The decision rule is a mathematical algorithm that, using data contained
in the signature, performs the actual sorting of pixels into distinct class
values.
Nonparametric Decision Rule
Minimum Distance
Mahalanobis Distance
Maximum Likelihood
Classification Tips
Classification Scheme
Supervised versus Unsupervised Classification
Classifying Enhanced Data
Limiting Dimensions
Although Image Analysis for ArcGIS lets you use an unlimited number
of layers of data for one classification, it is usually wise to reduce the
dimensionality of the data as much as possible. Often, certain layers of
data are redundant or extraneous to the task at hand. Unnecessary
data takes up valuable disk space, and causes the computer system to
perform more arduous calculations, which slows down processing.
Unsupervised Classification
Clusters
Clusters are defined with a clustering algorithm, which often uses all or
many of the pixels in the input data file for its analysis. The clustering
algorithm has no regard for the contiguity of the pixels that define each
cluster.
The Iterative Self-Organizing Data Analysis Technique (ISODATA)
(Tou and Gonzalez 1974) clustering method uses spectral distance as
in the sequential method. But it iteratively classifies the pixels, redefines
the criteria for each class, and classifies again, so that the spectral
distance patterns in the data gradually emerge.
ISODATA Clustering
[Figure: pixel data file values plotted in feature space, Band A vs. Band B]
Pixel Analysis
Pixels are analyzed beginning with the upper-left corner of the image
and going left to right, block-by-block.
The spectral distance between the candidate pixel and each cluster
mean is calculated. The pixel is assigned to the cluster whose mean is
the closest. The ISODATA function creates an output image file with a
thematic raster layer as a result of the clustering. At the end of each
iteration, an image file shows the assignments of the pixels to the
clusters.
[Figure: clusters 1 through 4 in feature space, Band A vs. Band B data file values]
For the second iteration, the means of all clusters are recalculated, causing them to shift in feature space. The entire process is repeated: each candidate pixel is compared to the new cluster means and assigned to the closest cluster mean.
[Figure: clusters 1 through 5 after the cluster means shift in feature space, Band A vs. Band B data file values]
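The assign-then-recalculate loop described above can be sketched in NumPy. This is a simplified illustration (the split and merge steps of full ISODATA are omitted, and the sample pixel values are hypothetical), not the tool's implementation:

```python
import numpy as np

def isodata_iterations(pixels, n_clusters, max_iter=10, threshold=0.95):
    """Simplified ISODATA-style clustering loop.

    pixels: (n, bands) array of data file values. Cluster means are
    seeded along the data's min-max range. Each iteration assigns every
    pixel to the nearest cluster mean by spectral distance, then
    recalculates the means. Iteration stops when the fraction of pixels
    whose assignment is unchanged reaches the threshold (the Percentage
    Unchanged criterion).
    """
    lo, hi = pixels.min(axis=0), pixels.max(axis=0)
    means = np.linspace(lo, hi, n_clusters)       # initial cluster means
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(max_iter):
        # Spectral distance from every pixel to every cluster mean.
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :],
                               axis=2)
        new_labels = dists.argmin(axis=1)
        unchanged = np.mean(new_labels == labels)
        labels = new_labels
        # Recalculate each cluster mean from its assigned pixels.
        for k in range(n_clusters):
            if np.any(labels == k):
                means[k] = pixels[labels == k].mean(axis=0)
        if unchanged >= threshold:
            break
    return labels, means

pixels = np.array([[10, 12], [11, 13], [50, 52], [52, 55]], dtype=float)
labels, means = isodata_iterations(pixels, 2)
print(labels)
```

The threshold parameter plays the role of the Percentage Unchanged stopping criterion discussed next.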
Percentage Unchanged
Performing Unsupervised Classification
2. Click the browse button for the Input Image field and navigate to the
directory where the input file is stored.
3. Type the number of classes you want in the Desired Number of Classes
field.
4. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
5. Click OK to close the Unsupervised Classification dialog.
Supervised Classification
What classes are most likely to be present in the data? That is,
which types of land cover, soil, or vegetation (or whatever) are
represented by the data?
Performing Supervised Classification
2. Click the browse button for the Input Image field and navigate to the
directory where the file is stored.
3. Click the browse button for the Signature Features field and navigate to
the directory where the file is stored.
4. Select the field that contains the class names from the Class Name
Field dropdown list.
5. Click either the All Features or Selected Features button to specify
which features to use during classification.
6. Select the rule you want to use from the Classification Rule dropdown
list.
Note: For more information about each option, see Classification
Decision Rules on page 157.
7. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
8. Click OK to close the Supervised Classification dialog.
Classification Decision Rules
Once you create and evaluate a set of reliable signatures, the next step
is to perform a classification of the data. Each pixel is analyzed
independently. The measurement vector for each pixel is compared to
each signature according to a decision rule or algorithm. Pixels that
pass the criteria established by the decision rule are then assigned to the class for that signature. Image Analysis for ArcGIS lets
you classify the data parametrically with statistical representation.
Parametric Rules
Nonparametric Rule
Minimum Distance
Mahalanobis Distance
Image Analysis for ArcGIS provides only one decision rule for nonparametric signatures:
Parallelepiped
[Figure: parallelepiped decision boundaries around a candidate pixel in feature space, with limits A1, A2, A3 on Band A and B1, B2, B3 on Band B data file values]
Minimum Distance
The spectral distance from a candidate pixel to each class mean is:

    SDxyc = sqrt( Σ (μci − Xxyi)² )   (summed over i = 1 to n)

Where:
n = number of bands (dimensions)
i = a particular band
c = a particular class
Xxyi = data file value of pixel x,y in band i
μci = mean of data file values in band i for the sample for class c
SDxyc = spectral distance from pixel x,y to the mean of class c
Source: Swain and Davis 1978
When spectral distance is computed for all possible values of c (all
possible classes) the class of the candidate pixel is assigned to the
class for which SD is the lowest.
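The rule can be sketched directly from the equation. The band means below are hypothetical signatures, not values from the software:

```python
import numpy as np

def minimum_distance_classify(pixel, class_means):
    """Assign a pixel to the class whose mean signature is spectrally closest.

    pixel: data file values Xxyi across n bands.
    class_means: (classes, n) array of band means (one signature per row).
    Computes SDxyc = sqrt(sum_i (mu_ci - Xxyi)^2) for every class and
    returns the class with the lowest SD.
    """
    sd = np.sqrt(((class_means - pixel) ** 2).sum(axis=1))
    return int(sd.argmin())

# Hypothetical 3-band mean signatures for three classes.
means = np.array([[20.0, 30.0, 10.0],      # class 0, e.g. water
                  [60.0, 90.0, 50.0],      # class 1, e.g. forest
                  [150.0, 140.0, 130.0]])  # class 2, e.g. urban
pixel = np.array([58.0, 85.0, 55.0])
print(minimum_distance_classify(pixel, means))
```

Here the candidate pixel falls nearest the second signature, so it is assigned class 1.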
Maximum Likelihood
The maximum likelihood decision rule is based on the following equation:

    D = ln(ac) − [0.5 ln(|Covc|)] − [0.5 (X − Mc)T (Covc-1) (X − Mc)]

Where:
D = weighted distance (likelihood)
c = a particular class
X = the measurement vector of the candidate pixel
Mc = the mean vector of the sample of class c
ac = percent probability that any candidate pixel is a member of class c (defaults to 1.0, or is entered from a priori data)
Covc = the covariance matrix of the pixels in the sample of class c
|Covc| = determinant of Covc (matrix algebra)
Covc-1 = inverse of Covc
T = transposition function (matrix algebra)
Mahalanobis Distance
The equation for the Mahalanobis distance classifier is:

    D = (X − Mc)T (Covc-1) (X − Mc)

Where:
D = Mahalanobis distance
c = a particular class
X = the measurement vector of the candidate pixel
Mc = the mean vector of the signature of class c
Covc = the covariance matrix of the pixels in the signature of class c
Covc-1 = inverse of Covc
T = transposition function
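A short sketch of the distance itself, using a hypothetical two-band signature (the mean and covariance values are illustrative, not from the software):

```python
import numpy as np

def mahalanobis_distance(x, mean, cov):
    """Compute D = (X - Mc)^T Covc^-1 (X - Mc) for one candidate pixel.

    x: measurement vector of the candidate pixel.
    mean: mean vector of the class signature.
    cov: covariance matrix of the signature's pixels.
    The pixel is assigned to the class with the lowest D.
    """
    d = x - mean
    return float(d @ np.linalg.inv(cov) @ d)

# Hypothetical two-band signature with uncorrelated bands.
mean = np.array([50.0, 80.0])
cov = np.array([[4.0, 0.0],
                [0.0, 9.0]])
x = np.array([52.0, 83.0])
print(mahalanobis_distance(x, mean, cov))  # (2^2)/4 + (3^2)/9 = 2.0
```

Unlike minimum distance, the covariance term weights each band by the signature's variability, so spread-out classes do not unfairly attract distant pixels.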
Parallelepiped
Using Conversion
The conversion feature gives you the ability to convert shapefiles to
raster images and raster images to shapefiles. This tool is very helpful
when you need to isolate or highlight certain parts of a raster image or
when you have a shapefile and you need to view it as a raster image.
Possible applications include viewing deforestation patterns, urban
sprawl, and shore erosion.
The Image Info tool that is discussed in Applying Data Tools on
page 51 is also an important part of raster/feature conversion. The
ability to assign certain pixel values as NoData is very helpful when
converting images.
IN THIS CHAPTER
Conversion
Conversion
Converting Raster to Features
Performing Raster to Features Conversion
2. Click the browse button for the Input Raster field and navigate to the
directory where the file is stored.
3. Select a field to use from the Field dropdown list.
4. Select Point, Polygon, or Polyline from the Output Geometry Type
dropdown list.
5. Check the Generalize Lines check box if you want to smooth out sharp
edges in the image.
6. Type the file name of the shapefile in the Output Features field, or
navigate to the directory where you want it stored.
7. Click OK to close the Raster to Features dialog.
Converting Features to Raster
You can convert polygons, polylines, or points from any source file to a
raster. You can convert features using both string and numeric fields.
Each unique string in a string field is assigned a unique value to the
output raster. A field is added to the table of the output raster to hold the
original string value from the features.
When you convert points, cells are given the value of the points found
within each cell. Cells that do not contain a point are given the value of
NoData. You can specify a cell size to use in the Features to Raster
dialog. Specify a cell size based on these factors:
Performing Features to Raster Conversion
2. Click the browse button for the Input features field and navigate to the
directory where the file is stored.
3. Select a field to use from the Field dropdown list.
4. Type an output cell size in the Output Cell Size field.
5. Click the browse button for the Output Image field and navigate to the
directory where you want the output image stored.
6. Click OK to close the Features to Raster dialog.
Rectification
GeoCorrection
SPOT
Polynomial Transformation
Rubber Sheeting
Camera
Landsat
Rectification
Mosaicking images
Before rectifying the data, consider the primary use of the database when selecting the optimum map projection and appropriate coordinate system. If you are doing a government project, the projection is often predetermined. A commonly used projection in the United States government is State Plane. Use an equal area projection for thematic or distribution maps, and conformal or equal area projections for presentation maps.
Consider the following before selecting a map projection:
Where on the globe is the study area? Polar regions and equatorial
regions require different projections for maximum accuracy.
What is the extent of the study area? Circular, north-south, east-west, and oblique areas may all require different projection systems (ESRI 1992).
Disadvantages of Rectification
During rectification, you must resample the data file values of rectified
pixels to fit into a new grid of pixel rows and columns. Although some of
the algorithms for calculating these values are highly reliable, you can
lose some spectral integrity of the data during rectification. If map
coordinates or map units are not needed in the application, it might be
wiser not to rectify the image. An unrectified image is more spectrally
correct than a rectified image.
Georeferencing
Georeferencing Only
This information is usually the same for each layer of an image file,
although it can be different. For example, the cell size of band 6 of
Landsat TM data is different from the cell size of the other bands.
GCPs are specific pixels in an image for which the output map
coordinates (or other output coordinates) are known. GCPs consist of
two X,Y pairs of coordinates:
Entering GCP Coordinates
Tolerance of RMSE
Use your mouse to select a pixel from an image in the view. With
both the source and destination views open, enter source
coordinates and reference coordinates for image-to-image
registration.
Classification
Thematic Files
Orthorectification
GeoCorrection
The GeoCorrection Properties Dialog
General Tab
Link Coloring Lets you set a threshold and select or change link
colors.
Displayed Units Lets you view the horizontal and vertical units if
they are known. Often only one is known so it might display Meters
for vertical units and Unknown for horizontal units. Display units do
not have an effect on the original data in latitude/longitude format.
The image in the view does not show the changes either.
Links Tab
Note: Before adding links or editing the links table, you must select the
coordinate system in which you want to store the link coordinates.
To select a coordinate system, follow these steps:
1. Right-click in the view and select Data Frame Properties to open the
Data Frame Properties dialog.
2. Click the Coordinate System tab.
2. Select your model type from the Model Types dropdown list.
3. Click the Add Links button to set your new links.
To proof and edit the coordinates of the links as you enter them, follow
these steps:
1. Click the GeoCorrection Properties button.
2. Click the Links tab.
The coordinates display in the CellArray on this tab.
Elevation Tab
If you do not have an elevation file, click the Constant button to change
the settings in the Elevation Source box and specify the elevation value
and units. Use the constant value that is the average ground elevation
for the entire scene.
Figure 45: Elevation Source Constant Settings
Note: You can also check the Account for Earth's Curvature check box if you want to use this option as part of the elevation.
The following steps take you through the Elevation tab. The first set of
instructions uses File as the elevation source. The second set uses
Constant as the elevation source.
To use a file value as the elevation source, follow these steps:
1. Click the File button in the Elevation Source box.
2. Type the name of the file in the Elevation File field, or navigate to the
directory where it is stored.
3. Select Feet or Meters from the Elevation Units dropdown list.
4. Check the Account for Earth's Curvature check box.
5. Click Apply to set the elevation source.
6. Click OK to close the dialog.
To use a constant value as the elevation source, follow these steps:
1. Click the Constant button in the Elevation Source box.
SPOT
Panchromatic
XS
The XS sensor records the following bands:

Band              Wavelength (Microns)
1, Green          0.50 to 0.59 μm
2, Red            0.61 to 0.68 μm
3, Reflective IR  0.79 to 0.89 μm

Sensor characteristics:
Panchromatic: 1 band, 1 pixel = 10 m x 10 m
XS: 3 bands, 1 pixel = 20 m x 20 m
Radiometric resolution: 0-255
Stereoscopic pairs are available
SPOT 4

Band           Wavelength
1, Green       0.50 to 0.59 μm
2, Red         0.61 to 0.68 μm
3, Near-IR     0.78 to 0.89 μm
4, Mid-IR      1.58 to 1.75 μm
Panchromatic   0.61 to 0.68 μm
Polynomial Transformation
[Figure: GCPs plotted against the polynomial curve fitted to them, source X coordinate vs. reference X coordinate; the transformation matrix is derived from this fit]

Every GCP influences the coefficients, even if there isn't a perfect fit of each GCP to the polynomial that the coefficients represent. The distance between the GCP reference coordinate and the curve is called RMSE, which is discussed later in Camera on page 177.
Linear Transformations
Location in X or Y
Scale in X or Y
Skew in X or Y
Rotation
You can also use a 1st order transformation for data already projected
onto a plane. For example, SPOT and Landsat Level 1B data is already
transformed to a plane, but might not be rectified to the map projection
you want. When doing this type of rectification, it is not advisable to increase the order of the transformation simply because a high RMSE occurs at first.
Examine other factors, such as the GCP source and distribution, and
then look for systematic errors.
The transformation matrix for a 1st order transformation consists of six coefficients, three for each coordinate (X and Y):

    a0  a1  a2
    b0  b1  b2

which are used in a 1st order polynomial as follows:

    xo = a0 + a1x + a2y
    yo = b0 + b1x + b2y

Where:
x and y are source coordinates (input)
xo and yo are rectified coordinates (output)
The coefficients of the transformation matrix are as above.
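The 1st order polynomial above can be sketched directly; the coefficient values here are hypothetical, chosen to show a shift and a uniform scale:

```python
import numpy as np

def first_order_transform(coeffs_x, coeffs_y, x, y):
    """Apply a 1st order (linear) transformation to source coordinates.

    coeffs_x = (a0, a1, a2) and coeffs_y = (b0, b1, b2) are the six
    coefficients of the transformation matrix:
        xo = a0 + a1*x + a2*y
        yo = b0 + b1*x + b2*y
    """
    a0, a1, a2 = coeffs_x
    b0, b1, b2 = coeffs_y
    return a0 + a1 * x + a2 * y, b0 + b1 * x + b2 * y

# Hypothetical transformation: shift by (100, 200) and scale by 2.
xo, yo = first_order_transform((100, 2, 0), (200, 0, 2), 10, 20)
print(xo, yo)
```

With a2 = b1 = 0 and a1 = b2, the transformation reduces to scale plus shift; nonzero cross terms introduce the skew and rotation listed above.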
Nonlinear Transformations
Transformations of the 2nd order or higher are nonlinear. The number of coefficients in one set for a transformation of order t is:

    Σ (i + 1)   (summed over i = 0 to t)

It is multiplied by two for the two sets of coefficients, one set for X and one for Y.
An easier way to arrive at the same number is:

    (t + 1) x (t + 2)

Clearly, the size of the transformation matrix increases with the order of the transformation.
High-Order Polynomials
The polynomial equations for a transformation of order t take this form:

    xo = Σ(i=0 to t) Σ(j=0 to i) of ak · x^(i−j) · y^j
    yo = Σ(i=0 to t) Σ(j=0 to i) of bk · x^(i−j) · y^j

Where:
t is the order of the polynomial
a and b are coefficients
The subscript k in a and b is determined by:

    k = i · (i + 1) / 2 + j
Effects of Order
The GCPs in this example have source X coordinates 1, 2, and 3 and reference (output) X coordinates 17, 9, and 1. A 1st order polynomial fits them exactly:

    xr = (25) + (-8)xi

Where:
xr = Reference X coordinate
xi = Source X coordinate

This equation takes on the same format as the equation of a line (y = mx + b). In mathematical terms, a 1st order polynomial is linear. Therefore, a 1st order transformation is also known as a linear transformation. This equation is graphed below:

[Figure: the line xr = (25) + (-8)xi, source X coordinate vs. reference X coordinate]
[Figure: three GCPs with source X coordinates 1, 2, 3 and reference X coordinates 17, 7, 1]

A line cannot connect these points, which illustrates why they are not expressed by a 1st order polynomial like the graph on the left. In this case, a 2nd order polynomial equation expresses these points:

    xr = (31) + (-16)xi + (2)xi²

Polynomials of the 2nd order or higher are nonlinear. The graph of this curve is drawn below:

[Figure: the curve xr = (31) + (-16)xi + (2)xi² passing through the three GCPs]
[Figure: a fourth GCP at (4, 5) added to the previous three GCPs; it does not fall on the 2nd order curve]

As illustrated in the graph above, this fourth GCP does not fit on the curve of the 2nd order polynomial equation. You must increase the order of the transformation to the 3rd order to ensure that all the GCPs fit. The equation and graph below are possible results:

    xr = (25) + (-5)xi + (-4)xi² + (1)xi³

[Figure: a 3rd order curve passing through all four GCPs]
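The fit in this example can be reproduced with NumPy's polynomial tools, using the four GCP values shown above:

```python
import numpy as np

# The four GCPs from the example: source X vs. reference X coordinates.
src = np.array([1.0, 2.0, 3.0, 4.0])
ref = np.array([17.0, 7.0, 1.0, 5.0])

# A 3rd order polynomial is needed to pass through all four points
# exactly; np.polyfit returns coefficients highest order first.
coeffs = np.polyfit(src, ref, 3)
fitted = np.polyval(coeffs, src)

# Reversed (constant term first) this recovers the coefficients of
# the 3rd order equation above: 25, -5, -4, 1.
print(np.round(coeffs[::-1], 6))
```

Fitting the same points with degree 2 would leave a residual at (4, 5), which is exactly the situation the text describes.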
In this example, the output X coordinates of the four GCPs are:

    xo(1) = 17
    xo(2) = 7
    xo(3) = 1
    xo(4) = 5

so that:

    xo(1) > xo(2) > xo(4) > xo(3)
    17 > 7 > 5 > 1

The transformed coordinates are no longer in the same order as the source coordinates; a 3rd order transformation of this kind can change the relative order of the pixels.

[Figure: input image X coordinates 1 through 4 mapped to output image X coordinates, showing the change in pixel order]
Minimum Number of GCPs
Higher orders of transformation require more GCPs. For a transformation of order t, the minimum number of GCPs is (t + 1)(t + 2) / 2, the number of coefficients in one set:

Order of Transformation    Minimum GCPs Required
1                          3
2                          6
3                          10
4                          15
5                          21
6                          28
7                          36
8                          45
9                          55
10                         66
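The table values follow directly from the coefficient count given earlier, as this one-line sketch shows:

```python
def minimum_gcps(t):
    """Minimum number of GCPs for a transformation of order t.

    A transformation of order t has (t + 1)(t + 2) / 2 coefficients
    in each set, so at least that many GCPs are required to solve
    for them.
    """
    return (t + 1) * (t + 2) // 2

print([minimum_gcps(t) for t in range(1, 11)])
# [3, 6, 10, 15, 21, 28, 36, 45, 55, 66]
```

In practice you should use more than the minimum so that the least-squares fit can average out GCP errors.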
The Polynomial Properties Dialog Box
2. Click OK.
Rubber Sheeting
Triangulation
Triangle-Based Rectification
Once the triangle mesh is generated and the spatial order of the control
points is available, you can perform the geometric rectification on a
triangle-by-triangle basis. This triangle-based method is appealing
because it breaks the region into smaller subsets. If the geometric
problem of the region is very complicated, the geometry of each subset
is much simpler and modeled through simple transformation.
For each triangle, you can use the polynomials as the general
transformation form between source and destination systems.
Linear Transformation
    xo = a0 + a1x + a2y
    yo = b0 + b1x + b2y
Nonlinear Transformation

    xo = Σ(i=0 to t) Σ(j=0 to i) of ak · x^(i−j) · y^j
    yo = Σ(i=0 to t) Σ(j=0 to i) of bk · x^(i−j) · y^j
Checkpoint Analysis
Camera
Orientation Tab
The Orientation tab lets you choose rotation angles and center
positions for the camera.
Figure 47: Orientation Tab
The rotation angle lets you customize the Omega, Phi, and Kappa
rotation angles of the image to determine the viewing direction of the
camera. You can choose from the following options:
Phi Phi rotation angle is pitch (around the y-axis (after Omega
rotation)).
The perspective center position is given in meters and lets you enter the
perspective center for ground coordinates. You can choose from the
following options:
Note: If you fill in all the degrees and meters for the rotation angle and
the perspective center position, you do not need the three links normally
required for the Camera model. If you fill in this information, do not
check the Account for Earth's Curvature check box on the Elevation tab.
Camera Tab
The next to last tab on the Camera Properties dialog is also called
Camera. This is where you specify the camera name, the number of
fiducials, the principal point, and the focal length for the camera that
was used to capture your image.
Figure 48: Camera Tab
You can click Load or Save to open or save a file with camera
information in it.
Fiducials Tab
The last tab on the Camera Properties dialog is the Fiducials tab.
Fiducials are used to compute the transformation from data file to image
coordinates.
Figure 49: Fiducials Tab
Fiducial orientation defines the relationship between the image/photo-coordinate system of a frame and the actual image orientation as it displays in a view. The image/photo-coordinate system is defined by the camera calibration information. The orientation of the image is largely dependent on the way the photograph is scanned during the digitization stage.
The fiducials for your image are fixed on the frame and are visible in the
exposure. The fiducial information you enter on the Camera tab
displays in a CellArray on the Fiducials tab after you click the Apply
button in the Camera Properties dialog.
Compare the axis of the photo-coordinate system (defined in the
calibration report) with the orientation of the image to select the
appropriate fiducial orientation. Based on the relationship between the
photo-coordinate system and the image, you can select the appropriate
fiducial orientation. Do not use more than eight fiducials in an image.
IKONOS, QuickBird, and RPC Properties
Note: It is very important that you click the Add Links button before
clicking the GeoCorrection Properties button to open one of these
properties dialogs.
IKONOS

Band           Wavelength (Microns)
1, Blue        0.45 to 0.52 μm
2, Green       0.52 to 0.60 μm
3, Red         0.63 to 0.69 μm
4, NIR         0.76 to 0.90 μm
Panchromatic   0.45 to 0.90 μm
The IKONOS Properties dialog lets you rectify IKONOS images from
the satellite. Like the other properties dialogs in GeoCorrection
Properties, IKONOS has General, Links, and Elevation tabs as well as
Parameters and Chipping tabs.
The RPC file is generated by the data provider based on the position of
the satellite at the time of image capture. You can further refine the
RPCs by using GCPs. Locate this file in the same directory as the
image you intend to use in the GeoCorrection process.
QuickBird

Band           Wavelength (Microns)
1, Blue        0.45 to 0.52 μm
2, Green       0.52 to 0.60 μm
3, Red         0.63 to 0.69 μm
4, NIR         0.76 to 0.90 μm
RPC
RPC properties let you specify the associated RPC file to use in
geocorrection. RPC properties in Image Analysis for ArcGIS let you
work with NITF data.
NITF data is designed to pack numerous image compositions with
complete annotation, text attachments, and imagery-associated
metadata.
The RPC file associated with the image contains rational function
polynomial coefficients generated by the data provider based on the
position of the satellite at the time of image capture. You can further
refine these RPCs by using GCPs. Locate this file in the same directory
as the images you intend to use in orthorectification.
Just like IKONOS and QuickBird, the RPC Properties dialog contains
the Parameters and Chipping tabs. These work the same way in all
three model properties.
There is also a check box for Refinement with Polynomial Order. This
is provided so you can apply polynomial corrections to the original
rational function model. This setting corrects the remaining error and
refines the mathematical solution. Check the Refinement with
Polynomial Order check box to enable the refinement process, and then
specify the order by clicking the arrows.
The 0 order results in a simple shift of both image X and Y coordinates.
The 1st order is an affine transformation. The 2nd order results in a 2nd
order transformation, and the 3rd order in a 3rd order transformation.
Usually, a 0 or 1st order is sufficient to reduce error not addressed by
the rational function model (RPC file).
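As an illustration of the first two refinement orders, the sketch below shows the corrections they apply. The function and coefficient names are hypothetical, for illustration only; in practice the coefficient values are solved from your GCPs.

```python
def refine_order0(x, y, dx, dy):
    # A 0 order refinement is a simple shift of both image coordinates.
    return x + dx, y + dy

def refine_order1(x, y, a, b):
    # A 1st order refinement is an affine transformation:
    #   x_out = a0 + a1*x + a2*y
    #   y_out = b0 + b1*x + b2*y
    a0, a1, a2 = a
    b0, b1, b2 = b
    return a0 + a1 * x + a2 * y, b0 + b1 * x + b2 * y
```

With zero shift and identity affine coefficients, both functions return the input coordinates unchanged, which is why a 0 or 1st order refinement only nudges the rational function solution rather than replacing it.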
The fields in the Elevation Range box are automatically populated by
the RPC file.
You are given the choice of Scale and Offset or Arbitrary Affine as your
chipping parameters on the Chipping tab. The tab changes depending
on which chipping parameter you select from the Specify Chipping
Parameters As dropdown list, as described below.
Full Row Count and Full Column Count fields are located at the bottom
of the Chipping tab. If the chip header contains the appropriate data, the
Full Row Count value is the row count of the full, original image. If the
header count is absent, this value corresponds to the row count of the
chip.
Scale and Offset is the simpler of the two chipping parameters. The
formulas for calculating the affine using scale and offset are listed in a
box on the Chipping tab.
Figure 52: Chipping Tab using Scale and Offset
X and Y correspond to the pixel coordinates of the full, original image.
Arbitrary Affine
The Arbitrary Affine formulas display in the box on the Chipping tab
when you select that option from the Specify chipping parameters as
dropdown list.
In the formulas, x′ (x prime) and y′ (y prime) correspond to the pixel
coordinates in the chip with which you are currently working. Values for
the variables are either obtained from the header data of the chip, or
they default to the predetermined values described above.
The following is an example of the Arbitrary Affine settings on the
Chipping tab.
Figure 53: Chipping Tab using Arbitrary Affine
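The two chipping parameterizations can be sketched as follows. The function and variable names here are illustrative, not the dialog's field names: Scale and Offset applies an independent scale and offset per axis, while Arbitrary Affine uses six general coefficients.

```python
def chip_to_full_scale_offset(x_chip, y_chip, sx, sy, ox, oy):
    # Scale and Offset: each chip axis is scaled and shifted
    # independently to recover full-image pixel coordinates.
    return sx * x_chip + ox, sy * y_chip + oy

def chip_to_full_affine(x_chip, y_chip, a, b, c, d, e, f):
    # Arbitrary Affine: X = a + b*x' + c*y', Y = d + e*x' + f*y'
    return a + b * x_chip + c * y_chip, d + e * x_chip + f * y_chip
```

Note that Scale and Offset is just the special case of the affine in which the cross terms (c and e above) are zero, which is why it is the simpler of the two parameterizations.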
Landsat
Landsat 1-5
MSS
Bands 1 and 2 are in the visible portion of the spectrum and are
useful for detecting cultural features, such as roads. These bands
also show detail in water.
You can use different color schemes to enhance the features under
study. These are by no means all of the useful combinations of the
bands. The particular application determines the bands to use.
TM
Band      Wavelength (Microns)
1, Blue   0.45 to 0.52
2, Green  0.52 to 0.60
3, Red    0.63 to 0.69
4, NIR    0.76 to 0.90
5, MIR    1.55 to 1.75
6, TIR    10.4 to 12.5
7, MIR    2.08 to 2.35
                        MSS            TM
Radiometric Resolution  0-127          0-255
Bands                   4              7
Pixel Size              57 m x 79 m    30 m x 30 m
You can use different color schemes to bring out or enhance the
features under study. These are by no means all of the useful
combinations of the seven bands. The application determines the
bands to use.
Landsat 7
One type of data available from Landsat 7 is browse data: a lower
resolution image used to determine image location, quality, and
information content. Another type of data is metadata, which is
descriptive information about the image. This information is available
via the internet within 24 hours of being received by the primary ground
station. EDC also processes the data to Level 0r, which is corrected for
scan direction and band alignment errors only. Level 1G data, which is
corrected, is also available.
Landsat 7 Specifications
Information about the spectral range and ground resolution of the bands
of the Landsat 7 satellite is provided in the following table:
Table 12: Landsat 7 Characteristics
Band Number       Wavelength (Microns)   Resolution (m)
1                 0.45 to 0.52           30
2                 0.52 to 0.60           30
3                 0.63 to 0.69           30
4                 0.76 to 0.90           30
5                 1.55 to 1.75           30
6                 10.4 to 12.5           60
7                 2.08 to 2.35           30
Panchromatic (8)  0.50 to 0.90           15
Parameters Tab
Glossary
abstract symbol
An annotation symbol that has a geometric shape, such as a
circle, square, or triangle. These symbols often represent
amounts that vary from place to place, such as population
density, yearly rainfall, and so on.
accuracy assessment
The comparison of a classification to geographical data that is
assumed to be true. Usually, the assumed true data is derived
from ground truthing.
American Standard Code for Information Interchange
A basis of character sets...to convey some control codes,
space, numbers, most basic punctuation, and unaccented
letters a-z and A-Z.
analysis mask
An option that uses a raster dataset in which all cells of interest
have a value and all other cells have no data. Analysis mask
lets you perform analysis on a selected set of cells.
ancillary data
The data, other than remotely sensed data, that is used to aid
in the classification process.
annotation
The explanatory material accompanying an image or a map.
Annotation can consist of lines, text, polygons, ellipses,
rectangles, legends, scale bars, and any symbol that denotes
geographical features.
AOI
See area of interest.
a priori
Already or previously known.
area
A measurement of a surface.
area of interest
A point, line, or polygon that is selected as a training sample or
as the image area to use in an operation.
ASCII
See American Standard Code for Information Interchange.
aspect
The orientation, or the direction that a surface faces, with
respect to the directions of the compass: north, south, east,
west.
attribute
The tabular information associated with a raster or vector layer.
average
The statistical mean; the sum of a set of values divided by the
number of values in the set.
band
A set of data file values for a specific portion of the
electromagnetic spectrum of reflected light or emitted heat (red,
green, blue, near-infrared, infrared, thermal, and so on) or
some other user-defined information created by combining or
enhancing the original bands, or creating new bands from other
sources. Sometimes called channel.
bilinear interpolation
Uses the data file values of four pixels in a 2 × 2 window to
calculate an output value with a bilinear function.
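A minimal sketch of the idea (not Image Analysis for ArcGIS code): given the four surrounding data file values and the fractional position of the output location within the 2 × 2 window, the output value is a distance-weighted blend.

```python
def bilinear(window, dx, dy):
    # window: 2 x 2 list [[v00, v01], [v10, v11]] of data file values.
    # dx, dy: fractional offsets (0..1) of the output point within
    # the window, measured from the upper-left value.
    top = window[0][0] * (1 - dx) + window[0][1] * dx
    bottom = window[1][0] * (1 - dx) + window[1][1] * dx
    return top * (1 - dy) + bottom * dy
```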
bin function
A mathematical function that establishes the relationship
between data file values and rows in a descriptor table.
bins
Ordered sets of pixels. Pixels are sorted into a specified
number of bins. The pixels are then given new values based
upon the bins to which they are assigned.
border
On a map, a line that usually encloses the entire map, not just
the image area as does a neatline.
boundary
A neighborhood analysis technique that is used to detect
boundaries between thematic classes.
brightness value
The quantity of a primary color (red, green, blue) for a pixel on
the display device. Also called intensity value, function memory
value, pixel value, display value, and screen value.
buffer zone
A specific area around a feature that is isolated for or from
further analysis. For example, buffer zones are often generated
around streams in site assessment studies so that further
analyses exclude these areas that are often unsuitable for
development.
Cartesian
A coordinate system in which data is organized on a grid and
points on the grid are referenced by their X,Y coordinates.
camera properties
Camera properties are for the orthorectification of any image
that uses a camera for its sensor. The model is derived by
space resection based on collinearity equations. The elevation
information is required in the model for removing relief
displacement.
categorize
The process of choosing distinct classes to divide your image
into.
cell
1. A 1° × 1° area of coverage. Digital terrain elevation data
(DTED) is distributed in cells. 2. A pixel; grid cell.
cell size
The area that one pixel represents, measured in map units. For
example, one cell in the image may represent an area 30 × 30
on the ground. Sometimes called the pixel size.
checkpoint analysis
The act of using checkpoints to independently verify the degree
of accuracy of a triangulation.
circumcircle
A triangle's circumscribed circle; the circle that passes through
each of the triangle's three vertices.
class
A set of pixels in a GIS file that represents areas that share
some condition. Classes are usually formed through
classification of a continuous raster layer.
class value
A data file value of a thematic file that identifies a pixel as
belonging to a particular class.
classification
The process of assigning the pixels of a continuous raster
image to discrete categories.
classification accuracy table
For accuracy assessment, a list of known values of reference
pixels, supported by some ground truth or other a priori
knowledge of the true class, and a list of the classified values of
the same pixels, from a classified file to be tested.
classification scheme (or classification system)
A set of target classes. The purpose of such a scheme is to
provide a framework for organizing and categorizing the
information that is extracted from the data.
clustering
Unsupervised training; the process of generating signatures
based on the natural groupings of pixels in image data when
they are plotted in spectral space.
clusters
The natural groupings of pixels when plotted in spectral space.
coefficient
One number in a matrix, or a constant in a polynomial
expression.
collinearity
A nonlinear mathematical model that photogrammetric
triangulation is based upon. Collinearity equations describe the
relationship among image coordinates, ground coordinates,
and orientation parameters.
contiguity analysis
A study of the ways in which pixels of a class are grouped
together spatially. Groups of contiguous pixels in the same
class, called raster regions, or clumps, can be identified by their
sizes and manipulated.
continuous
A term used to describe raster data layers that contain
quantitative and related values. See also continuous data.
continuous data
A type of raster data that is quantitative (measuring a
characteristic) and has related, continuous values, such as
remotely sensed images (Landsat, SPOT, and so on).
contrast stretch
The process of reassigning a range of values to another range,
usually according to a linear function. Contrast stretching is
often used in displaying continuous raster layers because the
range of data file values is usually much narrower than the
range of brightness values on the display device.
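As a sketch of a linear stretch, the hypothetical function below maps an input data range onto an output brightness range (the 0 to 255 default is illustrative):

```python
def linear_stretch(value, in_min, in_max, out_min=0, out_max=255):
    # Map value linearly from [in_min, in_max] to [out_min, out_max],
    # so a narrow data range fills the full brightness range.
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)
```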
convolution filtering
The process of averaging small sets of pixels across an image.
Used to change the spatial frequency characteristics of an
image.
convolution kernel
A matrix of numbers that is used to average the value of each
pixel with the values of surrounding pixels in a particular way.
The numbers in the matrix serve to weight this average towards
particular pixels.
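As a simplified sketch of how a kernel weights the average for a single pixel (a real implementation must also choose an edge-handling strategy; here out-of-image neighbors are simply skipped):

```python
def convolve_pixel(image, row, col, kernel):
    # Weighted average of a pixel and its neighbors; kernel is an
    # n x n matrix of weights (n odd), image a list of rows.
    half = len(kernel) // 2
    total = weight_sum = 0.0
    for i, krow in enumerate(kernel):
        for j, w in enumerate(krow):
            r, c = row + i - half, col + j - half
            if 0 <= r < len(image) and 0 <= c < len(image[0]):
                total += w * image[r][c]
                weight_sum += w
    # Normalize by the weight sum so overall brightness is preserved;
    # zero-sum kernels (edge detectors) are left unnormalized.
    return total / weight_sum if weight_sum else total
```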
coordinate system
A method of expressing location. In two-dimensional
coordinate systems, locations are expressed by a column and
row, also called X and Y.
correlation threshold
A value used in rectification to determine whether to accept or
discard GCPs. The threshold is an absolute value threshold
ranging from 0.000 to 1.000.
correlation windows
Windows that consist of a local neighborhood of pixels.
corresponding GCPs
The GCPs that are located in the same geographic location as
the selected GCPs, but are selected in different files.
covariance
Measures the tendencies of data file values for the same pixel,
but in different bands, to vary with each other in relation to the
means of their respective bands. These bands must be linear.
Covariance is defined as the average product of the differences
between the data file values in each band and the mean of
each band.
covariance matrix
A square matrix that contains all of the variances and
covariances within the bands in a data file.
cubic convolution
Uses the data file values of sixteen pixels in a 4 × 4 window to
calculate an output value with a cubic function.
data
1. In the context of remote sensing, a computer file containing
numbers that represent a remotely sensed image, and can be
processed to display that image. 2. A collection of numbers,
strings, or facts that requires some processing before it is
meaningful.
database
A relational data structure usually used to store tabular
information. Examples of popular databases include SYBASE,
dBASE, Oracle, and INFO.
data file
A computer file that contains numbers that represent an image.
data file value
Each number in an image file. Also called file value, image file
value, DN, brightness value, pixel.
decision rule
An equation or algorithm that is used to classify image data
after signatures are created. The decision rule is used to
process the data file values based on the signature statistics.
DEM
See digital elevation model.
density
A neighborhood analysis technique that displays the number of
pixels that have the same value as the analyzed pixel in a
user-specified window.
digital elevation model
Continuous raster layers in which data file values represent
elevation. DEMs are available from the USGS at 1:24,000 and
1:250,000 scale, and can be produced with terrain analysis
programs.
digital terrain model
A discrete expression of topography in a data array, consisting
of a group of planimetric coordinates (X,Y) and the elevations
of the ground points and breaklines.
dimensionality
In classification, dimensionality refers to the number of layers
being classified. For example, a data file with three layers is
said to be three-dimensional.
divergence
A statistical measure of distance between two or more
signatures. Divergence can be calculated for any combination
of bands used in the classification; bands that diminish the
results of the classification can be ruled out.
diversity
A neighborhood analysis technique that displays the number of
different values in a user-specified window.
DTM
See digital terrain model.
edge detector
A convolution kernel, which is usually a zero sum kernel, that
smooths out or zeros out areas of low spatial frequency and
creates a sharp contrast where spatial frequency is high. High
spatial frequency is at the edges between homogeneous
groups of pixels.
edge enhancer
A high-frequency convolution kernel that brings out the edges
between homogeneous groups of pixels. Unlike an edge
detector, it only highlights edges; it does not eliminate other
features.
enhancement
The process of making an image more interpretable for a
particular application. Enhancement can make important
features of raw, remotely sensed data more interpretable to the
human eye.
extension
The three letters after the period in a file name that usually
identify the type of file.
extent
1. The image area to display in a view. 2. The area of the
Earth's surface to map.
feature collection
The process of identifying, delineating, and labeling various
types of natural and human-made phenomena from remotely
sensed images.
feature extraction
The process of studying and locating areas and objects on the
ground and deriving useful information from images.
feature space
An abstract space that is defined by spectral units (such as an
amount of electromagnetic radiation).
fiducial center
The center of an aerial photo.
fiducials
Four or eight reference markers fixed on the frame of an aerial
metric camera and visible in each exposure that are used to
compute the transformation from data file to image coordinates.
file coordinates
The location of a pixel within the file in X,Y coordinates. The
upper-left file coordinate is usually 0,0.
filtering
The removal of spatial or spectral features for data
enhancement. Convolution filtering is one method of spatial
filtering. Some texts use the terms filtering and spatial filtering
synonymously.
focal
The process of performing one of several analyses on data
values in an image file, using a process similar to convolution
filtering.
GCP
See ground control point.
GCP matching
For image-to-image rectification, a GCP selected in one image
is precisely matched to its counterpart in the other image using
the spectral characteristics of the data and the transformation
matrix.
geocorrection
The process of rectifying remotely sensed data that has
distortions due to a sensor or the curvature of the Earth.
ISODATA clustering
A clustering method that iteratively classifies the pixels,
redefines the criteria for each class, and classifies again, so that
the spectral distance patterns in the data gradually emerge.
Landsat
A series of Earth-orbiting satellites, operated by EOSAT, that
gather MSS and TM imagery.
layer
1. A band or channel of data. 2. A single band or set of three
bands displayed using the red, green, and blue color guns. 3. A
component of a GIS database that contains all of the data for
one theme. A layer consists of a thematic image file, and may
also include attributes.
linear
A description of a function that can be graphed as a straight
line or a series of lines. Linear equations (transformations) are
generally expressed in the form of the equation of a line or
plane. Also called 1st order.
linear contrast stretch
An enhancement technique that produces new values at
regular intervals.
linear transformation
A 1st order rectification. A linear transformation can change
location in X or Y, scale in X or Y, skew in X or Y, and rotation.
lookup table
An ordered set of numbers that is used to perform a function on
a set of input values. Lookup tables translate data file values
into brightness values to display or print an image.
low-frequency kernel
A convolution kernel that decreases spatial frequency. Also
called low-pass kernel.
LUT
See lookup table.
majority
A neighborhood analysis technique that displays the most
common value of the data file values in a user-specified
window.
map projection
A method of representing the three-dimensional spherical
surface of a planet on a two-dimensional map surface.
mosaicking
The process of piecing together images side-by-side to create
a larger image.
MSS
See multispectral scanner.
multispectral classification
The process of sorting pixels into a finite number of individual
classes, or categories of data, based on data file values in
multiple bands.
multispectral imagery
Satellite imagery with data recorded in two or more bands.
multispectral scanner
Landsat satellite data acquired in four bands with a spatial
resolution of 57 × 79 meters.
nadir
The area on the ground directly beneath a scanner's detectors.
NDVI
See normalized difference vegetation index.
nearest neighbor
A resampling method in which the output data file value is
equal to the input pixel that has coordinates closest to the
retransformed coordinates of the output pixel.
neighborhood analysis
Any image processing technique that takes surrounding pixels
into consideration, such as convolution filtering and scanning.
NoData
NoData is the value assigned to pixels that you do not want to
include in a classification or function. Pixels assigned NoData
are not given a value. Images that are georeferenced to
non-rectangular extents need a NoData concept for display even
if they are not classified; the values given to NoData pixels are
understood to be placeholders.
non-directional
The process using the Sobel and Prewitt filters for edge
detection. These filters use orthogonal kernels convolved
separately with the original image and then combined.
nonlinear
Describing a function that cannot be expressed as the graph of
a line or in the form of the equation of a line or plane. Nonlinear
equations usually contain expressions with exponents. Second
order (2nd order) or higher-order equations and
transformations are nonlinear.
nonlinear transformation
A 2nd order or higher rectification.
nonparametric signature
A signature for classification that is based on polygons or
rectangles that are defined in the feature space image for the
image file. There is no statistical basis for a nonparametric
signature; it is an area in a feature space image.
normalized difference vegetation index
The formula for NDVI is (IR - R) / (IR + R), where IR stands for
the infrared portion of the electromagnetic spectrum, and R
stands for the red portion. NDVI finds areas of vegetation in
imagery.
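A minimal sketch of the calculation for a single pixel (the division-by-zero guard is an illustrative choice of this sketch, not part of the formula):

```python
def ndvi(ir, r):
    # NDVI = (IR - R) / (IR + R); values range from -1 to +1,
    # with higher values indicating denser green vegetation.
    if ir + r == 0:
        return 0.0  # guard for completely dark pixels (assumption)
    return (ir - r) / (ir + r)
```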
observation
In photogrammetric triangulation, a grouping of the image
coordinates for a GCP.
off-nadir
Any point that is not directly beneath a scanner's detectors, but
off at an angle. The SPOT scanner allows off-nadir viewing.
orthorectification
A form of rectification that corrects for terrain displacement and
is used if a DEM of the study area is available.
overlay
1. A function that creates a composite file containing either the
minimum or the maximum class values of the input files.
Overlay sometimes refers generically to a combination of
layers. 2. The process of displaying a classified file over the
original image to inspect the classification.
panchromatic imagery
Single-band or monochrome satellite imagery.
parallelepiped
1. A classification decision rule in which the data file values of
the candidate pixel are compared to upper and lower limits.
radiometric resolution
The dynamic range, or number of possible data file values, in
each band. This is referred to by the number of bits into which
the recorded energy is divided. See also pixel depth.
rank
A neighborhood analysis technique that displays the number of
values in a user-specified window that are less than the
analyzed value.
raster data
Data that is organized in a grid of columns and rows. Raster
data usually represents a planar graph or geographical area.
ratio data
A data type in which thematic class values have the same
properties as interval values, except that ratio values have a
natural zero or starting point.
rational polynomial coefficients
See RPC properties.
recoding
The assignment of new values to one or more classes.
rectification
The process of making image data conform to a map projection
system. In many cases, the image must also be oriented so
that the north direction corresponds to the top of the image.
rectified coordinates
The coordinates of a pixel in a file that have been rectified,
which are extrapolated from the GCPs. Ideally, the rectified
coordinates for the GCPs are exactly equal to the reference
coordinates. Because there is often some error tolerated in the
rectification, this is not always the case.
red, green, blue
The primary additive colors that are used on most display
hardware to display imagery.
reference coordinates
The coordinates of the map or reference image to which a
source (input) image is being registered. GCPs consist of both
input coordinates and reference coordinates for each point.
reference pixels
In classification accuracy assessment, pixels for which the
correct GIS class is known from ground truth or other data. The
reference pixels can be selected by you, or randomly selected.
reference plane
In a topocentric coordinate system, the tangential plane at the
center of the image on the Earth ellipsoid, on which the three
perpendicular coordinate axes are defined.
reproject
Transforms raster image data from one map projection to
another.
resampling
The process of extrapolating data file values for the pixels in a
new grid when data has been rectified or registered to another
image.
resolution
A level of precision in data.
resolution merge
The process of sharpening a lower-resolution multiband image
by merging it with a higher-resolution monochrome image.
RGB
See red, green, blue.
RGB clustering
A clustering method for 24-bit data (three 8-bit bands) that plots
pixels in three-dimensional spectral space and divides that
space into sections that are used to define clusters. The output
color scheme of an RGB-clustered image resembles that of the
input file.
RMSE
See root mean square error.
root mean square error
The distance between the input (source) location of the GCP
and the retransformed location for the same GCP. RMS error is
calculated with a distance equation.
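As a sketch, RMS error over a set of GCPs can be computed from the input and retransformed locations like this (the pair structure is an illustrative assumption):

```python
import math

def rms_error(pairs):
    # pairs: list of ((x_in, y_in), (x_rt, y_rt)) tuples — the input
    # GCP location and its retransformed location.
    squared = [(xi - xr) ** 2 + (yi - yr) ** 2
               for (xi, yi), (xr, yr) in pairs]
    # Root of the mean squared distance across all GCPs.
    return math.sqrt(sum(squared) / len(squared))
```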
RPC
See rational polynomial coefficients.
RPC properties
RPC properties use rational polynomial coefficients to
describe the relationship between the image and the Earth's
surface at the time of image capture. You can specify the
associated RPC file to use in your geocorrection.
rubber sheeting
The application of nonlinear rectification (2nd order or higher).
saturation
A component of IHS that represents the purity of color and also
varies linearly from 0 to 1.
scale
1. The ratio of distance on a map as related to the true distance
on the ground. 2. Cell size. 3. The processing of values through
a lookup table.
scanner
The entire data acquisition system such as the Landsat
scanner or the SPOT panchromatic scanner.
seed tool
An Image Analysis for ArcGIS feature that automatically
generates feature layer polygons of similar spectral value.
shapefile
A vector format that contains spatial data. Shapefiles have the
.shp extension.
signature
A set of statistics that defines a training sample or cluster. The
signature is used in a classification process. Each signature
corresponds to a GIS class that is created from the signatures
with a classification decision rule.
source coordinates
In the rectification process, the input coordinates.
spatial enhancement
The process of modifying the values of pixels in an image
relative to the pixels that surround them.
spatial frequency
The difference between the highest and lowest values of a
contiguous set of pixels.
spatial resolution
A measure of the smallest object that can be resolved by the
sensor, or the area on the ground represented by each pixel.
speckle noise
The light and dark pixel noise that appears in radar data.
spectral distance
The distance in spectral space, computed as Euclidean
distance in n dimensions, where n is the number of bands.
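For example, a sketch of the n-dimensional Euclidean distance between two pixel vectors:

```python
import math

def spectral_distance(pixel_a, pixel_b):
    # Euclidean distance between two pixels in n-band spectral space;
    # each argument is a sequence of n band values.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(pixel_a, pixel_b)))
```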
spectral enhancement
The process of modifying the pixels of an image based on the
original values of each pixel, independent of the values of
surrounding pixels.
spectral resolution
A measure of the specific wavelength intervals that a sensor
can record.
spectral space
An abstract space that is defined by spectral units (such as an
amount of electromagnetic radiation). The notion of spectral
space is used to describe enhancement and classification
techniques that compute the spectral distance between
n-dimensional vectors, where n is the number of bands in the
data.
SPOT
SPOT satellite sensors operate in two modes: multispectral
and panchromatic. SPOT is often referred to as a pushbroom
scanner, meaning that all scanning parts are fixed, and
scanning is accomplished by the forward motion of the
scanner. SPOT pushes 3000/6000 sensors along its orbit. This
differs from Landsat, which scans with 16 detectors
perpendicular to its orbit.
standard deviation
1. The square root of the variance of a set of values used as a
measurement of the spread of the values. 2. A neighborhood
analysis technique that displays the standard deviation of the
data file values of a user-specified window.
striping
A data error that occurs if a detector on a scanning system
goes out of adjustment; that is, it provides readings consistently
greater than or less than the other detectors for the same band
over the same ground cover.
subsetting
The process of breaking out a portion of a large image file into
one or more smaller files.
sum
A neighborhood analysis technique that displays the total of the
data file values in a user-specified window.
supervised training
Any method of generating signatures for classification in which
the analyst is directly involved in the pattern recognition
process. Usually, supervised training requires the analyst to
select training samples from the data that represent patterns to
be classified.
swath width
In a satellite system, the total width of the area on the ground
covered by the scanner.
summarize areas
A workflow that uses a feature theme corresponding to an
area of interest so that change is summarized only within that
area.
temporal resolution
The frequency with which a sensor obtains imagery of a
particular area.
terrain analysis
The processing and graphic simulation of elevation data.
terrain data
Elevation data expressed as a series of x, y, and z values that
are either regularly or irregularly spaced.
thematic change
A feature in Image Analysis for ArcGIS that lets you compare
two thematic images of the same area captured at different
times to notice changes in vegetation, urban areas, and so on.
thematic data
Raster data that is qualitative and categorical. Thematic layers
often contain classes of related information, such as land cover,
soil type, slope, and so on.
thematic map
A map illustrating the class characterizations of a particular
spatial variable such as soils, land cover, hydrology, and so on.
thematic mapper
Landsat data acquired in seven bands with a spatial resolution
of 30 × 30 meters.
theme
A particular type of information, such as soil type or land use,
that is represented in a layer.
threshold
A limit, or cutoff point, usually a maximum allowable amount of
error in an analysis. In classification, thresholding is the
process of identifying a maximum distance between a pixel and
the mean of the signature to which it was classified.
TM
See thematic mapper.
training
The process of defining the criteria by which patterns in image
data are recognized for the purpose of classification.
training sample
A set of pixels selected to represent a potential class. Also
called sample.
transformation matrix
A set of coefficients that is computed from GCPs, and used in
polynomial equations to convert coordinates from one system
to another. The size of the matrix depends upon the order of
the transformation.
triangulation
Establishes the geometry of the camera or sensor relative to
objects on the Earth's surface.
true color
A method of displaying an image (usually from a continuous
raster layer) that retains the relationships between data file
values and represents multiple bands with separate color guns.
The image memory values from each displayed band are
translated through the function memory of the corresponding
color gun.
unsupervised training
A computer-automated method of pattern recognition in which
some parameters are specified by the user and are used to
uncover statistical patterns that are inherent in the data.
variable
1. A numeric value that is changeable, usually represented with
a letter. 2. A thematic layer. 3. One band of a multiband image.
References
The following references were used in the creation of this book.
Akima, H., 1978, A Method for Bivariate Interpolation and Smooth
Surface Fitting for Irregularly Distributed Data Points, ACM
Transactions on Mathematical Software 4(2), pp. 148-159.
Buchanan, M.D. 1979. Effective Utilization of Color in
Multidimensional Data Presentation. Proceedings of the Society
of Photo-Optical Engineers, Vol. 199: 9-19.
Chavez, Pat S., Jr., et al. 1991. Comparison of Three Different
Methods to Merge Multiresolution and Multispectral Data:
Landsat TM and SPOT Panchromatic. Photogrammetric
Engineering & Remote Sensing, Vol. 57, No. 3: 295-303.
Conrac Corp., Conrac Division. 1980. Raster Graphics Handbook.
Covina, California: Conrac Corp.
Daily, Mike. 1983. Hue-Saturation-Intensity Split-Spectrum
Processing of Seasat Radar Imagery. Photogrammetric
Engineering& Remote Sensing, Vol. 49, No. 3: 349-355.
ERDAS 2000. ArcView Image Analysis. Atlanta, Georgia: ERDAS,
Inc.
ERDAS 1999. Field Guide. 5th ed. Atlanta: ERDAS, Inc.
ESRI 1992. Map Projections & Coordinate Management: Concepts
and Procedures. Redlands, California: ESRI, Inc.
Faust, Nickolas L. 1989. Image Enhancement. Volume 20,
Supplement 5 of Encyclopedia of Computer Science and
Technology, edited by Allen Kent and James G. Williams. New
York: Marcel Dekker, Inc.
Gonzalez, Rafael C., and Paul Wintz. 1977. Digital Image
Processing. Reading, Massachusetts: Addison-Wesley
Publishing Company.
Holcomb, Derrold W. 1993. Merging Radar and VIS/IR Imagery.
Paper submitted to the 1993 ERIM Conference, Pasadena,
California.
Hord, R. Michael. 1982. Digital Image Processing of Remotely
Sensed Data. New York. Academic Press.
Index
A
A priori
defined 207
Abstract symbol
defined 207
Accuracy assessment
defined 207
Advantages
bilinear interpolation 63
cubic convolution 65
nearest neighbor 64
American Standard Code for Information Interchange
defined 207
Analysis mask
defined 207
Ancillary data
defined 207
Annotation
defined 207
AOI
defined 207
Applying
data tools 51
GeoCorrection tools 167
Spectral enhancement 111
Arbitrary Affine
formulas 200
Area
defined 207
Area of interest
defined 207
ASCII
defined 208
Aspect
defined 208
Attribute
defined 208
Average
defined 208
B
Band
defined 208
Bilinear interpolation
advantages and disadvantages 63
defined 208
Options dialog preferences 63
Bin function
defined 208
Bins
defined 208
Border
defined 208
Boundary
defined 208
Brightness inversion
overview 107
Brightness value
defined 208
Brovey Transform 94
Buffer zone
defined 209
C
Camera
overview 191
properties dialog
Camera imagery
orthorectification 43
Camera Model
tutorial 43
Camera properties
defined 209
Cartesian
defined 209
Categorize
defined 209
Cell
defined 209
Cell size
defined 209
Options dialog 61
workflow 68
Checkpoint analysis
defined 209
overview 190
Chipping parameters
offset 199
scale 199
Circumcircle
defined 209
Class
defined 209
Class value
defined 209
Classification
decision rules 157
Mahalanobis distance 159
maximum likelihood 158
minimum distance 157
nonparametric 149, 157
overview 149
Parallelepiped 160
parametric 149, 157
defined 210
enhanced data 151
limiting dimensions 151
nonparametric decision rule 149
overview 147
parametric decision rule 149
process 148
rectification 170
scheme 150
signatures 149
supervised training 148
supervised vs. unsupervised 151
tips 150
training 148
unsupervised training 148
ESRI 10
Contiguity analysis
defined 210
Continuous
defined 210
Continuous data
defined 210
Contrast stretch
defined 211
for display 99
linear 98
nonlinear 98
overview 98
piecewise linear 98
varying it 99
Conversion
overview 162
using 161
Converting
features to raster 165
raster to features 162
Convolution
example 84
filtering 84
formula 85
overview 84
using 89
workflow 88
Convolution filtering
applying 84
defined 211
Convolution kernel
defined 211
Coordinate system
defined 211
Correlation threshold
defined 211
Correlation windows
defined 211
Corresponding GCPs
defined 211
Covariance
defined 211
Covariance matrix
defined 211
Create new image
overview 72
workflow 73
Creating a shapefile
tutorial 19, 20
Cubic convolution
advantages and disadvantages 65
defined 212
Options dialog preferences 63
D
Data
defined 212
Data file
defined 212
Data file value
defined 212
Data preparation
using 71
Data tools
applying 51
Data versus information 124
Database
defined 212
updating 4
Decision rule
classification 157
defined 212
Mahalanobis distance 159
maximum likelihood 158
minimum distance 157
nonparametric 149
overview 149
Parallelepiped 160
parametric 149, 157
Decision rules
nonparametric 157
DEM
defined 212
Density
defined 212
Digital elevation model
defined 212
Digital terrain model
defined 212
Dimensionality
defined 213
Dimensions
limiting for classification 151
Disadvantages
bilinear interpolation 63
cubic convolution 65
nearest neighbor 64
Divergence
defined 213
Diversity
defined 213
DTM
defined 213
E
Edge detector
defined 213
Edge enhancer
defined 213
Education solutions
ERDAS 10
ESRI 10
Effects of order
polynomial equation 182
Enhanced data
classifying 151
Enhancement
defined 213
linear 98
nonlinear 98
radiometric 97
Environmental hazards
identifying 8
mapping 8
ERDAS
contacting 10
education solutions 10
ESRI
contacting 10
education solutions 10
Extension
defined 213
Extent 59
defined 213
workflow 67
Extent tab
Options dialog 60
F
Feature collection
defined 214
Feature extraction
defined 214
Feature space
defined 214
Features to raster
converting 165
workflow 166
Fiducial center
defined 214
Fiducials
defined 214
File coordinates
defined 214
Filtering
defined 214
Focal
defined 214
Focal analysis
overview 92
using 94
workflow 93
G
GCP
defined 214
GCP matching
defined 214
GCPs
entering coordinates 170
minimum number 187
overview 169
General tab
Options dialog 59
workflow 66
GeoCorrection
defined 214
overview 171
properties dialog
Elevation tab 174
General tab 171
Links tab 172
overview 171
Geocorrection
tutorial 43
GeoCorrection tools
applying 167
Geographic database
updating 4
Geographic information system
defined 215
overview 123
Geoprocessing
specifying options 69
tools 69
Geoprocessing models
updating 69
Georeferencing
defined 215
only 169
overview 169
GIS
defined 123, 215
GIS analysis
performing 123
Ground control point
defined 215
Ground control points
overview 169
H
Help
accessing for Image Analysis 10
High-frequency kernel
defined 215
overview 87
High-order polynomials 182
nonlinear 182
Histogram
defined 215
Histogram equalization
defined 215
effect on contrast 104
formula 103
overview 101
tutorial 14
using 105
workflow 104
Histogram matching
defined 215
overview 105
using 107
workflow 106
Hue
defined 112, 215
I
IHS to RGB
overview 115
using 117
workflow 116
IKONOS
overview 195
properties dialog
Chipping tab 198
Parameters tab 197
IKONOS properties
defined 216
overview 195
Image algebra 118
Image Analysis for ArcGIS
getting help 10
Getting Started 12
performing tasks 4
quick-start tutorial 11
Image Analysis options
changes for ArcGIS 69
Image data
defined 216
Image Difference
tutorial 24
workflow 142
Image difference
overview 140
Image file
defined 216
Image info
overview 57
Image Info dialog
workflow 58
Image matching
defined 216
Image processing
defined 216
Indices
defined 216
Information versus data 124
Infrared
defined 216
Initial cluster means 153
Intensity
defined 112
IR
defined 216
Island polygons
defined 216
overview 53
ISODATA
clustering 152
defined 216
Iterative Self-Organizing Data Analysis Technique
defined 216
K
Kernels
high-frequency 87
zero sum 86
L
Land cover
categorizing 5
Landsat
bands and wavelengths 201
defined 217
MSS 201
number of satellites 200
overview 200
properties dialog
Parameters tab 205
Landsat 7
data types 204
satellite 204
specifications 204
Layer
defined 217
Layer Stack
overview 143
workflow 144
layer stack 144
Linear
defined 217
Linear contrast stretch 98
defined 217
Linear transformation
defined 217
Polynomial transformation 180
Rubber Sheeting 189
Lookup table
defined 217
Low-frequency kernel
defined 217
example 87
LUT
defined 217
LUT Stretch
overview 98
using 101
workflow 100
M
Mahalanobis distance rules
classification 159
Majority
defined 217
Map projection
defined 217
Maximum
defined 218
Maximum likelihood
defined 218
Maximum likelihood rules
classification 158
Mean
defined 218
Median
defined 218
Minimum
defined 218
GCPs 187
Minimum distance
defined 218
Minimum distance rules
classification 157
Minority
defined 218
Modeling
defined 218
Mosaicking
defined 219
overview 78
tutorial 38
workflow 80
MSS
defined 219
Landsat 201
Multispectral classification
defined 219
Multispectral imagery
defined 219
Multispectral scanner
defined 219
N
Nadir
defined 219
National Imagery Transmission Format Standard 196
O
Observation
defined 220
Off-nadir
defined 220
Offset
chipping parameters 199
Options dialog
Cell Size tab workflow 68
Extent tab workflow 67
General tab workflow 66
overview 59
Preferences tab 68
Orthorectification
defined 220
overview 167
rectification 171
tutorial 43
Overlay
defined 220
tutorial 48
P
Panchromatic
SPOT 176
Panchromatic imagery
defined 220
Parallelepiped
defined 220
rules classification 160
Parameter
defined 221
Parametric decision rules
Mahalanobis distance 157
Maximum likelihood 157
minimum distance 157
Parametric rule
overview 149
Parametric signature
defined 221
Pattern recognition
defined 221
PCA
defined 221
Performing GIS analysis 123
Piecewise linear contrast stretch
defined 221
Pixel
defined 221
Pixel depth
defined 221
Pixel size
defined 221
Placing links
Polygon
defined 221
Polynomial
defined 221
Polynomial equation
effects of order 182
Polynomial transformation
effects of order 182
linear 180
nonlinear 181
overview 179
transformation matrix 180
Preferences
Options dialog 62, 68
Principal components analysis
defined 222
Principal point
defined 222
Profile
defined 222
Pushbroom
defined 222
scanner 176
Q
Questions about Image Analysis
finding answers 9
QuickBird
defined 222
overview 195, 196
properties dialog
Chipping tab 198
Parameters tab 197
Quick-start tutorial
Image Analysis for ArcGIS 11
R
Radar data
defined 222
Radiometric correction
defined 222
Radiometric Enhancement
about 97
Radiometric enhancement
defined 222
Radiometric resolution
defined 223
Rank
defined 223
Raster data
defined 223
Raster tab 65
Raster to feature
converting 162
workflow 164
Rational polynomial coefficients
defined 223
Recode
by class name workflow 133
by symbology workflow 135
previously grouped image workflow 137
Recoding
defined 223
Rectification
classification 170
disadvantages 169
georeferencing 169
georeferencing only 169
orthorectification 171
overview 168
RMS error 170
thematic files 171
triangle-based 189
Red, green, blue
defined 223
Reference coordinates
defined 223
Reference pixels
defined 223
Reference plane
defined 224
Reproject
defined 224
Reproject image
overview 81
workflow 81
Resampling
bilinear interpolation 63
cubic convolution 63
defined 224
nearest neighbor 63
Rescale Image
overview 145
workflow 146
Resolution
defined 224
Resolution merge
Brovey Transform 94
defined 224
overview 94
using 96
workflow 95
RGB
defined 224
RGB clustering
defined 224
RGB to IHS
overview 112
using 114
workflow 114
RMS error
overview 170
tolerance 170
Root mean square error
defined 224
RPC
defined 224
overview 196
properties dialog
Chipping tab 198
Parameters tab 197
RPC properties
defined 224
overview 195
RMSE
defined 224
tolerance of 170
Rubber Sheeting
checkpoint analysis 190
defined 225
Linear transformation 189
nonlinear transformation 190
overview 189
triangle-based rectification 189
triangulation 189
S
Satellites
IKONOS 195
Landsat 1-5 200
Landsat 7 204
QuickBird 196
SPOT 176
SPOT 4 178
SPOT Panchromatic 176
SPOT XS 177
Saturation
defined 112, 225
Scale
chipping parameters 199
defined 225
Scanner
defined 225
panchromatic 177
pushbroom 176
SPOT 176
Scheme
classification 150
Seed Radius
overview 53
workflow 56
Seed tool
controlling 52
defined 225
properties overview 52
workflow 53
Shadow
enhancing 98
Shapefile
defined 225
Signature
defined 225
overview 149
Sites
characterizing 5
Source coordinates
defined 225
Spatial enhancement
defined 225
overview 83
Spatial frequency
defined 225
Spatial resolution
defined 225
Speckle noise
defined 225
Spectral distance
defined 226
Spectral enhancement
applying 111
defined 226
Spectral resolution
defined 226
Spectral space
defined 226
SPOT
defined 226
Panchromatic 176
pushbroom scanner 176
satellite overview 176
workflow 178
XS 177
SPOT 4 satellite 178
Standard deviation
defined 226
Stereoscopic
imagery 177
pairs 177
Stretch
linear 98
nonlinear 98
Subset Image
overview 74
Subset image spectrally
workflow 77
Subsetting
defined 226
Subsetting an image spatially
workflow 77
Sum
defined 227
Summarize areas
overview 129
workflow 130
Supervised classification
overview 155
workflow 156
Supervised training
defined 227
overview 148
Supervised vs. unsupervised
classification 151
Swath width
defined 227
T
Tasks
performing in Image Analysis
Temporal resolution
defined 227
Terrain analysis
defined 227
Terrain data
defined 227
Thematic change
defined 227
overview 127
tutorial 27
workflow 128
Thematic data
defined 227
Thematic files
orthorectification 171
rectification 171
Thematic map
defined 227
Thematic mapper
defined 227
Theme
defined 228
Threshold
defined 228
Tips
classification 150
TM
defined 228
displaying data in bands 203
overview 201
Tools
applying GeoCorrection 167
Training
classification 148
defined 228
signatures 149
supervised 148
unsupervised 148
Training sample
defined 228
Transformation matrix
defined 228
Polynomial transformation 180
Transformations
high-order polynomials 182
linear 180, 189
nonlinear 181, 190
Triangle-based rectification 189
Triangulation
defined 228
Rubber Sheeting 189
True color
defined 228
Tutorial
exercises
adding images 14
applying histogram stretch 14
U
Unsupervised Classification
tutorial 29
Unsupervised classification
clusters 152
ISODATA clustering 152
overview 152
percentage unchanged 154
pixel analysis 153
workflow 155
Unsupervised training
defined 228
overview 148
Unsupervised vs. supervised classification 151
Urban growth
identifying changes 7
Using
conversion 161
convolution 89
data preparation 71
focal analysis 94
non-directional edge 91
Resolution merge 96
utilities 139
Utilities
Image Difference 140
Layer Stack 143
Rescale Image 145
using 139
Variable
defined 228
Vector data
defined 229
Vegetative indices
applications 117
defined 229
examples 117
overview 117
Vegetative stress
identifying 9
X
XS
SPOT 177
Z
Zero sum kernels 86
Zoom
defined 229