Learning aim
To provide practice in land cover interpretation and classification of satellite images
Learning objectives
After this exercise you will be able to:
Duration
8 hours
Data needed
• Fieldwork photographs, available as JPGs via HTML hyperlinked images
• SPOT satellite data of Enschede, 28 July 2002 (sp28jul02rdens)
• Field point data of Enschede 2002; spjul2002.xls (spjul2002.shp)
Equipment needed
PC with ERDAS IMAGINE
Deliverables
For participants of Distance Education (DE) courses:
For assessment of this assignment you have to deliver a Word document
containing:
• screen grab of the classification (including legend)
• table showing the classification accuracies
• short explanation accompanying this table (how you obtained the table)
• Copy of the Signature file (*.sig) used for the classification.
Procedure
The procedure for mapping land cover with satellite images consists of the
following steps:
• Interpretation of satellite image – definition of classes
• Field data collection
• Create sample set
• Classification
• Accuracy assessment
[Flowchart: a printed satellite image and point field data feed into image inspection, which produces an image legend; image interpretation yields an interpreted image; fieldwork, after ordering of the field data, yields field classes; correlating field classes with map classes guides the digital classification of the satellite image data; matrix calculation on the classified image then gives the accuracy assessment and the overall accuracy.]
1. Interpretation
The main purpose of a preliminary interpretation is to establish strata
(classes and subclasses and their definitions) for the selection of field
data collection areas, as described under Fieldwork. In this exercise we
simulate fieldwork by showing terrestrial photographs of selected areas
and describing their land cover and land use.
2. Fieldwork
The purpose of the field survey is to observe what the different image
characteristics correspond to in reality. In addition, you would like to
know whether the land cover and land use are consistently the same within
an interpretation unit. So in principle you would like to know the land
cover and land use at several places in each unit, which means taking
enough samples per interpretation unit to establish its contents. This
would increase the number of required samples drastically, but time for
fieldwork is limited.
In most parts of the world, fieldwork is the most costly part of map
making and should therefore be kept to a minimum. For thematic maps of
this type, a stratified representative sampling scheme is recommended.
Stratified means that sampling is done per interpretation unit; in each
unit, samples are taken that are representative of most of the area in
that unit.
To minimize time spent on fieldwork, points in the field are selected
according to the method of stratified clustered representative sampling,
based on image characteristics.
In the field, with the help of the image, representative sample points are
selected. This means that a selected sample point has image characteristics
(e.g. gray tone, texture, and pattern) that are representative of the
legend unit in general. In the case of complex units, the sample point
should be representative of one of the components of the legend unit.
For each selected sample point, a field data sheet is filled in to collect all
basic data which are necessary to make a classification of the land use at
the sample point.
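The stratified sampling idea above can be sketched in a few lines of Python. This is a hypothetical illustration only: the stratum names, candidate points, and the function name are invented, not taken from the exercise data.

```python
import random

def stratified_sample(strata, k, seed=0):
    """Pick up to k sample points per interpretation unit (stratum).

    strata: dict mapping stratum name -> list of candidate (x, y) points.
    Sampling per stratum keeps fieldwork effort balanced across units.
    """
    rng = random.Random(seed)
    samples = {}
    for name, points in strata.items():
        n = min(k, len(points))          # never ask for more than exist
        samples[name] = rng.sample(points, n)
    return samples

# Invented candidate points per interpretation unit
strata = {
    "grass":  [(10, 4), (11, 4), (12, 5), (30, 8)],
    "water":  [(2, 20), (3, 21)],
    "forest": [(40, 40), (41, 42), (44, 40)],
}
picked = stratified_sample(strata, k=2)
```

In practice the candidate points would come from the interpretation units drawn on the image, and the per-unit quota k would be set by the available fieldwork time.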
Field data
There are in total 4 fieldwork areas: the area covered by the aerial
photograph (also used for visual interpretation), the area around the
airport, the northeast of Enschede, and the area around Aamsveen (in the
southeast).
For each area a separate HTML page is made showing the polygons for which
the observations are valid. The direction in which each photo was taken is
indicated with an arrow. The land cover and land use at each point are
presented in the tables below. In an additional document (species.doc) the
different tree species are shown in detail.
Airport
Point  Land cover       Land use
1      grass + tarmac   air strip
2      tarmac           parking place
3      water            drinking water production

Aamsveen
Point  Land cover      Land use
1      oak trees       timber production and recreation
2      birch trees     nature conservation and recreation
3      heath           nature conservation and recreation
4      heath + grass   nature conservation and recreation
5      fern            nature conservation and recreation
It only makes sense to select or define classes for which you have, or can
collect, ground truth. In this case ground truth has been collected and
stored in an XLS/TXT/SHP file.
Open spjul2002, analyse the classes that have been used, and decide on
your own class definition. If required, adapt the tables above to match
your class definitions (e.g. group together classes that share spectral
properties and are not distinctively defined in the ground truth).
Remember that without a proper accuracy assessment using reliable ground
truth, a classification is useless (for anyone but you).
Make a list of classes you would like to classify on the image and
indicate which colour each class has on the image.
3. Supervised Classification
With the list of classes made above you can now start to classify the
image accordingly.
This section shows how the Supervised Classification tools allow you to
control the classification process. You perform the following operations in
this section:
The ERDAS IMAGINE Signature Editor allows you to create, manage, evaluate,
and edit signatures (.sig extension).
In this section, you define the signatures using the area of interest (AOI)
tools. ERDAS has additional tools to collect signatures.
Preparation
Select File | Open | Raster Layer from the Viewer menu bar or click
the Open icon on the Viewer toolbar to display the image file to be
classified.
[Screenshot: the raster layer open dialog, with Layers set to 3, 4, 1 and Fit to Frame selected, and the button to display the image.]
Click the Raster Options tab at the top of the dialog, and then set the
Layers to Colors to 3, 4 and 1 (red, green, and blue respectively).
Layer 3 is the near-infrared (NIR) band, layer 4 the mid-infrared (MIR)
band, and layer 1 the visible green band.
In the Classification menu, click Close to remove this menu from the
screen.
[Screenshot: View Signature Columns dialog; the Red, Green, and Blue rows should not be selected.]
These are the CellArray columns in the Signature Editor that you remove
to make it easier to use. These columns can be returned at any time.
In the View Signature Columns dialog, click Apply. The Red, Green,
and Blue columns are deleted from the Signature Editor.
The AOI tools allow you to select the areas in an image to be used as a
signature. These signatures are parametric because they have statistical
information.
Select AOI | Tools from the Viewer menu bar. The AOI tool palette
displays.
Use the Zoom In tool on the Viewer toolbar to zoom in on one of the
bright red areas in the sp28jul02rdens.img file in the Viewer.
In the Viewer, draw a polygon around the red area you just magnified.
Click and drag to draw the polygon and click to draw the vertices.
Middle-click or double-click to close the polygon (depending on what is
set in Session | Preferences).
After the AOI is created, a bounding box surrounds the polygon, indicating
that it is currently selected. These areas are covered with maize in July
2002.
In the Signature Editor, click the Create New Signature(s) from AOI
icon or select Edit | Add from the menu bar to add this AOI as a
signature.
In the Signature Editor, click inside the Signature Name column for
the signature you just added. Change the name to Maize, and then
press Enter on the keyboard.
In the Signature Editor, hold in the Color column next to Maize and
select Cyan. For evaluation purposes it is more suitable to select colors
opposite to the color of the cover on the image. After classification, the
colors can be changed for visualization purposes and map production.
Under Count you see the number of pixels that are sampled. To apply the
maximum likelihood classifier later on, you theoretically need n + 1
pixels for each class (n = number of bands). It is recommended to sample
at least 10 × n pixels, better even 100 × n, preferably from several
distinct areas.
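As a quick illustration, the sample-size rule of thumb above can be written out in Python. The function name is ours, and the band count below assumes the 4 spectral bands of the SPOT image used here.

```python
def min_sample_sizes(n_bands):
    """Rules of thumb for training pixels per class with a maximum
    likelihood classifier: n + 1 is the theoretical minimum, 10 * n a
    practical minimum, and 100 * n is preferred."""
    return {
        "theoretical": n_bands + 1,
        "recommended": 10 * n_bands,
        "preferred": 100 * n_bands,
    }

# The SPOT image used in this exercise has 4 spectral bands
sizes = min_sample_sizes(4)
```

So for this image a class signature should ideally rest on some 400 pixels, drawn from several distinct areas.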
In the Signature Editor, click inside the Signature Name column for
the signature you just added. Change the name to Water, then press
Enter on the keyboard.
In the Signature Editor, hold in the Colour column next to Water and
select White.
Repeat this for all cover classes. At least add the light yellowish-green
areas in the image. These are bare agricultural fields (e.g. photos 21 and
23 of the aerial photo show how such fields look in reality). Name this
class Bare agri-field and give it the colour Red.
Make sure that the samples are in homogeneous areas and that the standard
deviation is kept low. You can see the statistics via View | Statistics.
It is also recommended to make different signatures, stored under
different names (e.g. Grass 1, Grass 2), for occurrences of the same class
that appear in different colours in the image. After classification these
classes can be combined by recoding, or you can assign them the same
Value. Merging the classes into one sample set in the Signature Editor
would produce statistics with larger standard deviations and lead to
overlapping feature classes, which results in more classification errors.
In the Supervised Classification dialog, leave the Non-parametric Rule
popup list set to None, and select Maximum Likelihood for the Parametric
Rule.
You do not need to use the Classify Zeros option here because there are
no background zero data file values in the sp28jul02rdens.img file.
When the process is 100% complete, click OK in the Job Status dialog.
The supervised classification image is pictured on the left, and the
distance image on the right.
See the chapter "Classification" of the ERDAS Field Guide for information
on how the pixels are classified.
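For reference, the Gaussian maximum likelihood rule can be sketched in a few lines of numpy. This is a minimal illustration, not the ERDAS implementation: it assumes signatures are given as per-class mean vectors and covariance matrices, and the two-band toy signatures below are invented.

```python
import numpy as np

def max_likelihood_classify(pixels, signatures):
    """Assign each pixel to the class with the highest Gaussian
    log-likelihood, given per-class (mean, covariance) signatures."""
    names = list(signatures)
    scores = []
    for name in names:
        mean, cov = signatures[name]
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mean
        # Per-pixel squared Mahalanobis distance to the class mean
        maha = np.einsum("ij,jk,ik->i", d, inv, d)
        # Log-likelihood up to a constant shared by all classes
        scores.append(-0.5 * (logdet + maha))
    best = np.argmax(np.stack(scores), axis=0)
    return [names[i] for i in best]

# Invented two-band signatures for illustration only
signatures = {
    "water": (np.array([20.0, 10.0]), np.eye(2) * 4.0),
    "maize": (np.array([80.0, 120.0]), np.eye(2) * 9.0),
}
pixels = np.array([[21.0, 11.0], [78.0, 118.0]])
labels = max_likelihood_classify(pixels, signatures)
```

The per-class covariance term is why maximum likelihood needs well-sampled, homogeneous signatures: a noisy covariance matrix distorts the Mahalanobis distances.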
Next, some tools for evaluating the sample set will be examined. Usually
you do this before the actual classification. Many tools exist in ERDAS
and other software to assist you in analysing how good (read: how
separable) your samples are. Assessment of the accuracy of the actual
classification is done in a different way.
The Signature Alarm utility highlights the pixels in the Viewer that
belong, or are estimated to belong, to a class according to the
parallelepiped decision rule. An alarm can be performed with one or more
signatures. If you do not have any signatures selected, the active
signature (the one next to the >) is used.
In the Signature Editor, select Maize by clicking in the > column for
that signature. The alarm is performed with this signature.
The Signature Alarm utility allows you to define the parallelepiped limits
by either:
If you wish, you can set new parallelepiped limits and click OK in the
Set Parallelepiped Limits dialog, or simply accept the default limits by
clicking OK in the Set Parallelepiped Limits dialog.
Click Close in the Limits dialog. In the Signature Alarm dialog, click OK.
The alarmed pixels display in the Viewer in cyan. You can use the toggle
function (Utility | Flicker) in the Viewer to see how the pixels are
classified by the alarm.
! Be sure that there are no AOI layers open on top of the Alarm Mask Layer.
You can use View | Arrange Layers to move any AOI layers present in
the Viewer.
Repeat this for all other signatures. You can also select several
signatures at the same time by selecting their rows in the Signature
Editor; the selected rows turn yellow. In this way you can also see which
features in the image have not been sampled yet.
With many classes it is better to use the swipe function (Utility |
Swipe) in the Viewer to see how the pixels are classified by the alarm.
In the Arrange Layers dialog, right-hold over the Alarm Mask button
and select Delete Layer from the Layer Options menu.
The Signature Objects dialog allows you to view graphs of signature
statistics so that you can compare signatures. The graphs display as sets
of ellipses in a Feature Space image (a two-dimensional histogram).
[Screenshot: Create Feature Space Images dialog; select sp28jul02rdens.img as input, enable output to a Viewer, and select sp28jul02rdens_1_3.fsp.img.]
In the Create Feature Space Images dialog under Input Raster Layer,
enter sp28jul02rdens.img.
This is the image file from which the Feature Space image is generated.
! Verify that the directory where the Feature Space image files are saved
has write permission.
In the Create Feature Space Images dialog, click the Output to Viewer
option so that the Feature Space image is displayed in a Viewer.
The output Feature Space image is based on layers one and three of the
sp28jul02rdens.img file. Layers one and three are selected since the
selected features are spectrally distinct in this band combination.
The Create Feature Space Images dialog closes, and then the Job Status
dialog opens.
Each ellipse in the Feature Space image is based on the mean and standard
deviation of one signature. A graph can be generated for one or more
signatures; if you do not have any signatures selected, the active
signature (the one next to the >) is used. This utility also allows you to
show the signature mean for the two bands, an ellipse, and a label.
In the Signature Editor, select the signatures for Maize, Water, and
Bare agri-field by clicking in the Class row for Maize and Shift-clicking
in the Class rows for Water and Bare agri-field.
In the Signature Objects dialog, confirm that the Viewer number field is
set for 2.
Click OK in the Signature Objects dialog.
By comparing the ellipses of different signatures for one band pair, you
can easily see whether the signatures represent similar groups of pixels:
check where the ellipses overlap in the Feature Space image. In this
example the ellipses do not overlap, as only three very distinct features
were chosen.
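For intuition, the ellipse drawn for a signature in a two-band feature space follows from the eigenvalues of its covariance matrix; a sketch (the signature statistics below are invented):

```python
import numpy as np

def signature_ellipse(mean, cov, n_std=2.0):
    """Center, full axis lengths, and orientation (degrees) of the
    n_std ellipse a signature traces in a two-band feature space plot."""
    vals, vecs = np.linalg.eigh(cov)             # eigenvalues ascending
    width, height = 2.0 * n_std * np.sqrt(vals)  # minor and major axes
    # Orientation of the major axis (eigenvector of the largest eigenvalue)
    angle = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))
    return mean, (width, height), angle

# Invented two-band signature: variances 9 and 25, no band correlation
mean = np.array([40.0, 90.0])
cov = np.array([[9.0, 0.0], [0.0, 25.0]])
center, (minor, major), angle = signature_ellipse(mean, cov)
```

Two signatures whose ellipses overlap substantially in every available band pair will be confused by the classifier, which is exactly what the Feature Space display lets you spot by eye.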
Arrange Layers
Now that you have the parametric signatures collected, you do not
need the AOIs in the Viewer. Select View | Arrange Layers from the
Viewer menu bar. The Arrange Layers dialog opens.
In the Arrange Layers dialog, right-hold over the AOI Layer button
and select Delete Layer from the AOI Options menu.
Click Apply in the Arrange Layers dialog to delete the AOI layer.
You are asked if you want to save the changes before closing. Click Yes,
enter a name in the correct directory, and click OK.
Recode Classes
After the classification, you may want to recode the thematic raster layer
to assign new class value numbers to any or all classes, creating a new
thematic raster layer with the new class numbers. Use the Recode function
under Interpreter (icon) | GIS Analysis to recode a thematic raster layer.
If the class names then disappear, display the file in a Viewer, select
Raster | Attributes, and in the Attributes dialog select Edit | Add Class
Names and retype them.
! To avoid recoding, you can also edit the Value column in the Signature
Editor and associate several classes with the same value. You would, of
course, have to run the classification again after making these changes in
the editor. This circumvents the Recode operation.
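Conceptually, recoding is just a lookup-table operation on the class values; a minimal numpy sketch, with an assumed class numbering that is not the one in your signature file:

```python
import numpy as np

# Lookup table: the original class value indexes into 'lut', whose
# entry is the new (merged) class number. Numbering below is invented.
lut = np.array([0,   # 0: unclassified stays 0
                1,   # 1: deciduous forest -> forest
                1,   # 2: evergreen forest -> forest
                2,   # 3: grass            -> agriculture
                2,   # 4: maize            -> agriculture
                2,   # 5: bare agr         -> agriculture
                3,   # 6: building         -> build
                3,   # 7: bare             -> build
                4])  # 8: water            -> water

# A tiny classified "image" of original class values
classified = np.array([[1, 2, 5],
                       [4, 8, 6]])
recoded = lut[classified]   # fancy indexing applies the table per pixel
```

The same table must of course be applied to the ground truth labels before accuracy assessment, so that classification and reference use one numbering.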
4. Accuracy assessment
The collected field data can be entered in a spreadsheet. The Excel sheet
should at least contain X, Y, and cover. Add a column with the class
number for the land cover that corresponds with the classification
numbers. Delete the first (header) row and save the file as Text (Tab
delimited). This table will be used as input for the accuracy assessment.
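The spreadsheet-to-text conversion can equally be done with a small script; a sketch assuming invented coordinates and a made-up cover-to-class-number mapping (use your own legend):

```python
import csv
import io

# Hypothetical mapping from ground-truth cover names to the class
# numbers used in the classification (Value column in the Signature Editor)
class_numbers = {"grass": 1, "maize": 2, "water": 3}

rows = [("X", "Y", "cover"),            # header row: dropped on export
        (255100.0, 471200.0, "grass"),
        (255430.0, 471585.0, "water")]

out = io.StringIO()
writer = csv.writer(out, delimiter="\t", lineterminator="\n")
for x, y, cover in rows[1:]:            # skip the header row
    writer.writerow([x, y, class_numbers[cover]])
text = out.getvalue()                   # tab-delimited, no header
```

Writing to a `StringIO` here is only for illustration; in practice you would open a `.txt` file instead and feed that to the import dialog.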
Preparation
Click the Classifier icon in the ERDAS IMAGINE icon panel.
The Classification menu displays.
! See the chapter "Classification" of the ERDAS Field Guide for the
procedure with random points.
In this case we will use the collected field data saved in a Text file.
In the Accuracy Assessment dialog, select Edit | Import User-defined
Points. Select the Text file and press OK.
With the input preview and view you can see how the table will be
imported. Check carefully, using the Preview tab, whether your settings
(on the Field Definition tab and in the Column Mapping table) are correct.
Click OK; the X and Y coordinates are added to the accuracy assessment
table. The corresponding class names are not imported.
Since the ground truth table contains class names rather than labels
(numbers), you first have to convert these in Excel to the appropriate
class labels that you selected in ERDAS (the Value column in the
Signature Editor).
See the chapter "Classification" of the ERDAS Field Guide for information
on the error matrix, accuracy totals, and Kappa statistics.
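For reference, the error matrix, overall accuracy, and Kappa can be computed directly from paired reference/classified labels; a sketch with invented labels (the function name is ours):

```python
import numpy as np

def accuracy_stats(reference, classified, n_classes):
    """Error (confusion) matrix, overall accuracy, and Kappa from
    paired reference and classified labels (values 0..n_classes-1)."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, c in zip(reference, classified):
        m[r, c] += 1                     # rows: reference, cols: classified
    total = m.sum()
    overall = np.trace(m) / total        # fraction on the diagonal
    # Kappa corrects overall accuracy for chance agreement
    chance = (m.sum(axis=0) * m.sum(axis=1)).sum() / total**2
    kappa = (overall - chance) / (1 - chance)
    return m, overall, kappa

# Invented ground truth and classification for 10 sample points
reference  = [0, 0, 1, 1, 1, 2, 2, 2, 2, 2]
classified = [0, 1, 1, 1, 1, 2, 2, 2, 2, 0]
matrix, overall, kappa = accuracy_stats(reference, classified, 3)
```

The off-diagonal cells of the matrix show which classes are confused with which, which is exactly the information you need to decide which signatures to improve.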
If you like, you can save the accuracy assessment reports to text files.
For more information on which classes can be improved, you can also
obtain the report as an error matrix.
Select File | Close from the menu bars of the ERDAS IMAGINE Text
Editors.
Select File | Close from the Accuracy Assessment dialog menu bar.
If you are satisfied with the accuracy of the classification, select File |
Close from the Viewer menu bar.
If you are not satisfied with the accuracy of the classification, you can
further analyze the signatures and classes using methods discussed in this
tour guide. You can also use the thematic raster layer in various ERDAS
IMAGINE utilities, including the Image Interpreter, Raster Editor, and
Spatial Modeler, to modify the file.
You can also reduce the number of classes. Here the classes are reduced to
five by recoding; the point classes are likewise recoded to 5 classes.
Deciduous and evergreen forest can be combined into forest, all
agriculture classes (bare agr, maize, and grass) into agriculture, and
bare and building into build.
! You will notice that in the ground truth for accuracy assessment some
classes are not represented adequately. Given the assessment procedure as
developed by ERDAS (random point locations for which you would have to
identify the ground truth), it is possible that classes with limited
coverage get few selected ground truth samples. The accuracy for the class
Water can now only be 0% (no sample correct), 50% (1 of the 2 samples), or
100% (both samples correctly classified). Usually one would identify areas
similar to (but not the same as!) the ones used for training the
classifier and assess the cross correlation between the ground truth areas
and the classification. This implementation in ERDAS goes beyond the scope
of this exercise; one would use Annotation to draw the polygons, convert
these to a raster file, and use the Summary tool from Interpreter | GIS
Utilities.
If time allows, use the Map Composer chapter of the ERDAS Tour Guide
and prepare a concept thematic map of your classification result.