
Remote Sensing Mapping Techniques

Magaly Koch, PhD

Center for Remote Sensing


Boston University, USA

mkoch@bu.edu
Lineaments

Fractures and faults are essentially linear features and therefore can be identified by visual interpretation of images. The true nature of such linear elements may not be known during the interpretation, so it is preferable to speak of lineaments. These can be defined as non-human-made, more or less linear features, possibly associated with faults and large fractures.

In digital image processing, lineaments can be better identified and mapped by applying an image processing function called edge enhancement or sharpening. The next slides illustrate how spatial filters work in remotely sensed images.
Spatial Filtering

Spatial filtering encompasses another set of digital processing functions used to enhance the appearance of an image. Spatial filters are designed to highlight or suppress specific features in an image based on their spatial frequency. Spatial frequency is related to the concept of image texture. Rough-textured areas of an image, where the changes in tone are abrupt over a small area, have high spatial frequencies, while smooth areas with little variation in tone over several pixels have low spatial frequencies. In contrast to spectral filters, which serve to block or pass energy over various spectral ranges, spatial filters emphasize or deemphasize image data of various spatial frequencies.

[Example images: low spatial frequency (smooth) vs. high spatial frequency (rough) texture]
Image Convolution
Spatial filtering is a local operation in that pixel values in an original image are
modified on the basis of the gray levels of neighboring pixels. This is done by
moving a window or kernel over each pixel in the image, applying a mathematical
calculation using the pixel values under that window and replacing the central pixel
with the new value. The window is moved along in both the row and column
directions one pixel at a time and the calculation is repeated until the entire image
has been filtered and a new image has been generated. By varying the calculation
performed and the weightings of the individual pixels in the filtered window, filters
can be designed to enhance or suppress different types of features. This procedure
is also called image convolution.
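A minimal Python sketch of this moving-window procedure (the array names and the example smoothing kernel are illustrative, not from the slides):

import numpy as np

def convolve2d(image, kernel):
    """Slide a kernel over the image and replace each pixel with the
    weighted sum of its neighborhood (edge pixels are left unfiltered)."""
    k = kernel.shape[0] // 2
    out = image.astype(float).copy()
    for r in range(k, image.shape[0] - k):
        for c in range(k, image.shape[1] - k):
            window = image[r - k:r + k + 1, c - k:c + k + 1]
            out[r, c] = np.sum(window * kernel)
    return out

# a 3x3 mean (low-pass) kernel, normalized so the output stays in range
mean_kernel = np.ones((3, 3)) / 9.0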
Convolution Matrix
Worked example: a 3x3 mean kernel (common kernel sizes are 3x3, 5x5, 7x7, etc.) is centered on a target pixel, and each kernel weight is multiplied by the image DN value beneath it in the target neighborhood.

Image DN values (5x5 excerpt; the target neighborhood is the upper-left 3x3 block):

3  3  4  4  5
2  3  3  4  4
1  2  2  3  3
1  1  2  4  4
1  2  3 20 20

Kernel (3x3, all weights 1):

1 1 1
1 1 1
1 1 1

New value of the target pixel:

Sum(Kernel x Neighborhood) / Sum(Kernel)
= (1x3 + 1x3 + 1x4 + 1x2 + 1x3 + 1x3 + 1x1 + 1x2 + 1x2) / 9
= 23 / 9 ≈ 2.56
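A quick numeric check of this example in Python, with the neighborhood typed in directly:

import numpy as np

neighborhood = np.array([[3, 3, 4],
                         [2, 3, 3],
                         [1, 2, 2]], dtype=float)
kernel = np.ones((3, 3))
# Sum(Kernel x Neighborhood) / Sum(Kernel) = 23 / 9
print((kernel * neighborhood).sum() / kernel.sum())  # 2.555... ≈ 2.56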
[Illustration: the 3x3 window sliding across the 5x5 image one pixel at a time, in both the row and column directions]
Directional Filters

Arrows indicate the direction of linear features enhanced by the edge-enhancement filter. Contrast along a linear feature increases, as does its width.

Left: TM band 5 of a part of the central Ethiopian Rift, with an edge-enhancement filter, which sharpens contrast only a little. Distance E-W is 37 km. Right: Lineament interpretation showing mainly tensional rift faults.
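For illustration, directional edge enhancement uses an asymmetric kernel. The two kernels below are standard Prewitt-style examples (assumed here, not taken from the slides) and could be applied with the convolution sketch shown earlier:

import numpy as np

# Prewitt-style kernel that enhances N-S oriented edges
# (strong response where DN values change in the E-W direction)
ns_edges = np.array([[-1, 0, 1],
                     [-1, 0, 1],
                     [-1, 0, 1]], dtype=float)

# one common diagonal variant, enhancing NE-SW oriented edges
ne_sw_edges = np.array([[ 0,  1, 1],
                        [-1,  0, 1],
                        [-1, -1, 0]], dtype=float)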
Data Fusion

Radar images are different from multispectral images. Therefore they contain complementary information, highlighting different aspects of the landforms and outcropping rocks.

The combination of radar and multispectral images, also termed data fusion, requires that both have the same geometric properties; in other words, geocoding has to be done to ensure that pixel sizes and coordinate systems are the same. This requires resampling of at least one of the images, usually the radar image. To make a radar image of hilly terrain suitable for fusion, a good DEM has to be available to place the radar pixels into the proper planimetric position because of relief distortion. Examples of fusion of radar and multispectral images are shown in the next slides. After fusing both image types, a land cover classification was performed.
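A minimal sketch of the geocoding/resampling step using the rasterio library; the file names, band index, and the choice of bilinear resampling are assumptions for illustration:

import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

# resample the radar band onto the grid (pixel size, CRS) of the optical image
with rasterio.open("optical.tif") as opt, rasterio.open("radar.tif") as sar:
    radar_on_optical_grid = np.zeros((opt.height, opt.width), dtype=np.float32)
    reproject(
        source=rasterio.band(sar, 1),
        destination=radar_on_optical_grid,
        dst_transform=opt.transform,
        dst_crs=opt.crs,
        resampling=Resampling.bilinear,
    )
# the resampled radar band can now be stacked with the optical bands for fusion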
[Figure: ASTER image and PALSAR image, west of Aswan, Egypt; multispectral optical data combined with radar imagery]

[Figure: classified image of the Azraq Watershed, Jordan, from RADARSAT fused with Landsat TM]
A = agriculture
B = sandstone
C = limestone
D = basalt
E = eolian sediments
F = structures
Intensity, Hue and Saturation Transform

The color of any pixel can be represented by two models, the RGB color model or the Intensity, Hue and Saturation (IHS) color model. The IHS model uses three criteria:

1. Intensity, representing the brightness value (with a sensitivity determined by the radiometric resolution);
2. Hue, indicating the dominant wavelength of color;
3. Saturation, representing the degree of purity of color (it may be considered the amount of white mixed in with the color).

These values together are called the IHS system and can be represented as a cylinder, with the main axis measuring intensity, distance from the center measuring saturation, and position along the rim indicating hue.
RGB and IHS Color Models
Intensity, Hue and Saturation Transform

The IHS transform is useful in two ways:

1. As a method of image enhancement: IHS values are separated from an image, enhanced separately and then recombined for a clearer composite;
2. As a means of combining co-registered images from different sources (e.g., TM imagery and radar or aerial photographs); this method is called data fusion.

IHS data fusion is performed in four basic steps (see the sketch after this list):

1. Register the high-resolution image (e.g., panchromatic band) and the multispectral image (e.g., Landsat TM);
2. Convert the TM image from RGB to IHS coordinates;
3. Substitute the pan band for the intensity coordinate;
4. Convert back to RGB space.
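A minimal Python sketch of these four steps, using scikit-image's HSV transform as a stand-in for a true IHS transform (the two color models differ in detail; the inputs are assumed to be co-registered and scaled to [0, 1]):

import numpy as np
from skimage.color import rgb2hsv, hsv2rgb
from skimage.transform import resize

def ihs_style_fusion(rgb, pan):
    """rgb: co-registered multispectral composite, (rows, cols, 3);
    pan: higher-resolution panchromatic band, (rows, cols)."""
    # step 1 (registration) is assumed done; upsample RGB to the pan grid
    rgb_hi = resize(rgb, pan.shape + (3,), order=1)
    # step 2: RGB -> HSV (value plays the role of intensity)
    hsv = rgb2hsv(rgb_hi)
    # step 3: substitute the pan band for the intensity component
    hsv[..., 2] = pan
    # step 4: convert back to RGB
    return hsv2rgb(hsv)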
Spectral Ratioing

The process of dividing the pixels in one image or band by the corresponding pixels in a second image or band is known as image or band ratioing. Ratio images serve to highlight subtle variations in the spectral responses of various surface covers. By ratioing the data from two different spectral bands, the resultant image enhances variations in the slopes of the spectral reflectance curves between two different spectral ranges that may otherwise be masked by the pixel brightness variations in each of the bands. Ratios are useful for differentiating between areas of stressed and non-stressed vegetation, discriminating rock types, etc.
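A minimal numpy sketch of band ratioing; the band arrays and the zero-guard threshold are illustrative assumptions:

import numpy as np

def band_ratio(numerator, denominator, eps=1e-6):
    """Divide two co-registered bands pixel by pixel,
    guarding against division by zero in dark pixels."""
    num = numerator.astype(float)
    den = denominator.astype(float)
    return np.where(np.abs(den) > eps, num / den, 0.0)

# e.g., the TM 5/7 ratio listed below highlights clays and carbonates:
# clay_carbonate = band_ratio(tm_band5, tm_band7)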
Spectral Ratioing

Other common ratios for the discrimination of soil and rock components of the surface are:

TM Band Ratio 5/7 → Clays, Carbonates
TM Band Ratio 5/1 → Mafic Igneous Rocks
TM Band Ratio 3/1 → Iron Oxides
Advanced Supervised Classification Methods
Airborne Hyperspectral Sensor

DAIS 7915 hyperspectral sensor

• Hyperspectral: 79 bands
• 4 spectrometers
• 5 m spatial resolution (from 3300 m flight altitude)
• Swath width 3 km

Spectrometer   Band group   Bands   Wavelength (µm)
1              VIS/NIR      32      0.5 - 1.05
2              SWIR I       8       1.5 - 1.8
3              SWIR II      32      1.9 - 2.5
               MIR          1       3.0 - 5.0
4              TIR          6       8.7 - 12.5
Airborne Hyperspectral Sensor

Location of DAIS wavebands along the solar spectrum

Field Campaign: Spectral Sampling
[Photos: field spectrometer measurements (ground truthing) used for spectral curve matching and for building a spectral library]
Spectral Feature Extraction Across Scales

[Image panels: ETM+ bands 7, 4, 2 (satellite multispectral sensor); DAIS bands 76, 50, 25 (airborne hyperspectral sensor); ground photo (field spectroradiometer)]

Construction of a Spectral Library with Field Data

Saline soil B - moderately saline

Soil properties                        Mineralogy (%)
Texture             Clay loam          halite            8.0
Color (dry)         10YR6/2            gypsum           57.9
Color (moist)       10YR4/2            starkeyite        3.0
Organic matter (%)  3.6                hexahydrite      11.3
pH (1:2.5 H2O)      8.9                calcite           5.6
EC (dS/m)           12.1               quartz            7.1
Carbonates (%)      9.0                phyllosilicates   7.1 (S)
Fe2O3 (%)           0.12
Hyperspectral Image Cube
Spectral Curve Matching

[Scatter plot: spectra plotted as vectors in the band a / band b feature space]
Spectral Angle Mapper

• The SAM method is based on the estimation of the spectral similarity in the (n-dimensional) feature space.
• A signature is described by a vector starting at the coordinate system's origin.
• The length of the vector corresponds to reflection intensity.
• The difference between spectra is described by the angle between them.
• By assessing the angle difference between a pixel and a reference spectrum, an image can be divided into any number of classes.

[Diagram: pixel and reference spectrum shown as vectors in the band a / band b plane, separated by the spectral angle]
Spectral Angle Mapper

Spectral Angle Mapper (SAM) is an automated method for comparing image spectra to individual
spectra or to a spectral library. The algorithm determines the similarity between two spectra by
calculating the spectral angle between them, treating them as vectors in n-D space, where n is the
number of bands.

Two different materials are represented in a 2D scatter plot by a point for each given illumination, or as a
line (vector) for all possible illuminations. Because SAM uses only the direction of the spectra, not the
length, SAM is insensitive to variations in illumination. Poorly illuminated pixels fall closer to the origin of
the scatter plot. The color of a material is defined by the direction of its unit vector. SAM determines the
similarity of an unknown spectrum to a reference spectrum.
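A minimal numpy sketch of the spectral angle computation and a nearest-angle classification; the array shapes and the angle threshold are illustrative assumptions, not part of the original SAM description:

import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between two spectra, each a 1-D array of n band values."""
    cos_t = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def sam_classify(cube, refs, max_angle=0.10):
    """cube: (rows, cols, bands) image; refs: (k, bands) reference spectra.
    Returns the per-pixel index of the best-matching reference,
    or -1 where no reference is within max_angle radians."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(float)
    p = flat / np.linalg.norm(flat, axis=1, keepdims=True)
    r = refs / np.linalg.norm(refs, axis=1, keepdims=True)
    angles = np.arccos(np.clip(p @ r.T, -1.0, 1.0))  # (pixels, k)
    best = angles.argmin(axis=1)
    best[angles.min(axis=1) > max_angle] = -1  # unclassified
    return best.reshape(cube.shape[:2])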
Application: Mineral Mapping

Left: Hyperion true-color image of northern Death Valley. The geology consists principally of a Jurassic-age intrusion exhibiting quartz-sericite-pyrite hydrothermal alteration. Right: SAM classification against a spectral library, color coded as follows: red = calcite, yellow = dolomite, green = muscovite #1, blue = muscovite #2, brown = muscovite #3, cyan = zeolite, purple = silica.
Application: Land Cover Mapping
Readings & Other Resources

• Remote Predictive Mapping 3. Optical Remote Sensing – A Review for Remote Predictive Geological Mapping in Northern Canada
https://earthobservatory.nasa.gov/features/FalseColor/page1.php

• Using spectral imaging to map surface geology and mineralogy
https://www.energymining.sa.gov.au/minerals/geoscience/geoscientific_data/remote_sensing

• What is remote sensing and what is it used for?
https://www.usgs.gov/faqs/what-remote-sensing-and-what-it-used?qt-news_science_products=0#qt-news_science_products
THANK YOU AND QUESTIONS?
