
02-08-2015

Raghunath Jha

Introduction to Remote Sensing & Image Processing

Introduction
Remote sensing (RS) is the collection of information about an object
without direct contact, i.e. from a distance.
• The analogue unit of data collection is the photograph (aerial or
from space), from a camera.
• The digital unit is the pixel, created by using a scanner.
• Scale is a function of distance from the object, system quality and
resolution.
• Analogue or digital systems can generate various images along the
electromagnetic spectrum.
• All photographs are 'images', but not all digital images are photographs.
• A digital image processing system must be RASTER, but may also
have vector capabilities.
• VECTOR systems may have some raster options, such as image
display.
• Traditional uses of remote sensing are interpretation, location &
updating
• Digital applications are classification & feature extraction


Milestones In The History Of Remote Sensing
1839 Invention of photography
1910s First use of aerial photography (World War I: photo interpretation)
1920s Development of photogrammetry for mapping
1940s Military use of radar (World War II)
1950s Use of colour photography and infra-red
1960 Launch of the first weather satellite (TIROS-1), followed by Nimbus
1962 Term 'remote sensing' first appeared
1972 Launch of Landsat 1 (initially named ERTS-1) with the multispectral scanner (MSS)
1982 Landsat 4 and 'the next generation sensor': Thematic Mapper (TM)
1985 Unix workstations and improved PCs enable widespread use of digital imagery and GIS
1986 SPOT-1 satellite (France)
1990s Other satellites, e.g. from India, Japan and the USSR; airborne spectrometers (e.g. CASI)
2000s High-resolution private-sector satellites


Launch Schedule of Different Satellites
http://www.itc.nl/research/products/sensordb/Launch_Schedule.aspx


Electro-Magnetic Radiation
• The Electro-Magnetic Radiation (EMR) reflected or emitted from an object is the usual source of remote sensing data.
• Remote sensing technology makes use of the wide range of the Electro-Magnetic Spectrum (EMS), from very short-wave gamma rays to very long radio waves.
• The wavelength regions of electro-magnetic radiation have different names, ranging from gamma ray, X-ray, ultraviolet (UV), visible light and infrared (IR) to radio wave, in order from the shorter wavelengths.


Optical Wavelength Region

• The optical wavelength region, an important region for remote sensing applications, is further subdivided as follows:


Platforms & Sensors

• Platform: the satellite carrying the remote sensing device.

• Sensor: the remote sensing device recording wavelengths of energy.


Spectral Signatures
• Spectral signature graphs show the relative amount of reflection or
emission from an object across different wavelengths. Every object
varies in the amount of energy it reflects at each wavelength;
otherwise, for example, all objects would appear black, white or a
shade of grey on colour film, since they would reflect equally in the
red, green and blue (RGB) wavelengths.


Types of Satellites
• Satellite orbits can be one of two kinds:
a. Sun-synchronous: the satellite passes over and captures imagery at
the same local time of day. Satellites and sensors designed for
terrestrial mapping and earth-resource monitoring are generally
sun-synchronous.
b. Geostationary: the satellite orbits with the earth and is
permanently over the same location. Weather and communication
satellites are geostationary.


Resolutions in RS
• Spectral resolution (how many bands of data)
• Radiometric resolution (how many bits of data)
• Spatial resolution (distance on the ground represented by one pixel)
• Temporal resolution (time before the same area is imaged again)


Digital Data Formats & Systems

Raster data
• Scanner input signal
• Signal quantification
• Mapping a continuous value into a
discrete digital value
• Digital grid/array arrangement of images
• Values in the image
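The quantification step above can be sketched in code: a continuous scanner signal is mapped onto discrete digital values arranged in a grid. The radiance values below are invented for illustration; real sensors apply a calibrated transfer function rather than a simple min-max mapping.

```python
import numpy as np

# Map a continuous input signal onto discrete 8-bit DN values (0-255).
# The radiance values below are invented for illustration.
radiance = np.array([0.0, 50.8, 152.4, 254.0])   # continuous signal
lo, hi = radiance.min(), radiance.max()
dn = np.round((radiance - lo) / (hi - lo) * 255).astype(np.uint8)
```

Each continuous value becomes one of 256 discrete grid values; arranging these DNs in a row/column array gives the digital image.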


Satellite name    Spectral         Radiometric   Spatial                 Temporal
Landsat TM        7 bands          8 bit         30 m                    16 days
IRS               4 bands          8 bit         24 m                    24 days
ADEOS             4 bands          8 bit         16 m                    41 days
IRS-1D            4 bands          8 bit         5.8 m                   60 days
IRS Cartosat-2    1 band (stereo)  10 bit        < 1.0 m                 126 days
Ikonos            4 bands          8 bit         1 m (4 m colour)        6 months
QuickBird         4 bands          8 bit         0.65 m (2.5 m colour)   6 months
GeoEye            4 bands          8 bit         0.5 m                   6 months
NOAA              5 bands          8 bit         1.1 km                  12 hours
MODIS             2 bands          8 bit         250 m                   1 day

Data Storage Formats

• Band Sequential (BSQ)
• Band Interleaved by Line (BIL)
• Band Interleaved by Pixel (BIP)
• Run-length encoding
• Desktop formats: TIFF, GIF, JPEG, PBM, PCX, Sun Raster, TGA, XPM (X Window)
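The three interleaving layouts can be sketched for a tiny two-band, 2x2-pixel image (all values invented); each layout stores the same pixels in a different byte order:

```python
import numpy as np

# Sketch of the three common band-interleaving layouts for a tiny
# 2-band, 2x2-pixel image (values invented for illustration).
bands, rows, cols = 2, 2, 2
img = np.arange(bands * rows * cols).reshape(bands, rows, cols)  # (band, row, col)

bsq = img.flatten()                      # Band SeQuential: whole band 1, then band 2
bil = img.transpose(1, 0, 2).flatten()   # Band Interleaved by Line: line 1 of each band, ...
bip = img.transpose(1, 2, 0).flatten()   # Band Interleaved by Pixel: all bands per pixel
```

BSQ is convenient for single-band access, BIP for per-pixel multispectral access, and BIL is a compromise between the two.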


IMAGE PROCESSING

More detail during the training on ERDAS IMAGINE


Digital Image Processing for RS


• Image Acquisition
• Image Registering (georeferencing)
• Image corrections
• Image Enhancement
• Image interpretation (classification)
• Conversion


Radiometric Correction

• Radiometric correction is used to modify DN values in order to
account for noise, that is, contributions to the DN that are a
function not of the feature being sensed but of the atmosphere or
the sensor itself (also referred to as "pre-processing").
• This correction is done before the data are delivered.
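As a minimal sketch of such a correction (the gain, offset and haze values below are invented, not those of any real sensor): raw DNs are converted to at-sensor radiance with a linear calibration, and a dark-object estimate of atmospheric haze is then subtracted.

```python
import numpy as np

# Sketch of a radiometric correction on an invented 2x2 image:
# linear gain/offset calibration, then dark-object haze subtraction.
dn = np.array([[10, 50], [120, 255]], dtype=np.float64)
gain, offset = 0.76, -1.5          # hypothetical sensor calibration
radiance = gain * dn + offset      # DN -> at-sensor radiance
haze = radiance.min()              # dark-object estimate of atmospheric noise
corrected = radiance - haze
```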


Geometric Correction

• Also referred to as (geo)rectification. Most remote sensing data
contain distortions preventing overlay with other GIS layers. While
aerial photographs have many sources of error, the main source of
geometric error in satellite data is the satellite path orientation.


Sources of Geometric Error


Systematic distortions
• Scan skew: the forward motion of the platform during each mirror
sweep, resulting in the ground swath not being normal to the polar axis.
• Mirror-scan velocity and panoramic distortion: along-scan distortion
(pixels at the edge are slightly larger).
• Earth rotation: the earth rotates during scanning, offsetting the
rows (about 122 pixels per Landsat scene).
Non-systematic distortions
• Altitude and attitude variations of the satellite; topographic
elevation.


Geocorrection process
• The geocorrection process consists of two
steps: rectification and resampling.


Rectification
• Data pixels must be related to exact ground locations, most commonly
measured in UTM coordinates, which can be used to: correct data by
position, register imagery from different dates, and register different
resolutions.

General procedure:
a. Identify known Ground Control Points (GCPs): normally easily
identified points on the ground/map and image, e.g. road intersections
(they should be stable, definite and well spaced)
b. Compare and tabulate pixel/row positions on the image with known
locations on the ground (the coordinate system is usually UTM)
c. Submit these locations to a least-squares regression.
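Step (c) can be sketched as a first-order (affine) polynomial fitted by least squares from image (column, row) positions to map (easting, northing) coordinates. The GCP coordinates below are invented and happen to follow an exact affine relationship:

```python
import numpy as np

# Least-squares fit of an affine image-to-map transform from invented GCPs.
img_xy = np.array([[10, 20], [200, 30], [50, 180], [220, 210]], dtype=float)
map_en = np.array([[500010, 4199980], [500200, 4199970],
                   [500050, 4199820], [500220, 4199790]], dtype=float)

A = np.column_stack([img_xy, np.ones(len(img_xy))])        # [x, y, 1] per GCP
coef_e, *_ = np.linalg.lstsq(A, map_en[:, 0], rcond=None)  # easting model
coef_n, *_ = np.linalg.lstsq(A, map_en[:, 1], rcond=None)  # northing model

def to_map(x, y):
    # predicted map coordinate for any image pixel
    return coef_e @ [x, y, 1], coef_n @ [x, y, 1]
```

In practice more GCPs than unknowns are used, and the regression residuals (RMS error) indicate the quality of the fit.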


Resampling
Locations are fitted to a new grid based on map coordinates, using
round values, which may require a new pixel size to fit the UTM
system, e.g. MSS 80 m -> 50 m, TM 30 m -> 25 m, IRS 5.8 m -> 5 m.

a. Nearest Neighbour
Each pixel in the new grid acquires the value of the closest pixel in
the old grid: the easiest to compute, and it retains the original DNs.
Disadvantage: the image may look blocky, and features can be up to
0.5 pixels off.

b. Bilinear Interpolation
Each new pixel gets a value from the distance-weighted average of the
4 (2 x 2) nearest pixels; takes longer processing time.
Looks smoother (but creates synthetic DNs, different from the original
numbers).

c. Cubic Convolution
New pixel values are computed by weighting the 16 (4 x 4) surrounding
pixels; the smoothest image, but the longest computing time, and the
DNs are more synthetic.
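Option (a) can be sketched as follows: each cell of a finer 25 m grid looks up the closest cell of the original 30 m grid, so only original DNs appear in the output. Grid sizes and DN values are invented, and pixel-centre offsets are ignored for simplicity:

```python
import numpy as np

# Nearest-neighbour resampling sketch: 3x3 image at 30 m -> new grid at 25 m.
old = np.arange(9).reshape(3, 3)        # invented 3x3 image (DNs 0..8)
old_size, new_size = 30.0, 25.0

n = int(round(old.shape[0] * old_size / new_size))    # new grid dimension
idx = np.minimum((np.arange(n) * new_size / old_size).astype(int),
                 old.shape[0] - 1)                    # closest old cell per new cell
resampled = old[np.ix_(idx, idx)]       # lookup only: original DNs are retained
```

Note how some rows/columns repeat: this is the source of the "blocky" look mentioned above.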


Digital Image Processing for RS


• Image Acquisition
• Image Registering (georeferencing)
• Image corrections
• Image Enhancement
• Image interpretation (classification)
• Conversion


Bands, Channels, Image Planes & RGB Guns
• Bands: are captured (scanned) by the sensor.
• Channels: are bands stored in a database; no limit!
• Image planes: are where the channels are loaded for display; best
limited to 8.
• RGB: the three colour guns available for display.


Display Modes
A monitor has 3 guns (RGB), so only 3
channels can be displayed at once.
• Three different channels compose a colour
composite.
• The same one channel in all three guns creates a
grayscale image.
• One channel can also be displayed in
pseudocolour (PC).
• Density slice: certain DNs are classed or
thresholded into a colour.


Image Enhancement
Statistical summary
• Image histograms
• Histogram transforms
– Linear stretch
– Histogram equalization enhancement
Data are partitioned into DN-range classes such that an equal number
of pixels falls into each class. The greatest contrast is seen among
pixels with the greatest frequency of occurrence in the image.
– Root/logarithmic enhancement
Useful for skewed DN distributions; the transfer function is
logarithmic in shape.
– Piecewise linear stretch
• Density slicing & pseudo-colour enhancements
• DN thresholding
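The histogram-equalization transform above can be sketched on an invented 10-pixel band: the cumulative histogram becomes the look-up table, so heavily populated DNs are spread across the widest output range.

```python
import numpy as np

# Histogram equalization sketch on an invented one-band image: the
# cumulative fraction of pixels at or below each DN becomes the new value.
band = np.array([10, 10, 10, 50, 50, 200, 200, 200, 200, 255])
values, counts = np.unique(band, return_counts=True)
cdf = np.cumsum(counts) / band.size            # cumulative pixel fraction
lut = dict(zip(values, np.round(cdf * 255).astype(int)))
equalized = np.array([lut[v] for v in band])
```

The most frequent DN (200, four pixels) ends up furthest from its neighbours in the output, which is exactly the "greatest contrast among the most frequent pixels" behaviour described above.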


Display Considerations

• Most data are acquired as 8-bit values (0-255).
• The data in each band rarely fill the 0-255 range.
• An 8-bit display enables every DN of one band to be displayed in
contrast.
• A 24-bit display enables every DN of three bands to be displayed.
• An 8-bit display for 3 channels requires grouping display values
using a Look-Up Table (LUT).


Image Enhancement & Filters


Spatial Filters
A common practice in digital image processing is to enhance imagery
with spatial filters. Filters are commonly used for edge enhancement,
noise removal, and the smoothing of high-frequency data. These filters
work by enhancing or suppressing spatial detail to improve visual
interpretation of the final image.
• Low-pass filter
• High-pass filter
• Edge-detection filter
• Many other types are available in newer software packages


How filters work


• Spatial filtering works by passing a two-dimensional rectangular array
of weighted values over each pixel in a digital image. The pixel in the
centre of the array is recalculated as the average of itself and the
surrounding pixels, weighted by the values in the filter array. The
array then shifts over to the next cell and performs the same operation
on the following centre pixel. This process of evaluation is called a
two-dimensional convolution, and the filter is often called a
convolution kernel.
• Most filters are local-area or clique functions that operate in the
spatial domain.
• A clique or moving-window function defines a small sub-window with
dimensions of 3x3 or larger, usually with odd-numbered dimensions
(e.g. 3X3, 5X5, 7X7, 9X9, etc.), with pixels numbered from the top-left.


Example of filter
A simple mean value filter just uses a convolution kernel where all
the values equal 1/p, where p is the number of pixels in the moving
window (e.g. for a 3X3 window there are 9 pixels in the window).
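A sketch of the math for a mean-value 3X3 filter (image values invented; edges handled here by replicating border pixels, one of several common conventions):

```python
import numpy as np

# Mean-value 3x3 filter as an explicit two-dimensional convolution:
# every kernel weight equals 1/p, with p = 9 pixels in the window.
img = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]], dtype=np.float64)
kernel = np.full((3, 3), 1.0 / 9.0)

padded = np.pad(img, 1, mode="edge")    # replicate border pixels
out = np.zeros_like(img)
for i in range(img.shape[0]):
    for j in range(img.shape[1]):
        # weighted sum of the 3x3 neighbourhood centred on (i, j)
        out[i, j] = np.sum(padded[i:i+3, j:j+3] * kernel)
```

The centre pixel becomes the mean of all nine values, while border pixels average their replicated neighbourhood.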


Band Ratios & Indices


• Band ratioing is the simplest of multispectral techniques, and could
be considered a type of GIS 'overlay'.
• A band ratio is a new channel of data created by dividing two sets
of band digital numbers for each pixel:
New ratio channel DN = a * (DN band x / DN band y), where 'a' is a
numerical scaling factor.


Image Arithmetic
Band ratios are the result of 'division'; it is also possible to use
the other arithmetic operators:
a. Image subtraction
Yields the difference between two bands; the result will
include values that are + and - (requiring scaling or a 16
bit signed channel): useful for showing changes through
time with two or more image dates.
b. Image addition
Used to create an overall or average image channel, e.g. (1 +
2 + 3) / 3
c. Image multiplication
Often used in a masking process, where one layer is either 1
or 0 (e.g. land or water)
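The three operations can be sketched on two invented 2x2 single-band images (a 16-bit signed type holds the negative differences mentioned above):

```python
import numpy as np

# Image arithmetic sketch on invented single-band images from two dates.
date1 = np.array([[100, 120], [90, 200]], dtype=np.int16)
date2 = np.array([[110, 115], [60, 210]], dtype=np.int16)
water_mask = np.array([[1, 1], [0, 1]], dtype=np.int16)   # 1 = land, 0 = water

difference = date2 - date1        # a. subtraction: change image, + and - values
average = (date1 + date2) // 2    # b. addition: overall/average channel
masked = date2 * water_mask       # c. multiplication: masking out water pixels
```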

Indices
An index involves a 'normalised difference', which compensates for
additive effects as well as the multiplicative effects negated by
ratioing. The most common is the Normalised Difference Vegetation
Index (NDVI):
NDVI = (IR - Red) / (IR + Red) [for TM: (TM4 - TM3) / (TM4 + TM3)]
This yields values between -1 and +1, necessitating a 32-bit channel
or scaling (e.g. multiplying by 255).
NDVI is used extensively to present a measure of vegetation amount or
biomass, especially in regional and global estimates.
Other 'normalised indices' include:
NDSI = (TM2 - TM5) / (TM2 + TM5) (S = Snow)
NDGI = (TM4 - TM2) / (TM4 + TM2) (G = Green)
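The NDVI formula can be sketched on invented near-infrared (TM4) and red (TM3) DNs, computed in floating point so the channel can hold the full -1 to +1 range:

```python
import numpy as np

# NDVI sketch on invented NIR (TM4) and red (TM3) band values.
nir = np.array([[50.0, 200.0], [180.0, 30.0]])
red = np.array([[50.0, 40.0], [60.0, 90.0]])

ndvi = (nir - red) / (nir + red)   # vegetation pushes NDVI towards +1
```

Healthy vegetation reflects strongly in the near-infrared relative to red, so vegetated pixels score high, while water and bare surfaces score near or below zero.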


Digital Image Processing for RS


• Image Acquisition
• Image Registering (georeferencing)
• Image corrections
• Image Enhancement
• Image interpretation (classification)
• Conversion


Classification
– Introduction: the need for classification
• Converts data to information: the need to identify and group features
• The basic goal of any mapping exercise (air photo interpretation,
digitizing)
• Can be done in a primitive sense using a density slice on one band
– Complications in automated classification
a. Resolution (high or low)
b. The computer uses only DNs (the digital equivalent of tone), versus
manual interpretation, which uses tone, texture, shape, pattern,
shadow, etc.


Causes of Spatial Variations in Reflectance
• There are many causes of spatial variations in
reflectance, e.g.
• NATURAL RESOURCES (e.g. forest stands)
- purity of stand, background/understory,
age/maturity, density, health/diseases, moisture,
edge (mixed) pixels, sun angle (topography)
• URBAN / HUMAN (e.g. roads or residences)
- amount of grass, types of material, weathering,
pollution/cars, moisture, edge (mixed)
pixels, angle to sun (building shape)


Unsupervised Classification
• The user has no or limited 'a priori' knowledge of the area; the
computer divides the pixels into natural groupings (clusters), based
on the differences between groupings.
a. User pre-selects the bands/channels and the classification
(clustering) algorithm
b. User determines (approximately) how many classes
c. Algorithm starts with seed points and iterates based on the group
means
d. Continues until there is relatively little change, then fixes the
groupings (spectral classes)
e. User then determines what the clusters signify (and merges some if
necessary)
• Software steps: determine input channels, classification algorithm,
number of classes and iterations; run.
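Steps (c) and (d) can be sketched as a minimal k-means clustering of pixels treated as 2-band DN vectors; the seed points and pixel values are invented, and real packages use more elaborate seeding and convergence tests:

```python
import numpy as np

# Minimal k-means sketch of unsupervised classification: seed points
# iterate towards the group means. Pixel values are invented.
pixels = np.array([[10, 12], [12, 10], [11, 11],
                   [200, 190], [195, 205], [198, 199]], dtype=np.float64)
means = pixels[[0, 3]].copy()           # seed points, one per class

for _ in range(10):                     # iterate until (nearly) stable
    d = np.linalg.norm(pixels[:, None] - means[None], axis=2)
    labels = d.argmin(axis=1)           # assign each pixel to the closest mean
    means = np.array([pixels[labels == k].mean(axis=0) for k in range(2)])
```

The resulting spectral classes still need step (e): the user decides what each cluster signifies on the ground.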


Supervised Classification
The user has some 'a priori' information: homogeneous known areas, or
training sites, can be identified from ground knowledge or air photos.
Multivariate statistics are calculated for these known sites based on
their multi-spectral DNs.
All pixels are then evaluated as to which class they belong, based on
the training sites:
a. Determine input channels and algorithm
b. Create ground training sites (polygons) for each class
c. Create class signatures and check for differences (separability)
d. Run the classifier
• Minimum distance: each pixel is assigned to the class whose mean is
closest to the pixel (in n dimensions)
• Parallelepiped: each pixel is assigned to the class whose range it
falls in (overlap is a problem: double assignment)
• Maximum likelihood: each pixel is assigned to the class for which it
has the highest probability (slower than the other two)
Software steps: determine input channels and algorithm, create
training sites and classes, run.
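The minimum-distance rule can be sketched as follows; the two training sites and their 2-band DNs are invented:

```python
import numpy as np

# Minimum-distance supervised classifier sketch: assign each pixel to
# the class whose training-site mean is closest in n-dimensional DN space.
training = {
    "water":  np.array([[10, 40], [12, 38], [11, 42]], dtype=np.float64),
    "forest": np.array([[60, 90], [64, 85], [58, 95]], dtype=np.float64),
}
means = {cls: sites.mean(axis=0) for cls, sites in training.items()}

def classify(pixel):
    # Euclidean distance to each class mean; pick the smallest.
    return min(means, key=lambda cls: np.linalg.norm(pixel - means[cls]))
```

Parallelepiped and maximum-likelihood classifiers replace the distance rule with a box test on the class ranges, or a probability computed from the class covariance.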


Application of RS General
• Land cover: water, soil, rock, vegetation, grass, etc., distinguished
according to moisture:
Water is cooler (darker) during the day, but reversed at night, due to
heat transfer.
Vegetation is cooler than its surroundings in the day and warmer at
night (leaves have moisture; transpiration lowers leaf temperature).
The contrast is less extreme with coniferous trees (less leaf
area/moisture).
Grass is warmer during the day than forest, and cooler (darker) at
night.
Damp ground: the effect of absorbed water makes it cooler in the day
and warmer at night.
Pavement: warmer by day and night due to absorbed energy, with a high
thermal capacity that retains heat for some hours after sunset.
Packed earth is a bit cooler than concrete and asphalt.


Applications - Specific
• Building heat losses, pipeline
leaks (airborne scanners)
• Animal censuses, forest fires
• Linear feature detections: faults
• Thermal effluents, water pollution
• Water temperatures (fish)
• Volcanic eruptions, geothermal activity
