Digital Image Processing
KSRSAC Module I
Dr S Natarajan
Professor and Key Resource Person
Department of Computer Science and Engineering
PES Institute of Technology
Bengaluru
natarajan@pes.edu
9945280225
Module-1 Digital Image Processing
Digital Data:
Introduction
• Satellite data acquisition – Storage and retrieval – Data Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software.
The image as a function: I([h,k]) returns the value at row h, column k – for example
I([50,50]) = [40 70 200], an RGB triple
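The notation I([h,k]) reads the image as a function returning the value at row h, column k; a colour image returns an RGB triple. A minimal sketch in NumPy (the image array here is synthetic, not from the module's data):

```python
import numpy as np

# Synthetic 100 x 100 RGB image; values are illustrative only
I = np.zeros((100, 100, 3), dtype=np.uint8)
I[50, 50] = [40, 70, 200]      # store an RGB triple at row 50, column 50

print(I[50, 50])               # the value of I([50, 50])
```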
Digital Image Processing
• Digital Image Processing (DIP)
– Is computer manipulation of pictures, or images, that
have been converted into numeric form
– Typical operations include:
• Image Compression
• Image Warping
• Contrast Enhancement
• Blur Removal
• Feature Extraction
• Pattern Recognition
Image Processing Goals
• Digital image processing is
a subclass of signal processing
specifically concerned with picture images
– Goals
• To improve image quality for
– human perception (subjective)
– computer interpretation (objective)
Distinction Between Fields
• Image processing is not just image-in/image-out
[Figure: low-level image processing is image-in/image-out (texture mapping, antialiasing, noise reduction, contrast enhancement, filtering); computer vision takes images in and produces scene descriptions; AI works with descriptions in and descriptions out.]
What is Digital Image Processing?
Digital image processing focuses on two
major tasks
– Improvement of pictorial information for
human interpretation
– Processing of image data for storage,
transmission and representation for
autonomous machine perception
There is some argument about where image
processing ends and fields such as image
analysis and computer vision begin
What is DIP? (cont…)
The continuum from image processing to
computer vision can be broken up into low-,
mid- and high-level processes
Low-level process:   input image, output image
                     (examples: noise removal, image sharpening)
Mid-level process:   input image, output attributes
                     (examples: object recognition, segmentation)
High-level process:  input attributes, output understanding
                     (examples: scene understanding, autonomous navigation)
History of Image Processing
• Wilhelm Conrad Röntgen, 8 November 1895
– discovery of X-rays; first medical application 1896
– first X-ray image published; live experiment demonstrated at the
Physikalisch-Medizinische Gesellschaft, Würzburg, Germany
Taken on 22.12.1895
Anna Berthe Röntgen
„Hand mit Ringen“
(hand with rings)
Remote Sensing Examples
History of Image Processing
• Early 1920s –
Bartlane cable picture transmission system
- used to transmit newspaper images across the Atlantic
- images were coded, sent by telegraph, printed by a special
telegraph printer.
- took about three hours to send an image, first systems
supported 5 grey levels
Historical Background 1921
• Emergence of medical
imaging
Applications of Image Processing
Medical Diagnosis
Head CT Scan
Ultrasound
Industrial Applications
• Electronic Defect Detection
Product Testing/QA
Security Applications
Whole Body Scan Vehicle Identification
Biometrics & Finance Applications
Fingerprint Verification Currency verification
Personnel Verification
Seismic Analysis
Mountain Ranges in the Tibetan Plateau
• Nuclear Medicine
• Astronomical Observations
Gamma Ray Imaging-2
1. A patient is injected with a radioactive isotope that emits gamma rays as it
decays
2. Images are produced from the emissions collected by gamma-ray
detectors
Geostationary
Satellites
A geostationary satellite acquires
remote-sensing data from an altitude of
approximately 36,000 kilometres directly
over the equator; its orbital period
matches Earth's rotation, so it remains
over the same point on the surface.
Image Source: cimss.ssec.wisc.edu
Types of Satellites
Polar-Orbiting
Satellites
A polar-orbiting satellite passes close
to the poles on each orbit; as Earth
rotates beneath it, the entire surface
can be imaged over time, so these
satellites are mostly used for Earth
observation.
Jensen, 2004
Linear Array CCD
Jensen, 2004
Area Array CCD
Jensen, 2004
Digitization
Jensen, 2004
Overview of how Digital Remotely Sensed Data
are Transformed into Useful Information
Jensen, 2004
Remote Sensing
System used for
Multispectral and
Hyperspectral Data
Collection
Jensen, 2004
Digital Frame Photography Data Collection
Multispectral Scanner Data Collection
Multispectral Scanner Operation
Linear Array (WhiskBroom) Data Collection
Linear Array (Pushbroom) Data Collection
Burj Khalifa by IKONOS
Area Array Hyperspectral Data Collection
Landsat
Multispectral
Scanner (MSS)
and Landsat
Thematic
Mapper (TM)
Sensor System
Characteristics
Jensen, 2004
Landsat Multispectral Scanning System (MSS)
[Figure: Landsat spacecraft components –
attitude-control subsystem, solar array,
wideband recorder electronics, attitude
measurement sensor, data collection antenna,
Multispectral Scanner (MSS), and three
Return Beam Vidicon (RBV) cameras.]
Jensen, 2004
Landsat MultiSpectral Scanner
Characteristics
Satellites - Landsat 1, 2, & 3
Launched - July 23, 1972; January 22, 1975; March 5, 1978
Sensor - MultiSpectral Scanner (MSS)
Altitude - 917 km
Coverage - 185 x 185 km
Jensen, 2004
Landsat 7 Enhanced Thematic Mapper Plus
Jensen, 2004
Landsat Enhanced Thematic Mapper Plus
Satellite - Landsat 7
Launched - April 15, 1999
Sensor - Enhanced Thematic Mapper Plus (ETM+)
Altitude - 705 km
Coverage - 185 x 185 km
http://geo.arc.nasa.gov/sge/landsat/landsat.html
Landsat 7
Jensen, 2004
Advanced Very High
Resolution Radiometer
(AVHRR) Imagery
Jensen, 2003
Global Normalized Difference Vegetation Index
(NDVI) Image Produced Using Advanced Very High
Resolution Radiometer (AVHRR) Imagery
Jensen, 2003
Advanced Very High
Resolution Radiometer
(AVHRR) Imagery
Jensen, 2003
SPOT Characteristics
Satellites - SPOT 1, 2, 3, and 4
Launched - Feb. 22, 1986; Jan. 22, 1990; Sept. 26, 1993; Mar. 24, 1998
Sensor - High Resolution Visible & panchromatic
Altitude - 822 km
Coverage - 60 x 60 km
             Wavelength       Electromagnetic   Resolution
             (micrometers)    Region            (meters)
Band 1       0.50-0.59        green             20
Band 2       0.61-0.68        red               20
Band 3       0.79-0.89        near-infrared     20
Pan (S1-3)   0.51-0.73        visible to NIR    10
Pan (S4)     0.61-0.68        red               10
Band 4 (S4)  1.58-1.75        SWIR              20
http://www.spot.com
SPOT-5 Characteristics
Satellite - SPOT 5
Launched - May 3, 2002
Sensor - High Resolution Visible & panchromatic
Altitude - 830 km
Coverage - 60 x 60 km
http://www.spot.com
Comparison of the Detail of
30 x 30 m Landsat TM Band 3 Data
and SPOT 10 x 10 m Panchromatic
Data of Charleston, SC
Courtesy of
SPOT Image, Inc.
Jensen, 2004
Indian Remote Sensing
Satellite (IRS-1D)
Panchromatic Image of
Downtown San Diego,
CA at 5 x 5 m
Jensen, 2004
IKONOS
Space Imaging
IKONOS Characteristics
http://www.spaceimaging.com/
QuickBird
Digital
Globe
QuickBird Characteristics
http://www.digitalglobe.com/
QuickBird
Color
2.8 meters
November 3, 2002
QuickBird
CIR
2.8 meters
November 3, 2002
QuickBird
Panchromatic
0.7 meters
November 3, 2002
OrbView-3
OrbImage
Specifications
Earth Observing System - Terra Instruments
ASTER - Advanced Spaceborne Thermal Emission and Reflection Radiometer
CERES - Clouds and the Earth’s Radiant Energy System
MISR - Multi-angle Imaging Spectroradiometer
MODIS - Moderate-resolution Imaging Spectroradiometer
MOPITT - Measurement of Pollution in the Troposphere
Jensen, 2000
Earth Observing System Measurements
Jensen, 2000
Earth Observing System Measurements
Jensen, 2000
Earth Observing System Measurements
Spectral Range VNIR 0.4 - 14.4 µm, SWIR 1.6 - 2.5 µm, TIR 8 - 12 µm
Spatial Resolution 15 m (VNIR : 3 bands)
30 m (SWIR: 6 bands)
90 m (TIR: 5 bands)
Jensen, 2000
Terra ASTER Visible – Near Infrared Bands - 15 meters
First-day global coverage; 2,330 km swath width
IRS 1-A IRS P-6 Cartosat 2-C
IRS 1-A
SAN
- 2 x MDS9509
SAN Switch
[Flowchart, assuming 1.5 TB/day of ingest:
1. Data ingested to high-performance FC SAN storage.
2. Two online archive copies written to tape; a vault copy sent offsite.
3. Data older than 66 days is moved off high-performance storage; newer data
   is maintained on high-performance storage.
4. Data older than 266 days leaves medium-performance storage; newer data is
   maintained on medium-performance storage. End of process.]
BIL (band interleaved by line):
10 15 17 20 21 20 50 50 90 90 120 150 100 120 103 210 250 250 190 245
15 16 18 21 23 76 66 55 45 120 176 166 155 85 150 156 166 155 415 220
17 18 20 22 22 80 80 60 70 150 85 80 70 77 135 180 180 160 170 200
18 20 22 24 25 100 93 97 101 105 103 90 70 120 133 200 0 123 222 215

BSQ (band sequential):
10 15 17 20 21 15 16 18 21 23 17 18 20 22 22 18 20 22 24 25
20 50 50 90 90 76 66 55 45 120 80 80 60 70 150 100 93 97 101 105
120 150 100 120 103 176 166 155 85 150 85 80 70 77 135 103 90 70 120 133
210 250 250 190 245 156 166 155 415 220 180 180 160 170 200 200 0 123 222 215

BIP (band interleaved by pixel):
10 20 120 210 15 50 150 250 17 50 100 250 20 90 120 190 21 90 103 245
15 76 176 156 16 66 166 166 18 55 155 155 21 45 85 415 23 120 150 220
17 80 85 180 18 80 80 180 20 60 70 160 22 70 77 170 22 150 135 200
18 100 103 200 20 93 90 0 22 97 70 123 24 101 120 222 25 105 133 215
Band sequential (BSQ) format stores
information for the image one band at a
time. In other words, data for all pixels for
band 1 is stored first, then data for all
pixels for band 2, and so on.
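The three interleavings are just different axis orders of the same band/line/pixel cube. A sketch in NumPy (array contents are synthetic):

```python
import numpy as np

# A tiny scene: 4 bands, 4 lines, 5 pixels per line (synthetic values)
bands, lines, pixels = 4, 4, 5
bsq = np.arange(bands * lines * pixels).reshape(bands, lines, pixels)

# BIL: for each line, one row per band -> shape (lines, bands, pixels)
bil = bsq.transpose(1, 0, 2)

# BIP: all band values of each pixel stored together -> (lines, pixels, bands)
bip = bsq.transpose(1, 2, 0)

# Written to disk in C order, each array is exactly that file format
assert bil[0, 2, 3] == bsq[2, 0, 3]   # line 0, band 2, pixel 3
assert bip[1, 4, 0] == bsq[0, 1, 4]   # line 1, pixel 4, band 0
```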
• HDF 5
• HDF EOS 5
• NetCDF Classic
• NetCDF-4/HDF5 File Format
• OGC KML (Keyhole Markup Language)
• ASCII File Format Guidelines for Earth Science Data
What is TIFF
• TIFF (Tagged Image File Format) is Aldus-
Adobe's public-domain, tag-based file format
for storing and interchanging raster images.
• TIFF is a rich format for raster image data
from sources such as:
– remotely sensed imagery
– scanners
– painting, drawing, and rendering output from CAD
– Results of geographic analysis.
TIFF File Structure
The TIFF format has a three-level
hierarchy. From highest to lowest,
the levels are:
1) A file Header.
2) One or more directories called
IFDs (Image File Directories),
containing codes and their data, or
pointers to the data.
3) Data.
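The file header is small enough to parse by hand: 2 bytes of byte order ("II" for little-endian or "MM" for big-endian), the magic number 42, then a 4-byte offset to the first IFD. A minimal sketch:

```python
import struct

def parse_tiff_header(data: bytes):
    """Parse the 8-byte TIFF header: byte order, magic 42, first IFD offset."""
    order = data[:2]
    if order == b"II":        # little-endian ("Intel")
        fmt = "<"
    elif order == b"MM":      # big-endian ("Motorola")
        fmt = ">"
    else:
        raise ValueError("not a TIFF file")
    magic, ifd_offset = struct.unpack(fmt + "HI", data[2:8])
    if magic != 42:
        raise ValueError("bad TIFF magic number")
    return fmt, ifd_offset

# A hand-built little-endian header whose first IFD starts at byte 8
header = b"II" + struct.pack("<HI", 42, 8)
print(parse_tiff_header(header))   # ('<', 8)
```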
GeoTIFF
Objects: the HDF4 data model has eight basic objects; the HDF5 data model
has two primary objects.
API: HDF4 has interfaces for the six types of objects supported, which allow
a user program to perform common operations in a few calls; equivalent
operations may require many more calls to HDF5, even for very simple tasks.
LGSOWG (Super Structure)
- 132849100101.HDF
- 132849100101.JPG
- 132849100101.META
Sample CDINFO file
• PRODUCT 1 :
• Product Number : 132849100101
• Satellite ID : O2
• Sensor : OCM
• Path-Row : 009-013
• Date of Acquisition : 27FEB2013
• Product Code : STLCCL2HJ
• Orbit Number : 2278
• Scan Lines : 6026
• Pixels : 3730
• Bytes Per Pixel : 4
• Image Record Length (Bytes) : 14920
• No. of Volumes : 1/1
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software
Jensen, 2003
ENVI Interface
Jensen, 2003
ERDAS Interface
Jensen, 2003
Image Processing System
Hardware /Software Considerations
* Central Processing Unit
* Operating System and Applications Software
* Random Access Memory (RAM)
* Mass Storage (hard disk, CD, DVD)
* Arithmetic Co-processor
* Image Processing Memory (Graphics card)
- CRT Screen display resolution
- CRT screen color resolution
* Serial versus Parallel Processing
Jensen, 2003
Moore’s Law
Jensen, 2003
Image Processing System Considerations
Type of Computer:
* Mainframe (> 32-bit CPU)
* Workstation (> 32-bit CPU)
* Personal Computers (16 to 32-bit CPU)
Jensen, 2003
Image Processing System
Hardware /Software Considerations
Jensen, 2003
Remote Sensing Data Formats
* Band interleaved by line (BIL)
* Band interleaved by pixel (BIP)
* Band sequential (BSQ)
* Run-length encoding
Jensen, 2003
Image Processing System
Hardware /Software Considerations
* Central Processing Unit (CPU)
* Random Access Memory (RAM)
* Mass Storage (hard disk, CD, DVD)
* Arithmetic Co-processor
* Image Processing Memory (Graphics card)
- CRT Screen display resolution
- CRT screen color resolution
* Serial versus Parallel Processing
* Storage and archiving capability
Jensen, 2003
Serial Versus Parallel Processing
* Requires more than one CPU
* Requires software that can parse (distribute) the
digital image processing to the various CPUs by
- task, and/or
- line, and/or
- column.
Jensen, 2003
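Parsing the work out by line can be sketched with a process pool; the per-line task here is a hypothetical contrast stretch, not any specific package's routine:

```python
from multiprocessing import Pool

import numpy as np

def stretch_line(line):
    """Per-line contrast stretch: the unit of work handed to each CPU."""
    lo, hi = int(line.min()), int(line.max())
    if hi == lo:
        return np.zeros_like(line, dtype=np.uint8)
    return ((line - lo) * 255.0 / (hi - lo)).astype(np.uint8)

if __name__ == "__main__":
    img = np.random.default_rng(0).integers(0, 4096, (512, 512))
    with Pool() as pool:                          # one worker per CPU by default
        rows = pool.map(stretch_line, list(img))  # distribute by line
    out = np.vstack(rows)
    print(out.shape, int(out.min()), int(out.max()))
```

The same pool could instead distribute by task or by column, as the slide notes.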
Remote Sensing Data
Storing and Archiving
Considerations
* Type of Media
* Access to a read-write mechanism that works.
Jensen, 2003
Components of an Image Processing
System
Longevity
Jensen, 2003
Image Processing System Functions
* Preprocessing (Radiometric and Geometric)
* Display and Enhancement
* Information Extraction
* Image Lineage
* Image and Map Cartographic Composition
* Geographic Information Systems (GIS)
* Integrated Image Processing and GIS
* Utilities
Jensen, 2003
The Major Commercial
Digital Image Processing Systems
- ERDAS
- ENVI
- IDRISI
- ERMapper
- PCI
- many photogrammetric packages
Jensen, 2003
The Major Public
Digital Image Processing Systems
- JPL VICAR-IBIS
- MultiSpec
- C-Coast
Jensen, 2003
Image Processing Software
ERDAS IMAGINE    Arc/Info live link, no conversion needed
PCI EASI/PACE    Arc/Info GeoGateway for multiple formats
Arc/Info GRID    various basic raster formats: tif, sun, gis, lan, img, bil,
                 bip, bsq, grass, adrg, rlc
ArcView          ERDAS lan, img, grid, tif
ENVI/IDL         imports shapefiles, e00, dxf, USGS SDTS, dlg; exports
                 ArcView grid; uses own vector format
ER Mapper        various raster formats, import of dxf and SeisWorks; uses
                 own vector format
MATLAB           Image Processing Toolbox
OpenCV (Intel)   img, tif, jpg, jp2 (JPEG 2000), png, pgm
scikit-image
Scilab
Other packages: CVIPtools, IDRISI, ILWIS, OpenJ, GraphicsMagick, blippAR...
Key Stages in Digital Image Processing
Problem Domain → Image Acquisition → Image Enhancement → Image Restoration →
Morphological Processing → Segmentation → Representation & Description →
Object Recognition
(with Colour Image Processing and Image Compression as supporting stages)
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software
Sources of geometric distortion:
• Sensor related
• Earth related
Geometric Corrections
• Changes the location of pixels within the image
• Landsat MSS: the Earth-rotation (skew) correction scales with cos(i − 90°),
where i is the orbit inclination.
For Landsat or SPOT, cos(9°) = 0.98769
• NOAA AVHRR: the correction varies with time
• Landsat MSS: skew of about 395 m, or 5 pixels
Non-square pixels
[Figure: spacecraft motion during each scan makes the ground footprint of one pixel non-square.]
Correcting geometric distortions
[Figure: mapping between map coordinates (x, y) and image coordinates (u, v).]
For each new pixel (x, y) in the map plane we find the equivalent point
(u, v) in the image plane. We need to know the functions u(x, y), v(x, y)
– we can get these in two ways
• Nearest neighbour – take the value of the single pixel closest to (u, v)
• Bilinear – use the four pixels p, q, g, h surrounding (u, v) and
interpolate between them
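Bilinear interpolation between the four surrounding pixels can be sketched as follows (the variable names p, q, g, h follow the slide's figure):

```python
import numpy as np

def bilinear(img, u, v):
    """Bilinear interpolation at fractional coordinates (u, v) = (column, row),
    using the four pixels that surround the point."""
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    du, dv = u - u0, v - v0
    p = img[v0, u0]                    # the four neighbours
    q = img[v0, u0 + 1]
    g = img[v0 + 1, u0]
    h = img[v0 + 1, u0 + 1]
    top = p * (1 - du) + q * du        # interpolate along the upper pair
    bot = g * (1 - du) + h * du        # and along the lower pair
    return top * (1 - dv) + bot * dv   # then between the two rows

img = np.array([[0.0, 10.0], [20.0, 30.0]])
print(bilinear(img, 0.5, 0.5))         # centre of the four pixels: 15.0
```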
u = a0 + a1*x + a2*y + a3*xy + a4*x^2 + a5*y^2
v = b0 + b1*x + b2*y + b3*xy + b4*x^2 + b5*y^2

We find the a and b coefficients by choosing a number of points which
we can identify in both the map and the image. These are called
CONTROL POINTS.
Usually we choose more than 6 control points and use a least-squares
solution. The second equation gives another set of equations for the b
coefficients, which are solved exactly like the a set.
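The least-squares fit for the a and b coefficients can be sketched with NumPy; the control points below are hypothetical (a pure shift of (2, 3) between map and image):

```python
import numpy as np

def fit_polynomial(xy_map, uv_img):
    """Fit u = a0 + a1*x + a2*y + a3*x*y + a4*x^2 + a5*y^2 (and likewise v
    with b coefficients) from control points by least squares."""
    x, y = xy_map[:, 0], xy_map[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    a, *_ = np.linalg.lstsq(A, uv_img[:, 0], rcond=None)   # a coefficients
    b, *_ = np.linalg.lstsq(A, uv_img[:, 1], rcond=None)   # b coefficients
    return a, b

# Seven hypothetical control points; image coords = map coords shifted by (2, 3)
xy = np.array([[0, 0], [1, 0], [0, 1], [2, 1], [1, 3], [4, 2], [3, 5]], float)
uv = xy + np.array([2.0, 3.0])
a, b = fit_polynomial(xy, uv)
print(np.round(a, 6))   # approximately [2, 1, 0, 0, 0, 0]
```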
Geometric Rectification
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software
Campbell 10.4
Radiometric Corrections
1. Correction for detector errors
• Line drop
• Destriping
(both random and periodic)
2. Sun angle correction
– for comparison and mosaic images acquired from different
time of the year
3. Atmospheric corrections
• Histogram adjustment
• Atmospheric radiative transfer models
4. Conversion from DN to radiance
5. Conversion from radiance to reflectance
6. BRDF corrections
Sensor corrections
Line Dropout
43 47 51 57
40 46 50 54
0 0 0 0
38 40 42 50
Images: Lillesand-Kiefer
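The repair for a dropped line replaces it with the average of the lines above and below; on the slide's numbers:

```python
import numpy as np

def fix_line_dropout(img):
    """Replace wholly-zero scan lines with the mean of the adjacent lines."""
    out = img.astype(float).copy()
    for i in range(1, img.shape[0] - 1):
        if np.all(img[i] == 0):
            out[i] = (out[i - 1] + out[i + 1]) / 2.0
    return out

img = np.array([[43, 47, 51, 57],
                [40, 46, 50, 54],
                [ 0,  0,  0,  0],     # the lost scan line
                [38, 40, 42, 50]])
print(fix_line_dropout(img)[2])       # [39. 43. 46. 52.]
```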
Campbell 10.4
Sensor corrections
Striping
Local averaging
Normalization
Images: Lillesand-Kiefer
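Normalization-style destriping can be sketched as matching each detector's line statistics to the whole-scene statistics (the 16-detector layout matches TM; the image here is simulated):

```python
import numpy as np

def destripe(img, n_detectors=16):
    """Match each detector's line mean/std to the whole-scene mean/std
    (a simple destriping by normalization)."""
    out = img.astype(float).copy()
    mu, sigma = out.mean(), out.std()
    for d in range(n_detectors):
        lines = out[d::n_detectors]     # every n-th line comes from detector d
        m, s = lines.mean(), lines.std()
        if s > 0:
            out[d::n_detectors] = (lines - m) / s * sigma + mu
    return out

rng = np.random.default_rng(1)
img = rng.normal(100.0, 10.0, (64, 64))
img[0::16] += 25.0                      # simulate one miscalibrated detector
out = destripe(img)
print(out[0::16].mean() - out.mean())   # stripe bias removed (close to 0)
```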
Campbell 10.4
Errors: Sensor Failure & Calibration
Sensor problems show as striping or missing lines of data:
Missing data due to sensor failure results in a line of DN values -
every 16th line for TM data .. As there are 16 sensors for each
band, scanning 16 lines at a time (or 6th line for MSS).
[Figure: pre-anomaly, post-anomaly, and after the correction algorithm.]
Sun angle correction
• Position of the sun relative to the
earth changes depending on
time of the day and the day of
the year
• Solar elevation angle: time- and
location-dependent (its complement is the solar zenith angle)
Landsat 7 ETM+ colour-infrared composites acquired at different sun angles.
(A) The left image was acquired with a sun elevation of 37° and (B) the right
image with a sun elevation of 42°; the difference in reflectance is clearly
visible. (C) The left image corrected to match the right image.
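The correction itself divides each value by the sine of the solar elevation (equivalently, the cosine of the solar zenith angle); the DN values below are illustrative, not taken from the ETM+ scenes:

```python
import numpy as np

def sun_angle_correct(dn, sun_elevation_deg):
    """Normalize for illumination: divide by sin(solar elevation)."""
    return dn / np.sin(np.radians(sun_elevation_deg))

# The same target seen at 37 and 42 degree sun elevations (illustrative DNs)
print(round(sun_angle_correct(60.2, 37), 1),
      round(sun_angle_correct(66.9, 42), 1))   # both normalize to ~100
```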
Spectral Irradiance & Earth-Sun Distance
Radiometric Correction
[Figure, progressive build: the at-sensor signal comprises direct
illumination, scattered illumination, adjacent reflections, and path
radiance.]
Two approaches to atmospheric correction:
1) Histogram adjustment
• Clear sky
• Hazy sky
2) Physical models
Campbell 10.4
Atmospheric Interference: clouds
clouds affect all visible and IR bands, hiding features twice: once with the
cloud, once with its shadow. We CANNOT eliminate clouds, although we
might be able to assemble cloud-free parts of several overlapping scenes (if
illumination is similar), and correct for cloud shadows (advanced).
Campbell 10.4
Simple Atmospheric Corrections – Histogram Adjustment
[Figure: brightness-value histograms. A clear atmosphere gives a wide range
of brightness values; under a hazy atmosphere the added brightness of the
atmosphere raises the minimum value and the histogram shape changes.]
Campbell 10.4
Atmospheric Correction Models
Physical models simulate the physical process of scattering
at the level of individual particles and molecules
Absorption by gases
scattering by aerosols
LOWTRAN 7
MODTRAN
CAM5S, 6S
Campbell 10.4
Atmospheric Effects

L_app = ρTE/π + L_p

where
L_app = apparent radiance measured by the sensor
ρ = reflectance of the object
T = atmospheric transmittance
E = irradiance on the object (incoming)
L_p = path radiance (haze), from the atmosphere and not from the object
Radiometric Correction
Rayleigh Scattering
• Caused by particles much smaller than a wavelength
• Declines with the fourth power of the wavelength
• Responsible for blue skies, red sunsets
• Key element in radiometric correction of images
Haze Reduction
• Aerial and satellite images often contain haze.
Presence of haze reduces image contrast and
makes visual examination of images difficult.
• Due to Rayleigh scattering
– Particle size responsible for effect smaller
than the radiation’s wavelength (e.g. oxygen
and nitrogen)
• Haze has an additive effect resulting in higher
DN values
• Scattering is wavelength dependent
• Scattering is more pronounced in shorter
wavelengths and negligible in the NIR
Haze Reduction
• One means of haze compensation in
multispectral data is to observe the radiance
recorded over target areas of zero reflectance
• For example, the reflectance of deep clear
water is zero in NIR region of the spectrum
• Therefore, any signal observed over such an
area represents the path radiance
• This value can be subtracted from all the
pixels in that band
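The dark-object subtraction described above can be sketched as follows; the band values and the water mask are made up for illustration:

```python
import numpy as np

def dark_object_subtract(band, dark_mask):
    """Estimate path radiance as the minimum signal over a zero-reflectance
    target (e.g. deep clear water in the NIR) and subtract it band-wide."""
    haze = band[dark_mask].min()
    return np.clip(band.astype(int) - haze, 0, None)

band = np.array([[14, 60, 90],
                 [13, 55, 80],
                 [12, 50, 70]])
water = np.zeros_like(band, dtype=bool)
water[:, 0] = True                       # hypothetical deep-water pixels
print(dark_object_subtract(band, water))
```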
Haze Reduction
(a) The aerial image before haze removal (b) The aerial image after haze removal
Atmospheric Corrections
L_tot = ρTE/π + L_p   ⇒   ρ = π(L_tot − L_p) / (TE)

where
L_tot = radiance measured by the sensor
ρ = reflectance of the target
E = irradiance on the target
T = transmissivity of the atmosphere
L_p = path radiance (radiance due to the atmosphere)
L & K 7.2
Atmospheric Corrections
E = E0 cos(θs) / d²
(E0: exoatmospheric solar irradiance; θs: solar zenith angle; d: Earth-Sun
distance in astronomical units)
L & K 7.2
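The DN-to-radiance and radiance-to-reflectance steps can be sketched together; the gain, offset, and ESUN values below are hypothetical, not a published sensor calibration:

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Linear sensor calibration: L = gain * DN + offset."""
    return gain * dn + offset

def radiance_to_reflectance(L, esun, sun_elev_deg, d_au):
    """Top-of-atmosphere reflectance: rho = pi * L * d^2 / (ESUN * cos(zenith)),
    with zenith = 90 deg - elevation and d in astronomical units."""
    cos_zenith = np.sin(np.radians(sun_elev_deg))
    return np.pi * L * d_au**2 / (esun * cos_zenith)

L = dn_to_radiance(dn=120, gain=0.76, offset=-1.5)       # hypothetical values
rho = radiance_to_reflectance(L, esun=1533.0, sun_elev_deg=42.0, d_au=1.0)
print(round(L, 2), round(rho, 3))   # 89.7 0.275
```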
Radiometric Corrections
1. Correction for detector errors
• Line drop
• Destriping
(both random and periodic)
2. Sun angle correction
– for comparison and mosaic images acquired from different
time of the year
3. Atmospheric corrections
• Histogram adjustment
• Atmospheric radiative transfer models
4. Conversion from DN to radiance
5. Conversion from radiance to reflectance
6. BRDF(Bidirectional Reflectance Distribution Function) corrections
What is a BRDF?
A typical light-matter interaction scenario: incoming light from a small
neighbourhood of directions strikes a small surface element and is
reflected.

BRDF = f_r(θ_i, φ_i; θ_o, φ_o), the ratio of reflected radiance leaving in
direction (θ_o, φ_o) to the irradiance arriving from direction (θ_i, φ_i),
with angles measured relative to the surface normal.
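As a concrete special case, an ideal Lambertian (perfectly diffuse) surface has a constant BRDF, f_r = ρ/π, independent of all four angles; a short sketch:

```python
import numpy as np

def lambertian_brdf(rho):
    """BRDF of an ideal diffuse surface: f_r = rho / pi for all angles."""
    return rho / np.pi

def reflected_radiance(rho, irradiance, theta_i_deg):
    """L_o = f_r * E * cos(theta_i) for the Lambertian case."""
    return lambertian_brdf(rho) * irradiance * np.cos(np.radians(theta_i_deg))

# Reflectance 0.3, 1000 W/m^2 arriving along the surface normal
print(round(reflected_radiance(0.3, 1000.0, 0.0), 1))   # 95.5
```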
Bidirectional Reflectance Distribution Function
(BRDF) Correction
Structures like trees cast shadows that change the amount of
light that reaches a sensor depending on its
view zenith angle
[Figure: before and after BRDF correction.]