
Digital Image Processing

KSRSAC Module I
Dr S Natarajan
Professor and Key Resource Person
Department of Computer Science and Engineering
PES Institute of Technology
Bengaluru
natarajan@pes.edu
9945280225
Module-1 Digital Image Processing
Digital Data:
Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software.

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
The Digital Image

r:  101 109 110
     99  90  94
    112 123 108

g:  123 131 141
    121 112 118
    134 145 132

b:   38  46  65
     75  66  86
     88  99 100

The Digital Image

An image can be viewed as a function I(h, k) of pixel coordinates, with h and k starting at 0; for example, I([50,50]) = [40 70 200] is the RGB value at pixel (50, 50).
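Treating the image as an indexed array, the slide's example pixel can be reproduced with numpy (the 100 x 100 image size is an assumption for illustration):

```python
import numpy as np

# A tiny 100 x 100 RGB image, stored as a (rows, cols, bands) array.
# I([h, k]) on the slide corresponds to indexing a row/column pair.
img = np.zeros((100, 100, 3), dtype=np.uint8)

# Set the pixel at row 50, column 50 to the slide's example value [40 70 200].
img[50, 50] = [40, 70, 200]

r, g, b = img[50, 50]          # read the three band values back
print(r, g, b)                 # -> 40 70 200
```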
Digital Image Processing
• Digital Image Processing (DIP)
– Is computer manipulation of pictures, or images, that
have been converted into numeric form
Again: Digital Image Processing
• Digital Image Processing (DIP)
– Is computer manipulation of pictures, or images, that
have been converted into numeric form

– Typical operations include
• Image Compression
• Image Warping
• Contrast Enhancement
• Blur Removal
• Feature Extraction
• Pattern Recognition
Image Processing Goals
• Digital image processing is
a subclass of signal processing
specifically concerned with picture images

– Goals
• To improve image quality for
– human perception (subjective)
– computer interpretation (objective)

• Develop methods and applications


– to compress images
– to provide efficient storage and transmission
Image Processing, Computer Vision and
Computer Graphics
Image Processing – Image to Image
Computational Photography -- Image to Image
Computer Vision -- Image to Model
Computer Graphics -- Model to Image

Distinction Between Fields
• Image processing is not just image-in/image-out

• But that framing does help distinguish it from related fields


Related Fields

[Diagram: image processing (image in / image out) sits between computer graphics (model to image) and computer vision (image to model), across three levels:]
• Low level – texture mapping, antialiasing, filtering; noise reduction, contrast enhancement (image in, image out)
• Mid level – segmentation, edge detection; extract attributes (image in, features out)
• High level – object recognition, cognitive functions; scene description (features in, description out)
What is Digital Image Processing?
Digital image processing focuses on two
major tasks
– Improvement of pictorial information for
human interpretation
– Processing of image data for storage,
transmission and representation for
autonomous machine perception
Some argument about where image
processing ends and fields such as image
analysis and computer vision start
What is DIP? (cont…)
The continuum from image processing to
computer vision can be broken up into low-,
mid- and high-level processes
Low Level Process Mid Level Process High Level Process
Input: Image Input: Image Input: Attributes
Output: Image Output: Attributes Output: Understanding
Examples: Noise Examples: Object Examples: Scene
removal, image recognition, understanding,
sharpening segmentation autonomous navigation

In this course we will


stop here
Historical Background 1827
Remote Sensing Examples

• First aerial photo credited to Frenchman Félix Tournachon, in the Bièvre Valley, 1858.
• Boston from a balloon (oldest preserved aerial photo), 1860, by James Wallace Black.
History of Image Processing
• Wilhelm Conrad Röntgen, 8 November 1895
  – discovery of X-rays; first medical application 1896
  – first X-ray image published; live experiment demonstrated at the Physikalisch-Medizinische Gesellschaft, Würzburg, Germany

Taken on 22.12.1895: Anna Bertha Röntgen, „Hand mit Ringen“ (hand with rings)
Remote Sensing Examples

• Kites (still used!): panorama of San Francisco, 1906.
• Up to 9 large kites used to carry a camera weighing 23 kg.
History of Image Processing
• Early 1920s –
Bartlane cable picture transmission system
- used to transmit newspaper images across the Atlantic
- images were coded, sent by telegraph, printed by a special
telegraph printer.
- took about three hours to send an image, first systems
supported 5 grey levels
Historical Background 1921

• Image transmitted via Telegraph


– using the Bartlane cable picture transmission system
– Images were transferred by submarine cable between London
and New York
– Printed using a special printer rigged with typefaces simulating
halftones
Historical Background 1922
• Alternate printing method using tape perforations
  – improved tonal quality
  – improved resolution
• Still limited to 5 levels of grey
History of DIP (cont…)
Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images
– New reproduction processes based on photographic techniques
– Increased number of tones in reproduced images

[Figures: early digital image vs. improved 15-tone digital image. Images taken from Gonzalez & Woods, Digital Image Processing (2002)]

Historical Background – Late 1920s

• Improved up to an amazing 15 levels of gray


History of Digital Image Processing (cont…) 1960-1970
1960s: Improvements in computing technology and the onset of the space race led to a surge of work in digital image processing
– 1964: Computers used to improve the quality of images of the moon taken by the Ranger 7 probe
– Such techniques were used in other space missions, including the Apollo landings

[Figure: a picture of the moon taken by the Ranger 7 probe minutes before landing. Images taken from Gonzalez & Woods, Digital Image Processing (2002)]
History of Digital Image Processing
• When computer vision first started out in the early 1970s, it was
viewed as the visual perception component of an ambitious
agenda to mimic human intelligence and to endow robots with
intelligent behaviour
• At the time, it was believed by some of the early pioneers of Artificial
Intelligence and Robotics (at places such as MIT, Stanford, and
CMU) that solving the “visual input” problem would be an easy
step along the path to solving more difficult problems such as
higher-level reasoning and planning
• According to one well-known story, in 1966, Marvin Minsky at MIT
asked his undergraduate student Gerald Jay Sussman to “spend the
summer linking a camera to a computer and getting the computer
to describe what it saw” (Boden 2006, p. 781)
History of Digital Image Processing
1960s – Researchers (Rosenfeld and Pfaltz 1966; Rosenfeld and Kak 1976) had a desire to recover the three-dimensional structure of the world from images and to use this as a stepping stone towards full scene understanding
1970s
• Winston (1975) and Hanson and Riseman (1978) provide two nice collections of classic papers from this early period
History of Digital Image Processing 1970-1980

• Emergence of medical imaging

• 1979 Nobel Prize in Physiology or Medicine for the invention of Computerized Axial Tomography (CAT)
  – Sir Godfrey Hounsfield
  – Professor Allan Cormack
History of Digital Image Processing
1980s : The use of digital image processing
techniques has exploded and they are now
used for all kinds of tasks in all kinds of
areas
– Image enhancement/restoration
– Artistic effects
– Medical visualisation
– Industrial inspection
– Law enforcement
– Human Computer Interfaces(HCI)
History of Digital Image Processing 1980-1990
• Satellite and Remote Sensing
  – LANDSAT: 8 spectral bands, 15 to 60 meter spatial resolution, 16-day temporal resolution
  – NOAA GOES: satellite sensor array

[Figures: fire in the San Bernardino National Forest, image from 2009, courtesy of NASA; multispectral image of Hurricane Gustav, courtesy of NOAA, 2008]
History of Digital Image Processing 1990-2000

• Morphing and visual effects algorithms
• JPEG and MPEG compression
• Wavelet transforms

[Figures: image morph from Michael Jackson's music video "Black or White"; Dr. Who image morphs, antonybennison.com]
History of Digital Image Processing 2000 and beyond
• Image based modelling and rendering
rely on a set of two-dimensional images of a scene to generate a
three-dimensional model and then render some novel views of this
scene
• Texture synthesis and inpainting
• Computational photography
• Feature based recognition
• MRF inference algorithms
• Category recognition
• Learning
• Large Annotated Datasets available: ImageNet, CIFAR-10, CIFAR-100, COCO
• Start of Video processing
Intersection of Vision and Graphics

[Diagram: computer graphics (rendering, modeling, surface design, animation, user interfaces) and computer vision (shape estimation, motion estimation, motion recognition, 2D modeling / IP) overlap around shared models of shape, light, motion, optics and images.]
Applications of Image Processing
Medical Diagnosis

Digital Mammogram MRI of Knee & Spine

Head CT Scan

Ultrasound
Industrial Applications
• Electronic Defect Detection

Product Testing/QA
Security Applications
Whole Body Scan Vehicle Identification
Biometrics & Finance Applications
Fingerprint Verification Currency verification

Personnel Verification
Seismic Analysis
Mountain Ranges in the Tibetan Plain

Seismic patterns showing oil (natural resources) traps
Satellite Applications

Weather Forecast Aerial Analysis


Space Explorations
Moon surface observation

North Pole observation


Imaging Spectrum

Images can be acquired across a wide energy spectrum


Gamma Ray Imaging-1

• Nuclear Medicine
• Astronomical Observations
Gamma Ray Imaging-2
1. Inject a patient with a radioactive isotope that emits gamma rays as it
decays
2. Images are produced from the emissions collected by gamma ray
detectors

• Positron Emission Tomography (PET)


Imaging in Radio Band
Magnetic Resonance Imaging (MRI)
• Place a patient in a powerful magnet and pass radio waves through his or her body in short pulses.
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software.

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
There are two fundamental ways to obtain digital imagery:

1) acquire remotely sensed imagery in an analog format (often referred to as hard-copy) and then convert it to a digital format through the process of digitization, and

2) acquire remotely sensed imagery already in a digital format, such as that obtained by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) sensor system.
Types of Satellites

Geostationary
Satellites
A geostationary satellite acquires remote sensing data from an altitude of approximately 36,000 kilometres, directly over the equator, so it remains fixed over the same point on the Earth's surface.
Image Source: cimss.ssec.wisc.edu
Types of Satellites

Polar-Orbiting
Satellites
A polar-orbiting satellite passes near the poles on each orbit, so repeated passes cover the whole Earth over time; such orbits are mostly used for Earth observation.

Image Source: globalmicrowave.org


Satellite Data Reception and Processing Chain
Base configuration of a ground-based complex for receiving and processing remote sensing data includes:
• Antenna system (receives signals from the satellite);
• Receiving and processing path (decodes and transforms the received signal to a level suitable for further processing);
• Software (forms requests for recording, plans sessions, and processes and corrects data in accordance with the individual characteristics of the satellite);
• Licenses to operate with specific satellites (providing access to data and the ability to process them)
Servo Systems, Data Receive Chains, Tracking System, Automation System and level ‘0’
Systems, Control Room, Antenna Towers and new Communications Links

IMGEOS Facility at Shadnagar


7.5 Meter Antenna
Ground-based Complex for the Remote Sensing Data
Receiving and Processing provides:

• Forming applications for earth-surface surveys and planning sessions with the spacecraft;
• Reception of data;
• Pre-processing of data, and allocation of target arrays and proprietary information;
• Demodulating, decoding, radiometric correction, filtering, dynamic range conversion, forming overview images and performing other digital pre-processing operations;
• Cataloguing and archiving information and its subsequent transfer to the consumer;
• Geometric image correction and geo-referencing, using data about the parameters of angular and linear motion of the spacecraft (SC) and/or reference points on the ground;
• Licensed access to data received from remote sensing satellites;
• Analysis of the quality of the received images using expert and programmatic methods.
X-band Ground Station
Jensen, 2004
Remote Sensing Raster (Matrix) Data Format

Jensen, 2004
Jensen, 2004
Linear Array CCD

Linear Array CCD Flatbed Digitizer

Jensen, 2004
Area Array CCD

Area Array CCD Image Digitizer

Jensen, 2004
Digitization

Jensen, 2004
Overview of how Digital Remotely Sensed Data
are Transformed into Useful Information

Jensen, 2004
Remote Sensing
System used for
Multispectral and
Hyperspectral Data
Collection

Jensen, 2004
Digital Frame Photography Data Collection
Multispectral Scanner Data Collection
Multispectral Scanner Operation
Linear Array (WhiskBroom) Data Collection
Linear Array (Pushbroom) Data Collection
Burj Khalifa by IKONOS
Area Array Hyperspectral Data Collection
Landsat
Multispectral
Scanner (MSS)
and Landsat
Thematic
Mapper (TM)
Sensor System
Characteristics

Jensen, 2004
Jensen, 2004
Landsat Multispectral Scanning System (MSS)

[Diagram of the Landsat spacecraft: solar array, attitude-control subsystem, attitude measurement sensor, wideband recorder electronics, data collection antenna, Return Beam Vidicon (RBV) cameras (3), Multispectral Scanner (MSS). Jensen, 2004]
Landsat MultiSpectral Scanner
Characteristics
Satellites - Landsat 1, 2, & 3
Launched - July 23, 1972; January 22, 1975; March 5, 1978
Sensor - MultiSpectral Scanner (MSS)
Altitude - 917 km
Coverage - 185 x 185 km

Wavelength Electromagnetic Resolution


(micrometers) Region (meters)
Band 1 0.5-0.6 green 80
Band 2 0.6-0.7 red 80
Band 3 0.7-0.8 near-infrared 80
Band 4 0.8-1.1 near-infrared 80
Inclination of the Landsat Orbit to
Maintain A Sun-synchronous Orbit
N

Jensen, 2004
Landsat 7 Enhanced Thematic Mapper Plus

Jensen, 2004
Landsat Enhanced Thematic Mapper Plus
Satellite - Landsat 7
Launched - April 15, 1999
Sensor - Enhanced Thematic Mapper Plus (ETM+)
Altitude - 705 km
Coverage - 185 x 185 km

Wavelength Electromagnetic Resolution


(micrometers) Region (meters)
Band 1 0.45-0.52 blue 30
Band 2 0.52-0.60 green 30
Band 3 0.63-0.69 red 30
Band 4 0.76-0.90 near-infrared 30
Band 5 1.55-1.75 mid-infrared 30
Band 6 10.42-12.50 thermal 60
Band 7 2.08-2.35 mid-infrared 30
Band 8 0.50-0.90 (pan.) visible to nir 15

http://geo.arc.nasa.gov/sge/landsat/landsat.html
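As a rough cross-check of the table above, a back-of-the-envelope script can estimate the raw size of one ETM+ scene. Assumptions (not from the slide): 8-bit samples, a square 185 x 185 km scene, six 30 m bands (1-5 and 7), one 60 m thermal band and one 15 m panchromatic band; real products differ with framing and overlap.

```python
# Back-of-the-envelope size of one raw ETM+ scene from the table above.
scene_km = 185

def pixels(res_m):
    """Pixels along one side of the scene at a given resolution (metres)."""
    return (scene_km * 1000) // res_m

n30 = pixels(30) ** 2        # one 30 m band: ~6166 x 6166 samples
n60 = pixels(60) ** 2        # thermal band at 60 m
n15 = pixels(15) ** 2        # panchromatic band at 15 m

# Bands 1-5 and 7 at 30 m, band 6 at 60 m, band 8 at 15 m, one byte each.
total_bytes = 6 * n30 + 1 * n60 + 1 * n15
print(total_bytes / 1e9)     # -> ~0.39 GB per scene
```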
Landsat 7

Jensen, 2004
Advanced Very High
Resolution Radiometer
(AVHRR) Imagery

Jensen, 2003
Global Normalized Difference Vegetation Index
(NDVI) Image Produced Using Advanced Very High
Resolution Radiometer (AVHRR) Imagery

Jensen, 2003
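NDVI itself is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red), with values near +1 over dense vegetation. A minimal sketch with numpy; the band values here are made up for illustration, not AVHRR data:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-12)   # tiny epsilon avoids 0/0

# Hypothetical red and near-infrared values for a 2 x 2 neighbourhood.
red = np.array([[30, 40], [25, 90]], dtype=np.uint8)
nir = np.array([[90, 80], [95, 95]], dtype=np.uint8)
print(ndvi(nir, red))   # vegetated pixels give values well above 0
```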
Advanced Very High
Resolution Radiometer
(AVHRR) Imagery

Jensen, 2003
SPOT Characteristics
Satellites - SPOT 1, 2, 3, and 4
Launched - Feb. 22, 1986; Jan. 22, 1990; Sept. 26, 1993;
Mar. 24, 1998
Sensor - High Resolution Visible & panchromatic
Altitude - 822 km
Coverage - 60 x 60 km
Wavelength Electromagnetic Resolution
(micrometers) Region (meters)
Band 1 0.50-0.59 green 20
Band 2 0.61-0.68 red 20
Band 3 0.79-0.89 near-infrared 20
Pan (S1-3) 0.51-0.73 visible to nir 10
Pan (S4) 0.61-0.68 red 10
Band 4 (S4) 1.58-1.75 SWIR 20

http://www.spot.com
SPOT-5 Characteristics
Satellite - SPOT 5
Launched - May 3, 2002
Sensor - High Resolution Visible & panchromatic
Altitude - 830 km
Coverage - 60 x 60 km

Wavelength Electromagnetic Resolution


(micrometers) Region (meters)
Band 1 0.50-0.59 green 10
Band 2 0.61-0.68 red 10
Band 3 0.79-0.89 near-infrared 10
Pan 0.51-0.73 visible to nir 2.5
Pan (XS-mode) 0.50-0.89 visible to nir 5

http://www.spot.com
Comparison of the Detail of
30 x 30 m Landsat TM Band 3 Data
and SPOT 10 x 10 m Panchromatic
Data of Charleston, SC

Courtesy of
SPOT Image, Inc.

Jensen, 2004
Indian Remote Sensing
Satellite (IRS-1D)
Panchromatic Image of
Downtown San Diego,
CA at 5 x 5 m

Jensen, 2004
IKONOS

Space Imaging
IKONOS Characteristics

Launched - September 24, 1999


Altitude - 682 km
Coverage - 11 x 11 km

Wavelength Electromagnetic Resolution


(micrometers) Region (meters)
Band 1 0.45-0.90 (pan.) visible to nir 1
Band 2 0.45-0.52 blue 4
Band 3 0.52-0.60 green 4
Band 4 0.63-0.69 red 4
Band 5 0.76-0.90 near-infrared 4

http://www.spaceimaging.com/
QuickBird

Digital
Globe
QuickBird Characteristics

Launched - October 18, 2001


Altitude - 450 km
Coverage - 16.5 x 16.5 km

Wavelength Electromagnetic Resolution


(micrometers) Region (meters)
Band 1 0.45-0.90 (pan.) visible to nir 0.61 to 0.72
Band 2 0.45-0.52 blue 2.44 to 2.88
Band 3 0.52-0.60 green 2.44 to 2.88
Band 4 0.63-0.69 red 2.44 to 2.88
Band 5 0.76-0.90 near-infrared 2.44 to 2.88

http://www.digitalglobe.com/
QuickBird

Color
2.8 meters
November 3, 2002
QuickBird

CIR
2.8 meters
November 3, 2002
QuickBird

Panchromatic
0.7 meters
November 3, 2002
OrbView-3

OrbImage

Specifications
Earth Observing System - Terra Instruments
ASTER - Advanced Spaceborne Thermal Emission and Reflection Radiometer
CERES - Clouds and the Earth’s Radiant Energy System
MISR - Multi-angle Imaging Spectroradiometer
MODIS - Moderate-resolution Imaging Spectroradiometer
MOPITT - Measurement of Pollution in the Troposphere

Jensen, 2000
Earth Observing System Measurements

Discipline Measurement EOS-AM Instruments

Atmosphere Cloud Properties MODIS, MISR, ASTER


Radiative Energy Fluxes CERES, MODIS, MISR
Precipitation
Tropospheric Chemistry MOPITT
Stratospheric Chemistry
Aerosol Properties MISR, MODIS
Atmospheric Temperature MODIS
Atmospheric Humidity MODIS
Lightning

Jensen, 2000
Earth Observing System Measurements

Discipline Measurement EOS-AM Instruments

Land Land Cover/Land Use Change MODIS, MISR, ASTER


Vegetation Dynamics MODIS, MISR, ASTER
Surface Temperature MODIS, ASTER
Fire Occurrence MODIS, ASTER
Volcanic Effects MODIS, MISR, ASTER
Surface Wetness

Jensen, 2000
Earth Observing System Measurements

Discipline Measurement EOS-AM Instruments

Ocean Surface Temperature MODIS


Phytoplankton MODIS, MISR
Dissolved Organic Matter MODIS, MISR
Surface Wind Fields
Ocean Surface Topography

Cryosphere Land Ice Change ASTER


Sea Ice MODIS, ASTER
Snow Cover MODIS, ASTER

Solar Radiation Total Solar Radiation


Ultraviolet Spectral Irradiance
Earth Observing System - Terra Instruments

MODIS - Moderate-resolution Imaging Spectroradiometer

Spectral Range 0.4 - 14.4 µm
Spectral Coverage ±55°, 2,330 km swath
Spatial Resolution 250 m (2 bands), 500 m (5 bands), 1000 m (29 bands)

ASTER - Advanced Spaceborne Thermal Emission and Reflection Radiometer

Spectral Range VNIR 0.52 - 0.86 µm, SWIR 1.6 - 2.5 µm, TIR 8 - 12 µm
Spatial Resolution 15 m (VNIR: 3 bands)
30 m (SWIR: 6 bands)
90 m (TIR: 5 bands)

Jensen, 2000
Terra ASTER Visible – Near Infrared Bands - 15 meters

(Advanced Spaceborne Thermal Emission and Reflection Radiometer)

Launched - December 18, 1999 http://terra.nasa.gov/


Moderate Resolution Imaging
Spectroradiometer (MODIS) Onboard Terra

First day
global
coverage

2,330 km
swath
width
Remote Sensing
System used for
Multispectral and
Hyperspectral Data
Collection

Jensen, 2004
Jensen, 2004
IRS 1-A IRS P-6 Cartosat 2-C

IRS 1-A

IRS P-6 (Resourcesat-1)


Cartosat 2-C
IRS Series of Satellites
IRS Satellites
Cartosat/Oceansat Satellites
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software.

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
Storage and Retrieval
SAN Components
IMGEOS Project (Integrated Multi-mission Ground Segment for Earth Observation Satellites)
The IMGEOS project eliminates manual procedures to a large extent
– delivering products to the end user in the least amount of time after data is acquired from the satellite, for emergency products.
IMGEOS
– consolidation
▪ data acquisition
▪ processing
▪ dissemination

© Copyright 2012 EMC Corporation. All rights reserved.


Project IMGEOS …. Contd.
To achieve the objective, it requires
Integration of
– different servers/workstations
– ancillary data processing,
– value addition
– quality checking
– High Performance SAN
– Additional workstations for photo-processing,
– Post Process media generation
– Dissemination of information using
Internet/media.



IMGEOS Architecture
IMGEOS Architecture

[Diagram: the Data Processing Group and Data Ingest Division connect through a SAN (2 x MDS9509 SAN switches) to meta-data servers in a high-availability configuration, two EMC CLARiiON CX4-480 arrays (high-performance, scalable), medium-performance storage (400 TB) and an intelligent tape library.]


StorNext DLC Clients Across WAN



IMGEOS – Infrastructure
Symmetrix – 100 TB storage
– 32 front-end and 32 back-end ports

CLARiiON CX4-480C – 200 TB – 2 nos.
– 8 front-end and 8 back-end ports

Cisco MDS 9509 Enterprise Director – 2 nos.
– 144 ports each

StorNext file system
– 6 PB StorNext file system, 3 PB Vault licence
– HSM for movement between high-performance storage, archive and tape library

Control Center for Symmetrix, CLARiiON; SMI-S agent for tape library



IMGEOS - Data Importance and Performance
Satellite images are generated once and capture a vast amount of information.
The same data is used by multiple departments for different processing.
Old data is used for reference and data processing.
Data should be preserved for the long term (>20 yrs.).
Data should be accessed at the highest speeds when required.
Leverage Information Lifecycle Management and HSM.
Data is most active within 30 days of generation.



IMGEOS Work Flow

Start of process: data arrives at the Data Reception Facility.
1. Data is ingested to high-performance FC SAN storage (online); a vault copy is written to tape and sent offsite, and an archive copy is written to tape.
2. While data is 66 days old or less (assume 1.5 TB/day), it stays on high-performance storage; older data moves to medium-performance storage.
3. While data is 266 days old or less, it stays on medium-performance storage; older data is deleted from disk storage (the tape copies remain).
End of Process
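The age-based tiering in this workflow can be sketched as a small policy function. The function name and tier labels are hypothetical; the 66-day and 266-day thresholds and the 1.5 TB/day rate are from the slide.

```python
def storage_tier(age_days):
    """Return where a scene lives under the slide's age thresholds."""
    if age_days <= 66:
        return "high-performance FC SAN"
    if age_days <= 266:
        return "medium-performance storage"
    return "tape only"   # disk copy deleted; vault and archive tapes remain

# At ~1.5 TB/day, the high-performance pool must hold about 66 * 1.5 = 99 TB,
# consistent with the ~100 TB Symmetrix figure quoted elsewhere in the deck.
print(storage_tier(10), storage_tier(100), storage_tier(400))
```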



Why Symmetrix and StorNext
Symmetrix offers :
– High Performance SAN
– Very High availability
– Striping of data across multiple backend ports,
disks, disk Enclosures to offer high performance.
StorNext file system offers
– High Performance file system
– High Speed pool of Disks
– Allows access to files over a LAN or SAN
– Expands pools across multiple tiers, including tape



Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software.

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
Image file formats
 BSQ (Band Sequential Format):
 each line of the data followed immediately by the next line in the same
spectral band. This format is optimal for spatial (X, Y) access of any part of a
single spectral band. Good for multispectral images

 BIP (Band Interleaved by Pixel Format):


 the first pixel for all bands in sequential order, followed by the second pixel for
all bands, followed by the third pixel for all bands, etc., interleaved up to the
number of pixels. This format provides optimum performance for spectral (Z)
access of the image data. Good for hyperspectral images

 BIL (Band Interleaved by Line Format):


 the first line of the first band followed by the first line of the second band,
followed by the first line of the third band, interleaved up to the number of
bands. Subsequent lines for each band are interleaved in similar fashion. This
format provides a compromise in performance between spatial and spectral
processing and is the recommended file format for most ENVI processing
tasks. Good for images with 20-60 bands
Band 1             Band 2               Band 3                Band 4
10 15 17 20 21     20  50  50  90  90   120 150 100 120 103   210 250 250 190 245
15 16 18 21 23     76  66  55  45 120   176 166 155  85 150   156 166 155 415 220
17 18 20 22 22     80  80  60  70 150    85  80  70  77 135   180 180 160 170 200
18 20 22 24 25    100  93  97 101 105   103  90  70 120 133   200   0 123 222 215

Matrix notation for band 2 (column, row, band):

1,1,2 2,1,2 3,1,2 4,1,2 5,1,2
1,2,2 2,2,2 3,2,2 4,2,2 5,2,2
1,3,2 2,3,2 3,3,2 4,3,2 5,3,2
1,4,2 2,4,2 3,4,2 4,4,2 5,4,2

BIL (each image line, band by band):
10 15 17 20 21 | 20 50 50 90 90 | 120 150 100 120 103 | 210 250 250 190 245
15 16 18 21 23 | 76 66 55 45 120 | 176 166 155 85 150 | 156 166 155 415 220
17 18 20 22 22 | 80 80 60 70 150 | 85 80 70 77 135 | 180 180 160 170 200
18 20 22 24 25 | 100 93 97 101 105 | 103 90 70 120 133 | 200 0 123 222 215

BSQ (each band in full, one after another):
10 15 17 20 21 15 16 18 21 23 17 18 20 22 22 18 20 22 24 25
20 50 50 90 90 76 66 55 45 120 80 80 60 70 150 100 93 97 101 105
120 150 100 120 103 176 166 155 85 150 85 80 70 77 135 103 90 70 120 133
210 250 250 190 245 156 166 155 415 220 180 180 160 170 200 200 0 123 222 215

BIP (each pixel, band by band):
10 20 120 210 | 15 50 150 250 | 17 50 100 250 | 20 90 120 190 | 21 90 103 245
15 76 176 156 | 16 66 166 166 | 18 55 155 155 | 21 45 85 415 | 23 120 150 220
17 80 85 180 | 18 80 80 180 | 20 60 70 160 | 22 70 77 170 | 22 150 135 200
18 100 103 200 | 20 93 90 0 | 22 97 70 123 | 24 101 120 222 | 25 105 133 215
 Band sequential (BSQ) format stores information for the image one band at a time. In other words, data for all pixels for band 1 is stored first, then data for all pixels for band 2, and so on.

 Value = image(c, r, b)   column, row, band

 Band interleaved by pixel (BIP) data is similar to BIL data, except that the data for each pixel is written band by band. For example, with the same three-band image, the data for bands 1, 2 and 3 are written for the first pixel in column 1; the data for bands 1, 2 and 3 are written for the first pixel in column 2; and so on.

 Value = image(b, c, r)   band, column, row

 Band interleaved by line (BIL) data stores pixel information band by band for each line, or row, of the image. For example, given a three-band image, all three bands of data are written for row 1, all three bands of data are written for row 2, and so on, until the total number of rows in the image is reached.

 Value = image(c, b, r)   column, band, row
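The three interleavings are just axis permutations of one and the same sample array; a sketch with numpy on a toy 2-band, 2-row, 3-column image (the values 0-11 are hypothetical, chosen so the orderings are easy to read):

```python
import numpy as np

# Band-sequential layout: axis order (band, row, col).
bsq = np.arange(12).reshape(2, 2, 3)

bil = bsq.transpose(1, 0, 2)   # (row, band, col): band-interleaved by line
bip = bsq.transpose(1, 2, 0)   # (row, col, band): band-interleaved by pixel

# All three hold the same samples; only the on-disk byte order differs.
print(bsq.ravel())
print(bil.ravel())
print(bip.ravel())
```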


NRSC SATELLITE DATA FORMATS

• LGSOWG (Landsat Ground Station Operator Working


Group) - BIL, BSQ
• Fast format
• TIFF (Tagged Image File Format)
• GEOTIFF (GEOgraphic TIFF)
• LEVEL 1B and KLM (NOAA-K(15), NOAA-L(16), NOAA-M(17)) for NOAA
• HDF 4 & HDF 5 (Hierarchical Data Formats)
• NetCDF (Network Common Data Form) for ROSA,
SARAL and Megha Tropiques
NASA Data Formats

• HDF 5
• HDF EOS 5
• NetCDF Classic
• NetCDF-4/HDF5 File Format
• OGC KML( Keyhole Markup Language)
• ASCII File Format Guidelines for Earth Science Data
What is TIFF
• TIFF (Tagged Image File Format), is Aldus-
Adobe’s Public domain tag-based file format
for storing and interchanging raster images.
• TIFF is a rich format for raster image data
from sources such as:
– remotely sensed imagery
– scanners
– painting, drawing, and rendering output from CAD
– Results of geographic analysis.
TIFF File Structure
The TIFF format has a three-level
hierarchy. From highest to lowest,
the levels are:

1) A file Header.
2) One or more directories called
IFDs (Image File Directories),
containing codes and their data, or
pointer to the data.
3) Data.
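The 8-byte file header at the top of this hierarchy can be parsed with nothing but the standard library. The layout below (2-byte order mark "II"/"MM", 16-bit magic number 42, 32-bit offset of the first IFD) is from the TIFF specification; the function name and sample bytes are illustrative:

```python
import struct

def tiff_header(raw):
    """Parse the 8-byte TIFF file header.

    Returns (byte_order_mark, magic_number, first_ifd_offset)."""
    order = {b"II": "<", b"MM": ">"}[raw[:2]]      # little- vs big-endian
    magic, ifd_offset = struct.unpack(order + "HI", raw[2:8])
    return raw[:2].decode(), magic, ifd_offset

# A hand-built little-endian header: magic number 42, first IFD at byte 8.
header = b"II" + struct.pack("<HI", 42, 8)
print(tiff_header(header))   # -> ('II', 42, 8)
```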
GeoTIFF

The GeoTIFF spec. defines a set of


TIFF tags provided to describe all
"Cartographic" information
associated with TIFF imagery that
originates from satellite imaging
systems, scanned aerial
photography, scanned maps, digital
elevation models, or as a result of
geographic analysis.
Features of GeoTIFF
• GeoTIFF fully complies with the TIFF
specifications
• GeoTIFF uses a set of reserved TIFF tags
to store a broad range of georeferencing
information, catering to geographic as
well as projected coordinate systems
needs.
• Mechanism for adding further
international projections, datums and
ellipsoids has been established
HDF
• HDF stands for Hierarchical Data Format. It is a library and
multi-object file format for the transfer of graphical and
numerical data between machines.
:
• It is versatile. HDF supports several different data models.
Each data model defines a specific aggregate data type and
provides an API for reading, writing, and organizing data and
metadata of the corresponding type. Data models supported
include multidimensional arrays, raster images, and tables.

• It is portable. HDF files can be shared across most common


platforms, including many workstations and high performance
computers.
HDF 4 vs HDF 5

• Data Model – HDF4 has many objects and lacks a clear object model; the HDF5 model is "simpler" in the sense that it has fewer objects and a consistent object model throughout.
• Object Model – HDF4 objects are simpler, requiring less programming to accomplish simple tasks; HDF5 is very powerful, but usually requires the user program to handle many routine details.
• Objects – the HDF4 Data Model has eight basic objects; the HDF5 Data Model has two primary objects.
• API – HDF4 has interfaces for the six types of objects supported, which allow a user program to perform common operations in a few calls; equivalent operations may require many more calls to HDF5, even for very simple tasks.
LGSOWG (Super Structure)

• LGSOWG (Landsat Ground Station Operators Working Group) format is applicable to all types of products (RAD and SYSTEM corrected).

• Video data is provided in BIL or BSQ formats.

• This format is most suitable for radiometrically corrected (RAD) products.

• It gives calibration information, ephemeris and attitude (roll, pitch, yaw) information which can be used for further levels of processing.
LGSOWG DIGITAL DATA FORMAT

•VOLUME DIRECTORY FILE


•LEADER FILE
•IMAGE FILE(S)
•TRAILER FILE
•NULL VOLUME DIRECTORY FILE
Fast Format
Header File
Administrative Record
Radiometric Record
Geometric Record
Image File
Image Data stored in BSQ Format
NetCDF Format
1. Network Common Data Form
2. Mainly used for non-imaging sensor data sets
3. Specially useful for array-oriented scientific data sets
4. Machine-independent format
5. Specific libraries are available in all programming languages to develop software for reading / writing the format
Structure of data product
• CDINFO file
• Product1 folder - All files will be in this folder

- 132849100101.HDF
- 132849100101.JPG
- 132849100101.META
Sample CDINFO file
• PRODUCT 1 :
• Product Number : 132849100101
• Satellite ID : O2
• Sensor : OCM
• Path-Row : 009-013
• Date of Acquisition : 27FEB2013
• Product Code : STLCCL2HJ
• Orbit Number : 2278
• Scan Lines : 6026
• Pixels : 3730
• Bytes Per Pixel : 4
• Image Record Length (Bytes) : 14920
• No of Volume : 1/1
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
Compression
Lossless compression
Lossy compression
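Run-length encoding is one of the simplest lossless schemes and illustrates the distinction: the decoder restores every sample exactly, whereas a lossy scheme would not. A minimal sketch (the function names and sample row are hypothetical):

```python
def rle_encode(samples):
    """Run-length encode a sequence of pixel values (a simple lossless scheme)."""
    runs = []
    for v in samples:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new (value, count) run
    return runs

def rle_decode(runs):
    return [v for v, n in runs for _ in range(n)]

row = [120, 120, 120, 120, 45, 45, 200]
encoded = rle_encode(row)
assert rle_decode(encoded) == row     # lossless: decoding restores every sample
print(encoded)                        # -> [[120, 4], [45, 2], [200, 1]]
```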
Module-1 Digital Image Processing
Digital Data:
• Introduction
• Satellite data acquisition –Storage and retrieval – Data
Formats
• Compression
• Satellite System – Data products – Image processing
hardware and software

Image Rectification and Restoration:


• Geometric correction
• Radiometric correction
• Noise removal
Satellite Data Products
High Resolution (5.8 m and better)
• Cartosat-2 (1m) System Corrected Georeferenced/ Orthokit
• Cartosat-1 (2.5m) Geo/ Orthokit, Stereo Orthokit, DEM
• RISAT (1-8m) Standard Full Scene
• Resourcesat-1&2 LISS-IV FMX, SMX (5.8m) Geo/ Orthokit, Orthocorrected

Medium Resolution (5.8 m to 56 m)
• Resourcesat-1&2 LISS-III (23.5m) Geo/ Ortho
• Resourcesat-1&2 AWiFS (56m) Quadrant Geo/ Ortho

Low Resolution (360 m and coarser)
• Oceansat-1&2 OCM (360m) Full Scene, Geophysical
Satellite Ocean Data and Data Products
Include those from satellite sensors that measure
• SST (Sea Surface Temperature)
• Ocean color
• Ocean winds, and
• Sea surface height.
Image Processing System Considerations

A digital image processing system consists of the computer hardware and the image processing software necessary to analyze digital image data.

Consideration: Interactive versus Batch Processing
* Quality of the Interface
Jensen, 2003
ENVI Interface

Jensen, 2003
ERDAS Interface

Jensen, 2003
Image Processing System
Hardware /Software Considerations
* Central Processing Unit
* Operating System and Applications Software
* Random Access Memory (RAM)
* Mass Storage (hard disk, CD, DVD)
* Arithmetic Co-processor
* Image Processing Memory (Graphics card)
- CRT Screen display resolution
- CRT screen color resolution
* Serial versus Parallel Processing
Jensen, 2003
Moore’s Law

Jensen, 2003
Image Processing System Considerations

Type of Computer:
* Mainframe (> 32-bit CPU)
* Workstation (> 32-bit CPU)
* Personal Computers (16 to 32-bit CPU)
Jensen, 2003
Typical Digital Image
Processing Laboratory

Jensen, 2003
Remote Sensing Data Formats
* Band interleaved by line (BIL)
* Band interleaved by pixel (BIP)
* Band sequential (BSQ)
* Run-length encoding
Jensen, 2003
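The three interleavings store the same samples in different orders. A minimal NumPy sketch (a hypothetical 3-band, 2-row, 4-column scene, not any real product) shows how the axis order distinguishes them:

```python
import numpy as np

# Band sequential (BSQ): all of band 1, then band 2, then band 3.
bsq = np.arange(3 * 2 * 4).reshape(3, 2, 4)   # (bands, rows, cols)

# Band interleaved by line (BIL): for each row, one line per band.
bil = bsq.transpose(1, 0, 2)                   # (rows, bands, cols)

# Band interleaved by pixel (BIP): all band values of a pixel together.
bip = bsq.transpose(1, 2, 0)                   # (rows, cols, bands)

# All three hold the same samples; only the order on disk differs.
assert bip[0, 1, 2] == bsq[2, 0, 1]
```

Writing any of these arrays to disk in C order produces the corresponding file layout.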
Image Processing System
Hardware /Software Considerations
* Central Processing Unit (CPU)
* Random Access Memory (RAM)
* Mass Storage (hard disk, CD, DVD)
* Arithmetic Co-processor
* Image Processing Memory (Graphics card)
- CRT Screen display resolution
- CRT screen color resolution
* Serial versus Parallel Processing
* Storage and archiving capability Jensen, 2003
Serial Versus Parallel Processing
* Requires more than one CPU
* Requires software that can parse (distribute) the
digital image processing to the various CPUs by
- task, and/or
- line, and/or
- column.
Jensen, 2003
Remote Sensing Data
Storing and Archiving
Considerations

* Type of Media
* Access to a read-write mechanism that works.

Jensen, 2003
Components of an Image Processing
System
Longevity

Jensen, 2003
Image Processing System Functions
* Preprocessing (Radiometric and Geometric)
* Display and Enhancement
* Information Extraction
* Image Lineage
* Image and Map Cartographic Composition
* Geographic Information Systems (GIS)
* Integrated Image Processing and GIS
* Utilities

Jensen, 2003
The Major Commercial
Digital Image Processing Systems
- ERDAS
- ENVI
- IDRISI
- ERMapper
- PCI
- Numerous photogrammetric packages
Jensen, 2003
The Major Public
Digital Image Processing Systems
- JPL VICAR-IBIS
- MultiSpec
- C-Coast
Jensen, 2003
Image Processing Software
 ERDAS IMAGINE Arc/Info live link, no conversion needed
 PCI EASI/PACE Arc/Info GeoGateway for multiple formats
 Arc/Info GRID various basic raster formats: tif, sun, gis, lan, img, bil, bip, bsq, grass, adrg, rlc
 ArcView ERDAS lan, img, grid, tif
 ENVI/IDL imports shapefiles, e00, dxf, USGS SDTS, dlg; exports ArcView grid; uses own vector format
 ERMAPPER various raster formats, import of dxf and SeisWorks; uses own vector format
 MATLAB Image Processing Toolbox
 OpenCV (Intel) img, tif, jpg, jp2 (JPEG 2000), png, pgm
 Scikit-image
 Scilab
 other packages: CVIPtools, IDRISI, ILWIS, OpenJ, GraphicsMagick, blippAR...
Key Stages in Digital Image Processing

Starting from the problem domain:
• Image Acquisition
• Image Enhancement
• Image Restoration
• Colour Image Processing
• Image Compression
• Morphological Processing
• Segmentation
• Representation & Description
• Object Recognition
Geometric Distortions
• Satellite related
• Sensor related
• Earth related
Geometric Corrections
• Change the location of pixels within the image
• Geometric transforms: scaling, translation and rotation
• Resampling techniques: nearest neighbour, bilinear interpolation, cubic convolution, Kaiser 16, etc.
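A geometric transform and a resampling technique work together: each output pixel is mapped back into the input image and a value is picked there. A minimal sketch of rotation with nearest-neighbour resampling (illustrative only; operational systems use the kernels listed above):

```python
import math

def rotate_nearest(img, angle_deg):
    """Rotate a 2-D list of pixel values about its centre,
    filling each output pixel from its nearest input neighbour."""
    rows, cols = len(img), len(img[0])
    cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0
    a = math.radians(angle_deg)
    out = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            # Inverse mapping: where in the input does (x, y) come from?
            u = math.cos(a) * (x - cx) + math.sin(a) * (y - cy) + cx
            v = -math.sin(a) * (x - cx) + math.cos(a) * (y - cy) + cy
            iu, iv = int(round(u)), int(round(v))
            if 0 <= iv < rows and 0 <= iu < cols:
                out[y][x] = img[iv][iu]   # nearest neighbour: copy, no blending
    return out

grid = [[1, 2], [3, 4]]
assert rotate_nearest(grid, 0) == grid    # identity rotation
```

Because nearest neighbour only copies existing values, the output contains no new pixel vectors, which is why it is preferred before classification.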
Earth Model
• Earth rotational effect
• Panoramic effect
• Earth curvature effect

Laboratory for Remote Sensing Hydrology and Spatial Modeling, Dept of Bioenvironmental Systems Engineering, National Taiwan Univ. (3/5/2020)
Earth Rotational Effect
• The earth rotates at a constant angular velocity ωe. While the satellite is moving along its orbit and scanning orthogonal to it, the earth is moving underneath from west to east.
• Since satellites such as Landsat and SPOT have an orbit inclination angle i (about 99°) in order to achieve the desired revisit period and sun-synchronism, the earth rotation is not parallel to the cross-track scans.
L = length of the image frame
ω0 = angular velocity of the satellite
ωe = Earth rotational (angular) velocity
ve = Earth surface velocity
φ = latitude of the target pixel
re = Earth radius (6.378 × 10^6 m = 6378 km)
ta = image acquisition time
Δ = offset distance on the earth surface
i = orbit inclination angle
Earth Rotation
• Landsat MSS:
  – Standard image is 185 km square and consists of 2340 scan lines
  – Distortion of the whole image is about 10 km, which is about 4.6 m per scan line
Image corrected for Earth Rotation
ta = L / (re ω0)
ve = (re cos φ) ωe

Δ = ta ve = [L / (re ω0)] (re cos φ) ωe = (ωe / ω0) L cos φ

• The adjusted offset distance is
Δ' = Δ cos(i − 90°)

For Landsat or SPOT:
Δ' = Δ cos(9°) = 0.98769 Δ
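The earth-rotation offset for the Landsat case can be sanity-checked numerically. A minimal sketch, assuming nominal values (standard Earth rotation rate, an early-Landsat orbital period of about 103 min) that are illustrative and not taken from this module:

```python
import math

# Offset of the last scan line relative to the first: Δ = (ωe/ω0)·L·cos φ,
# reduced by cos(i − 90°) ≈ cos(9°) for the near-polar orbit inclination.
# The orbital period and latitude below are assumed nominal values.
omega_e = 7.2921e-5                 # Earth rotation rate, rad/s
period_s = 103 * 60                 # assumed orbital period, ~103 min
omega_0 = 2 * math.pi / period_s    # satellite angular velocity, rad/s
L_km = 185.0                        # frame length, km
lat = math.radians(40.0)            # assumed target latitude

delta = (omega_e / omega_0) * L_km * math.cos(lat)
delta_adj = delta * math.cos(math.radians(9.0))
# delta_adj comes out near the ~10 km whole-image skew quoted for Landsat MSS
```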

Panoramic Effect
• For scanners used on spacecraft and aircraft remote sensing platforms the angular IFOV is constant. As a result the effective pixel size on the ground is larger at the extremities of the scan than at nadir.
• By placing the pixels on a uniform display grid the image will suffer an across-track compression.

Whisk-broom Sensor Panoramic Distortion (Horizontal Surface)

[Figure: across-track distortion]
Panoramic distortion: Earth curvature

pc = β [h + re (1 − cos φ)] sec θ sec(θ + φ)
As φ → 0: pc → β h sec²θ

• NOAA AVHRR
  – pixels 2.89 times bigger at the swath edge if the earth were flat
  – pixels 4.94 times bigger for the real curved earth
• The cross-track displacement of a pixel can be determined by calculating the compression ratio

arc(SN) / TN = (h θ) / (h tan θ) = θ / tan θ

• The displacement for the pixel at the swath edge is

(Swath width / 2) × [1 − (θ / tan θ)max]

• In the case of Landsat 1, 2, and 3, (θ / tan θ)max = 0.9966. This indicates that a pixel at the swath edge (92.5 km from the sub-nadir point) will be 314 m out of position along the scan line compared with the ground, if the pixel at nadir is in its correct location.
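The Landsat numbers above can be reproduced directly. A minimal sketch, assuming a half-scan angle of about 5.78° (an assumed nominal value chosen because it reproduces the 0.9966 ratio quoted above):

```python
import math

# Ratio θ/tan θ at the swath edge for Landsat 1-3.
theta = math.radians(5.78)                 # assumed half-scan angle
ratio = theta / math.tan(theta)            # ≈ 0.9966

half_swath_m = 92_500                      # swath-edge distance from nadir
displacement = half_swath_m * (1 - ratio)  # ≈ 314 m along the scan line
```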

Along track distortion

Scan nonlinearity

[Figure: scan angle vs. time]
• Landsat MSS: up to 395 m, or 5 pixels

Non-square pixels

[Figure: one pixel distorted by spacecraft motion during the scan]
Correcting geometric distortions

Map plane (x, y): we want new pixels here.
Image plane (u, v): we have pixels here.

For each new pixel (x, y) in the map plane we find the equivalent point (u, v) in the image plane. We need to know the functions u(x, y), v(x, y), and we can get these in two ways:

• Assume a functional form and fit parameters
• Understand the nature of the distortion
Interpolating

• Nearest neighbour
  – Use the value from the pixel nearest to (u, v), i.e. pixel f in the figure
  – Simple and fast
  – New pixel vectors were already in the old image, so better for classification
• Linear interpolation
  – Use the four pixels e, f, g, h surrounding (u, v)
  – Interpolate between e and g → p, and between f and h → q, then between p and q
  – Smoother result
• Bicubic splines
  – Uses all 16 pixels shown in the figure
  – Even smoother result, but more complex
  – May generate spurious bright and dark pixels
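The linear (bilinear) scheme above can be sketched directly, with e, f, g, h as the four neighbours of the fractional position (u, v). A minimal illustration, not tied to any particular package:

```python
def bilinear(img, u, v):
    """Interpolate img (2-D list, img[row][col]) at fractional
    column u and row v using the four surrounding pixels."""
    c0, r0 = int(u), int(v)            # top-left neighbour (pixel e)
    du, dv = u - c0, v - r0            # fractional offsets
    e, f = img[r0][c0], img[r0][c0 + 1]
    g, h = img[r0 + 1][c0], img[r0 + 1][c0 + 1]
    p = e + (g - e) * dv               # interpolate e..g down the column
    q = f + (h - f) * dv               # interpolate f..h down the column
    return p + (q - p) * du            # then between p and q across

img = [[0, 10],
       [20, 30]]
assert bilinear(img, 0.5, 0.5) == 15.0   # centre of the four pixels
```

Unlike nearest neighbour, the result is generally a value that did not occur in the original image.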
Mapping functions
There are two approaches to the mapping functions.

• Understand their form
  – Using the details discussed earlier
  – Good if mapping functions are simple
  – Often done by suppliers before you get the image
• Approximate them with polynomials
  – Coefficients obtained by fitting to control points
  – Useful where many distortion effects are combined, so the mapping functions are very complex
  – Limitations if distortion is very strong
  – Can be used to register one image to another
Mapping Polynomials

u = a0 + a1x + a2y + a3xy + a4x² + a5y²
v = b0 + b1x + b2y + b3xy + b4x² + b5y²

We find the a and b coefficients by choosing a number of points which we can identify in both the map and the image. These are called CONTROL POINTS.

At each control point we know u, v, x and y. The first equation above gives one equation with 6 unknown a coefficients. If we have 6 control points, we have 6 equations with 6 unknown a coefficients.
Mapping Polynomials

u = a0 + a1x + a2y + a3xy + a4x² + a5y²
v = b0 + b1x + b2y + b3xy + b4x² + b5y²

Usually we choose more than 6 control points and use a least squares solution. The second equation gives another set of equations for the b coefficients, which are solved exactly like the a set.
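The least-squares fit of the a coefficients can be sketched with NumPy. The control-point coordinates below are synthetic, generated from a known mapping so the fit can be checked:

```python
import numpy as np

def fit_mapping(x, y, u):
    """Least-squares fit of u = a0 + a1*x + a2*y + a3*x*y + a4*x^2 + a5*y^2
    from control-point coordinates. Needs at least 6 control points."""
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    a, *_ = np.linalg.lstsq(A, u, rcond=None)
    return a

# Seven synthetic control points whose true mapping is u = 2 + 3x + 0.5y
x = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 3.0])
u = 2 + 3 * x + 0.5 * y

a = fit_mapping(x, y, u)
assert np.allclose(a[:3], [2.0, 3.0, 0.5])    # recovers the true terms
assert np.allclose(a[3:], 0.0, atol=1e-8)     # higher-order terms vanish
```

The b coefficients are fitted the same way with v in place of u; with more than 6 points the fit averages out control-point measurement error.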
Geometric Rectification
Radiometric Correction
Correction is made on the brightness (gray level) values of the image.
Sources of errors to be corrected:
• atmospheric degradation
• sensor malfunctions
• illumination-view geometry

Corrections are usually different for each band, and in theory for each pixel.
Attempts to correct data may themselves introduce errors.

Campbell 10.4
Radiometric Corrections
1. Correction for detector errors
• Line drop
• Destriping
(both random and periodic)
2. Sun angle correction
– for comparison and mosaic images acquired from different
time of the year
3. Atmospheric corrections
• Histogram adjustment
• Atmospheric radiative transfer models
4. Conversion from DN to radiance
5. Conversion from radiance to reflectance
6. BRDF (Bidirectional Reflectance Distribution Function) corrections
Sensor corrections
Line Dropout

43 47 51 57
40 46 50 54
 0  0  0  0
38 40 42 50

Solution: replace the dropped line with the mean of the pixels in the lines above and below

43 47 51 57
40 46 50 54
39 43 46 52
38 40 42 50

Or use another spectral band.

Images: Lillesand-Kiefer
Campbell 10.4
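The neighbour-averaging repair can be sketched in a few lines, using the worked example above as the test case:

```python
def fix_line_dropout(img, bad_row):
    """Replace a dropped scan line with the mean of the lines
    above and below it (integer DNs, as in the example above)."""
    above, below = img[bad_row - 1], img[bad_row + 1]
    img[bad_row] = [(a + b) // 2 for a, b in zip(above, below)]
    return img

scene = [[43, 47, 51, 57],
         [40, 46, 50, 54],
         [ 0,  0,  0,  0],   # dropped line: all zeros
         [38, 40, 42, 50]]
fix_line_dropout(scene, 2)
assert scene[2] == [39, 43, 46, 52]   # matches the worked example
```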
Sensor corrections
Striping
• Local averaging
• Normalization

Images: Lillesand-Kiefer
Campbell 10.4
Errors: Sensor Failure & Calibration
Sensor problems show as striping or missing lines of data.
Missing data due to sensor failure results in a line of DN values: every 16th line for TM data, as there are 16 sensors for each band, scanning 16 lines at a time (or every 6th line for MSS).

[Figures: MSS 6-line banding – raw scan; TM data – 16-line banding; MSS 6-line banding – georectified; sample DNs, shaded DNs are higher]
Radiometrically corrected images

[Figures: input image / corrected image]

[Figures: pre-anomaly; post-anomaly (error first noticed during May 2003); post-anomaly after application of the correction algorithm]
Sun angle correction
• The position of the sun relative to the earth changes depending on the time of day and the day of the year
• Solar elevation angle: time- and location-dependent
• In the northern hemisphere the solar elevation angle is smaller in winter than in summer
• The solar zenith angle is equal to 90 degrees minus the solar elevation angle
• Irradiance varies with the seasonal changes in solar elevation angle and the changing distance between the earth and sun
Sun angle correction
• An absolute correction involves dividing the DN value in the image data by the sine of the solar elevation angle
• The size of the angle is given in the header of the image data
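The division by the sine of the solar elevation angle can be sketched directly. The two elevations used here (37° and 42°) are the ones in the Landsat 7 example on this page; the DN values are simulated for illustration:

```python
import math

def sun_angle_correct(dn, sun_elevation_deg):
    """Normalize a DN to overhead sun by dividing by sin(elevation)."""
    return dn / math.sin(math.radians(sun_elevation_deg))

# Two simulated acquisitions of the same unchanged target at sun
# elevations of 37° and 42° should agree after correction.
dn_a = 100 * math.sin(math.radians(37))
dn_b = 100 * math.sin(math.radians(42))
assert abs(sun_angle_correct(dn_a, 37) - sun_angle_correct(dn_b, 42)) < 1e-9
```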
Sun angle correction

Landsat 7 ETM+ color infrared composites acquired with different sun angles.
(A) The left image was acquired with a sun elevation of 37° and (B) the right image with a sun elevation of 42°. The difference in reflectance is clearly shown.
(C) The left image was corrected to match the right image.
Spectral Irradiance & Earth-Sun Distance

An astronomical unit is equivalent to the mean distance between the earth and the sun, approximately 149.6 × 10^6 km.
The irradiance from the sun decreases as the square of the earth-sun distance.
Radiometric Correction

[Figure, built up in stages:]
• Direct Illumination
• Scattered Illumination
• Adjacent Reflections
• Path Radiance

Irradiance stems from two sources: (1) directly reflected sunlight and (2) diffuse skylight, which is sunlight that has been previously scattered by the atmosphere.
Radiometric Correction
Atmospheric Corrections

1) Histogram adjustment
• Clear sky
• Hazy sky

2) Physical Models

Campbell 10.4
Atmospheric Interference: clouds
Clouds affect all visible and IR bands, hiding features twice: once with the cloud, once with its shadow. We CANNOT eliminate clouds, although we might be able to assemble cloud-free parts of several overlapping scenes (if illumination is similar), and correct for cloud shadows (advanced).

[Only in the microwave can energy penetrate through clouds.]
Simple Atmospheric Corrections – Histogram Adjustment
Clear Atmosphere

Cloud-shadowed regions and water bodies have very low reflectance in infrared bands. This should give a peak near zero on the histogram: a narrow range of brightness values, with only a small atmospheric contribution to brightness.

A shifted peak is due to these low-reflectance regions plus atmospheric scattering: the darkest values are no longer at zero. A correction can be obtained by subtracting this offset value from all pixels.

This method is called the Histogram Minimum Method (HMM).

Campbell 10.4
Simple Atmospheric Corrections – Histogram Adjustment
Hazy Atmosphere

A wide range of brightness values, with added brightness from the atmosphere. In this case the minimum value is higher, and the histogram shape has changed: the darkest values are far from zero.

Campbell 10.4
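The Histogram Minimum Method amounts to subtracting the band minimum from every pixel. A minimal sketch with a hypothetical tiny band (real implementations work per band on full scenes):

```python
def histogram_minimum_correction(band):
    """Subtract the band's minimum DN from every pixel (HMM):
    the darkest pixels are assumed to owe all their brightness
    to atmospheric path radiance."""
    offset = min(min(row) for row in band)
    return [[dn - offset for dn in row] for row in band]

hazy_band = [[12, 30, 55],
             [14, 12, 80]]           # minimum DN of 12 = haze offset
corrected = histogram_minimum_correction(hazy_band)
assert min(min(r) for r in corrected) == 0
assert corrected[0] == [0, 18, 43]
```

Because scattering is wavelength dependent, the offset must be estimated separately for each band.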
Atmospheric Correction Models
Physical models simulate the physical process of scattering at the level of individual particles and molecules:
• absorption by gases
• scattering by aerosols

Examples: LOWTRAN 7, MODTRAN, CAM5S, 6S

These are complex models that need many meteorological data as input, and the data may not always be available.

Campbell 10.4
Atmospheric Effects

Lapp = ρTE/π + Lp

Lapp = apparent radiance measured by the sensor
ρ = reflectance of the object
T = atmospheric transmittance
E = irradiance on the object (incoming)
Lp = path radiance (haze), from the atmosphere and not from the object
Radiometric Correction
Rayleigh Scattering
• Caused by particles much smaller than a wavelength
• Declines with the fourth power of the wavelength
• Responsible for blue skies and red sunsets
• Key element in radiometric correction of images
Haze Reduction
• Aerial and satellite images often contain haze. The presence of haze reduces image contrast and makes visual examination of images difficult.
• Due to Rayleigh scattering
  – the particle size responsible for the effect is smaller than the radiation's wavelength (e.g. oxygen and nitrogen)
• Haze has an additive effect resulting in higher DN values
• Scattering is wavelength dependent: more pronounced in shorter wavelengths and negligible in the NIR
Haze Reduction
• One means of haze compensation in
multispectral data is to observe the radiance
recorded over target areas of zero reflectance
• For example, the reflectance of deep clear
water is zero in NIR region of the spectrum
• Therefore, any signal observed over such an
area represents the path radiance
• This value can be subtracted from all the
pixels in that band
Haze Reduction

Lapp = apparent radiance at sensor
ρ = target reflectance
T = atmospheric transmittance
E = incident solar irradiance
Lp = path radiance (haze)
Haze Reduction

(a) Before haze removal (b) After haze removal


High-altitude normal color air photo of redwood stands and open grass areas
in Redwood Creek Basin, California.
Haze Example: Indonesia

(a) Before haze removal (b) After haze removal


Haze removal

(a) The aerial image before haze removal (b) The aerial image after haze removal
Atmospheric Corrections

Ltot = ρET/π + Lp

ρ = π (Ltot − Lp) / (ET)

Ltot = radiance measured by the sensor
ρ = reflectance of the target
E = irradiance on the target
T = transmissivity of the atmosphere
Lp = path radiance (radiance due to the atmosphere)

L & K 7.2
Atmospheric Corrections

E = E0 cos θs / d²

E0 = solar irradiance at the mean Earth-Sun distance
θs = solar zenith angle
d = relative deviation of the Earth-Sun distance from the mean distance at the time of imaging

L & K 7.2
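Combining the two relations above (ρ = π(Ltot − Lp)/(ET) with E = E0 cos θs / d²) gives a radiance-to-reflectance conversion. A minimal sketch with purely illustrative numbers; in practice E0 and the calibration constants come from the sensor's published metadata:

```python
import math

def reflectance(L_tot, L_p, E0, theta_s_deg, d, T=1.0):
    """Invert L_tot = rho*E*T/pi + L_p for the target reflectance rho,
    with E = E0*cos(theta_s)/d**2."""
    E = E0 * math.cos(math.radians(theta_s_deg)) / d**2
    return math.pi * (L_tot - L_p) / (E * T)

# Illustrative numbers only (not from any real calibration file):
rho = reflectance(L_tot=80.0, L_p=10.0, E0=1550.0, theta_s_deg=45.0, d=1.0)
assert 0.0 < rho < 1.0   # a physically plausible reflectance
```

Setting T = 1 and Lp from a dark target gives the simple haze-corrected top-of-atmosphere reflectance discussed earlier.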
What is a BRDF?
Typical light-matter interaction scenario:
• Incoming light
• Reflected light
• Scattering and emission
• Internal reflection
• Absorption
• Transmitted light

Three types of interaction: transmission, reflection, and absorption.
Light incident at a surface = reflected + absorbed + transmitted.
The BRDF describes how much light is reflected.
What is a BRDF?
• Viewer/light position dependency (incoming/outgoing rays of light)
  – Example: a shiny plastic teapot with a point light
• Different wavelengths (colors) of light may be absorbed, reflected, or transmitted differently
• Positional variance: light interacts differently with different regions of a surface, e.g. wood
• The BRDF must capture this view- and light-dependent nature of reflected light
What is a BRDF?
In functional notation:

BRDF(θi, φi, θo, φo, u, v)

or

BRDF(θi, φi, θo, φo)

for position-invariant BRDFs.
Differential Solid Angles
• It is more appropriate to speak of light in terms of the quantity of light arriving at or passing through a certain area of space
• Light doesn't come from a single direction, so it is more appropriate to consider a small region (neighborhood) of directions

[Figure: incoming light direction ωi, surface normal, and a small neighborhood of directions over a small surface element]
Bidirectional Reflectance Distribution Function (BRDF) Correction
Structures like trees cast shadows that change the amount of light that reaches a sensor depending on its view zenith angle.

To compare pixel reflectance from different images, or even different parts of an image, the target (pixel) reflectance must be measured under the same view and solar geometry.

[Figure: Solar Zenith Angle (SZA) and Sensor View Zenith Angle (VZA)]
Some BRDF models

• CCRS uses a modification of Roujean's model for BRDF corrections of AVHRR data (Roujean + hotspot from 4-Scale; Chen and Cihlar, 1997)
• GORT (Li and Strahler)
• 4-Scale (Chen and Leblanc)
Atmospheric correction: examples
Atmospheric correction of Landsat TM images (Liang et al., 2001)

[Figures: before / after]

Centre for Geo-information