GEOINFORMATICS
NITIN CHAUHAN
DEPARTMENT OF REMOTE SENSING
REMOTE SENSING
• “the measurement or acquisition of
information of some property of an
object or phenomenon, by a
recording device that is not in
physical or intimate contact with
the object or phenomenon under
study” (Colwell, 1997).
ADVANTAGES OF REMOTE SENSING
1. Provides data of large areas (synoptic coverage)
2. Able to obtain imagery of any area over a continuous period of time, through which anthropogenic or natural changes in the landscape can be analyzed
LIMITATIONS OF REMOTE SENSING
1) The interpretation of imagery requires a certain skill level
2) Distortions may occur in an image due to the relative motion of sensor and source
ENERGY TRANSFER
• Conduction - Energy may be conducted directly from one object to another, as when a pan is in direct physical contact with a hot burner.
• Convection - The Sun bathes the Earth's surface with radiant energy, causing the air near the ground to increase in temperature. The less dense air rises, creating convectional currents in the atmosphere.
• Radiation - Energy is transferred in the form of electromagnetic waves; this is the only mechanism that can transfer energy through the vacuum of space and is therefore the one of primary interest to remote sensing.
• For example, if the energy being remotely sensed comes from the Sun, the energy: is radiated by atomic particles at the source (the Sun), propagates through the vacuum of space at the speed of light, interacts with the Earth's atmosphere, interacts with the Earth's surface, interacts with the atmosphere once again, and finally reaches the remote sensor.
ELECTROMAGNETIC RADIATION
MODELS
• Two different models can be used to understand the creation of electromagnetic radiation, its propagation through space, and its interaction with other matter: the wave model and the particle model.
• The larger the amplitude, the higher the energy of the wave.
• In imaging by active sensors, the amplitude of the detected signal is used as an intensity measure.
• The amount of time needed by an EM wave to complete one cycle is called the period of the wave.
• Thus, the frequency is the number of cycles of the wave that occur in one second.
• The longer the wavelength, the lower the frequency, and vice versa, since c = λ × ν.
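As a quick numerical illustration (a sketch added here, not part of the original slides), the snippet below applies c = λ × ν to compute frequency and period; the two example wavelengths are assumed values.

```python
# Sketch: the wave relations c = lambda * nu and period T = 1 / nu, in SI units.
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an EM wave from its wavelength."""
    return C / wavelength_m

# Blue light (~0.45 um) has a higher frequency (shorter period) than red (~0.65 um):
for wl in (0.45e-6, 0.65e-6):
    nu = frequency_hz(wl)
    print(f"wavelength {wl * 1e6:.2f} um -> frequency {nu:.3e} Hz, period {1 / nu:.3e} s")
```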
• Using the wave model, it is possible to characterize the energy of the Sun, which represents the initial source of most of the electromagnetic energy recorded by remote sensing systems (except radar).
• The total emitted radiation (Mλ) from a blackbody is proportional to the fourth power of its absolute temperature. This is known as the Stefan-Boltzmann law and is expressed as:
M = σT⁴
• where σ is the Stefan-Boltzmann constant, 5.6697 × 10⁻⁸ W m⁻² K⁻⁴. Thus, the amount of energy emitted by an object such as the Sun or the Earth is a function of its temperature.
• In addition to computing the total amount of energy exiting a theoretical blackbody such as the Sun, we can determine its dominant wavelength (λmax) based on Wien's displacement law:
λmax = k / T
• where k is a constant equaling 2898 µm K, and T is the absolute temperature in kelvin. Therefore, as the Sun approximates a 6000 K blackbody, its dominant wavelength (λmax) is about 0.48 µm:
0.483 µm = 2898 µm K / 6000 K        9.66 µm = 2898 µm K / 300 K
• The Earth approximates a 300 K (27 ˚C) blackbody and has a dominant wavelength at approximately 9.7 µm.
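The two laws above can be checked numerically. The sketch below (added here, not from the original slides) evaluates M = σT⁴ and λmax = k/T for the 6000 K Sun and the 300 K Earth, using the constants quoted above.

```python
# Sketch: Stefan-Boltzmann law (M = sigma * T**4) and Wien's displacement law
# (lambda_max = k / T) for the Sun (~6000 K) and the Earth (~300 K).
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
K_WIEN = 2898.0    # Wien's constant, um K

for name, temp_k in (("Sun", 6000.0), ("Earth", 300.0)):
    m_total = SIGMA * temp_k ** 4  # total exitance, W m^-2
    lam_max = K_WIEN / temp_k      # dominant wavelength, um
    print(f"{name}: M = {m_total:.3e} W/m^2, lambda_max = {lam_max:.2f} um")
# Prints lambda_max ~ 0.48 um for the Sun and ~ 9.66 um for the Earth,
# matching the worked values on this slide.
```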
PARTICLE MODEL OF
ELECTROMAGNETIC ENERGY
• Albert Einstein found that when light interacts with electrons, it has
a different character.
PARTICLE MODEL OF
ELECTROMAGNETIC ENERGY
• Electrons are the tiny negatively charged particles that move around the positively charged
nucleus of an atom.
• The interaction between the positively charged nucleus and the negatively charged electron
keep the electron in orbit.
• The allowable orbital paths of electrons about an atom might be thought of as energy classes
or levels.
• An amount of energy is required to move the electron up at least one energy level
• After about 10⁻⁸ seconds, the electron falls back to the atom's lowest empty energy level or orbit and gives off radiation.
• The energy that is left over when the electrically charged electron moves from an excited state
to a de-excited state is emitted by the atom as a packet of electro-magnetic radiation; a particle-
like unit of light called a photon
• Niels Bohr and Max Planck recognized the discrete nature of exchanges of radiant energy
and proposed the quantum theory of electromagnetic radiation.
• This theory states that energy is transferred in discrete packets called quanta or photons.
The relationship between the frequency of radiation expressed by wave theory and the
quantum is:
Q = h × ν        Q = hc / λ
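As a worked example (a sketch, not from the original slides), Planck's relation Q = hc/λ can be evaluated directly; the two wavelengths below are assumed example values.

```python
# Sketch: photon energy Q = h * nu = h * c / lambda (Planck's relation), SI units.
H = 6.626e-34  # Planck's constant, J s
C = 3.0e8      # speed of light, m/s

def photon_energy_joules(wavelength_m: float) -> float:
    return H * C / wavelength_m

# Shorter wavelengths carry more energy per photon:
print(photon_energy_joules(0.45e-6))  # blue light, ~4.4e-19 J
print(photon_energy_joules(10.0e-6))  # thermal infrared, ~2.0e-20 J
```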
SOURCES OF
ELECTROMAGNETIC ENERGY
• The Sun is a prime source of EM energy, but it is not the only one.
• All matter with an absolute temperature above zero emits EM energy because of molecular agitation.
• The global mean temperature of the Earth's surface is 288 K. Earth's surface features, therefore, emit EM energy.
• The Sun emits 41% of its energy as visible light, 9% at wavelengths shorter than blue (ultraviolet), and 50% as infrared radiation.
SOURCES OF
ELECTROMAGNETIC ENERGY
• Thermonuclear fusion taking place within the Sun yields a continuous spectrum of electromagnetic energy.
• The 5770-6000 kelvin (K) temperature of the Sun's surface produces a large amount of relatively short wavelength energy that travels through the vacuum of space at the speed of light.
• Some of this energy is intercepted by the Earth, where it interacts with the atmosphere and
surface materials.
• The Earth reflects some of the energy directly back out to space or it may absorb the short
wavelength energy and then re-emit it at a longer wavelength
SOURCES OF
ELECTROMAGNETIC ENERGY
• A black-body absorbs 100% of the radiation that hits it, it does not reflect any; thus, it appears
perfectly black.
• The temperature of the black-body determines the most prominent wavelength of black-body
radiation.
• When a blackbody is heated beyond about 1273 K (1000 ˚C), emission of visible light becomes dominant, from red, through orange, yellow, and white (at 6000 K), before ending up at blue, beyond which the emission includes increasing amounts of ultraviolet radiation.
• At 6000 K a blackbody emits radiant energy of all visible wavelengths equally.
ELECTROMAGNETIC SPECTRUM
• Electromagnetic spectrum is a continuous sequence of electromagnetic energy arranged
according to wavelength or frequency.
• The different portions of the spectrum are: gamma rays, X-rays, UV radiation, visible radiation (light), infrared radiation, microwaves, and radio waves.
• Each of these named portions represents a range of wavelengths, not one specific wavelength.
• The EM spectrum is continuous and does not have any clear-cut class boundaries.
ELECTROMAGNETIC SPECTRUM
ULTRAVIOLET SPECTRUM
(GERMAN SCIENTIST JOHANN WILHELM RITTER - 1801)
VISIBLE SPECTRUM
(ISAAC NEWTON - 1665-66)
• It constitutes a very small portion of the spectrum, which is of great significance in remote
sensing.
• Limits of the visible spectrum are defined by the sensitivity of the human visual system.
• Isaac Newton, in 1665 and 1666, used a prism to reveal that visible light can be divided into its component colours; for remote sensing these are conventionally grouped into three segments.
• Primary additives / primary colours - 0.4 to 0.5 μm (blue), 0.5 to 0.6 μm (green), and 0.6 to 0.7 μm (red).
VISIBLE SPECTRUM
• Primary colors are defined such that no single primary can be
formed from a mixture of the other two and that all other colors can
be formed by mixing the three primaries in appropriate
proportions.
VISIBLE SPECTRUM
• Each of the three subtractive primaries (cyan, magenta, and yellow) absorbs a third of the visible spectrum.
INFRARED SPECTRUM
(BRITISH ASTRONOMER WILLIAM HERSCHEL - 1800)
• Wavelengths longer than the red portion of the visible spectrum are designated as the
infrared region.
• It extends from 0.72 to 15 μm—making it more than 40 times as wide as the visible light
spectrum.
• It consists of the following regions:
• Near Infrared (0.7-1.3 µm) and Mid Infrared (1.3-3 µm), together the reflective infrared
• Far Infrared (3-15 µm), the thermal or emitted infrared
MICROWAVE ENERGY
(SCOTTISH PHYSICIST JAMES CLERK MAXWELL AND THE GERMAN
PHYSICIST HEINRICH HERTZ)
• The longest wavelengths commonly used in remote sensing are those from about 1 mm to 1 m in wavelength.
• The shortest wavelengths in this range have much in common with the thermal energy of
the far infrared.
• The longer wavelengths of the microwave region merge into the radio wavelengths used
for commercial broadcasts.
• Before the Sun’s energy reaches the Earth’s surface, three Remote Sensing relevant interactions
in the atmosphere happen: absorption, transmission, and scattering.
• Sensors carried by low-flying aircraft experience negligible atmospheric effects on image quality.
• In contrast, energy that reaches sensors carried by Earth satellites must pass through the
entire depth of the Earth’s atmosphere so atmospheric effects may have substantial impact on
the quality of images and data that the sensors generate.
SCATTERING
• Scattering is the redirection of electromagnetic energy by particles suspended in the atmosphere or by large
molecules of atmospheric gases.
SCATTERING
• Scattering differs from reflection in that the direction associated with scattering is
unpredictable, whereas the direction of reflection is predictable. There are essentially
three types of scattering:
• Rayleigh,
• Mie, and
• Non-selective.
RAYLEIGH SCATTERING
(BRITISH SCIENTIST LORD J. W. S. RAYLEIGH IN THE LATE 1890S)
• Rayleigh scattering occurs when the diameter of the matter (usually air molecules) is many times smaller than the wavelength of the incident electromagnetic radiation.
• Particles causing Rayleigh scattering could be very small specks of dust or some of the larger
molecules of atmospheric gases, such as nitrogen (N2) and oxygen (O2).
• It is the dominant scattering process high in the atmosphere, up to altitudes of 9–10 km, the
upper limit for atmospheric scattering.
RAYLEIGH SCATTERING
• It is impossible to predict the direction in which a specific atom or molecule will emit a photon,
hence scattering.
• The energy required to excite an atom is associated with short-wavelength, high frequency
radiation.
• The amount of scattering is inversely related to the fourth power of the radiation's wavelength.
For example, blue light (0.4 µm) is scattered 16 times more than near-infrared light (0.8 µm).
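A minimal sketch of this λ⁻⁴ dependence (added for illustration, not from the slides); the 0.4 µm reference wavelength is an assumed default:

```python
# Sketch: relative Rayleigh scattering intensity varies as lambda**-4.
def rayleigh_relative(wavelength_um: float, reference_um: float = 0.4) -> float:
    """Scattering intensity relative to a reference wavelength."""
    return (reference_um / wavelength_um) ** 4

print(rayleigh_relative(0.8))      # 0.0625 -> NIR is scattered 1/16th as much as blue
print(1 / rayleigh_relative(0.8))  # 16.0, the factor quoted on this slide
```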
RAYLEIGH SCATTERING
RAYLEIGH SCATTERING
• The intensity of Rayleigh scattering varies inversely with the fourth power of the wavelength (λ⁻⁴).
RAYLEIGH SCATTERING
• Rayleigh scattering is responsible for the blue sky. The short violet and blue wavelengths
are more efficiently scattered than the longer orange and red wavelengths.
• At sunset, observers on the Earth’s surface see only those wavelengths that pass through
the longer atmospheric path caused by the low solar elevation; because only the longer
wavelengths penetrate this distance without attenuation by scattering, we see only the
reddish component of the solar beam.
MIE SCATTERING
• Mie scattering takes place when there are essentially spherical particles present in
the atmosphere with diameters approximately equal to the wavelength of
radiation being considered.
• For visible light, water vapor, dust, and other particles ranging from a few tenths of a
micrometer to several micrometers in diameter are the main scattering agents.
• The amount of scatter is greater than Rayleigh scatter and the wavelengths scattered
are longer.
• Pollution also contributes to beautiful sunsets and sunrises. The greater the amount
of smoke and dust particles in the atmospheric column, the more violet and blue
light will be scattered away and only the longer orange and red wavelength light will
reach our eyes.
• Mie scattering tends to be greatest in the lower atmosphere (0 to 5 km), where larger
particles are abundant.
NON-SELECTIVE SCATTERING
• Non-selective scattering is produced when there are particles in the atmosphere several times the
diameter of the radiation being transmitted.
• This type of scattering is non-selective, i.e. all wavelengths of light are scattered, not just blue, green, or
red.
• Water droplets, which make up clouds and fog banks, scatter all wavelengths of visible light equally well,
causing the cloud to appear white (a mixture of all colors of light in approximately equal quantities
produces white).
• Scattering can severely reduce the information content of remotely sensed data, to the point that the imagery loses contrast and it is difficult to differentiate one object from another.
ABSORPTION
• Absorption is the process by which radiant energy is absorbed and converted into other forms of energy.
• Absorption of radiation occurs when the atmosphere prevents, or strongly attenuates, transmission of
radiation or its energy through the atmosphere.
ABSORPTION - OZONE
• Ozone (O3) is formed by the interaction of high-energy ultraviolet radiation with oxygen
molecules (O2) high in the atmosphere.
• Absorption of the high energy, short-wavelength portions of the ultraviolet spectrum (mainly
less than 0.24 μm) prevents transmission of this radiation to the lower atmosphere.
ABSORPTION – CARBON
DIOXIDE
• Carbon dioxide (CO2) also occurs in low concentrations (about 0.03% by volume of a dry
atmosphere), mainly in the lower atmosphere.
• Aside from local variations caused by volcanic eruptions and mankind’s activities, the
distribution of CO2 in the lower atmosphere is probably relatively uniform
• Carbon dioxide absorbs infrared radiation (IR) in three narrow bands of wavelengths, which
are 2.7, 4.3 and 15 micrometers (µm).
ABSORPTION – WATER VAPOR (H2O)
• Water vapor (H2O) is commonly present in the lower atmosphere (below about 100 km)
in amounts that vary from 0 to about 3% by volume.
• The role of atmospheric water vapor, unlike those of ozone and carbon dioxide, varies
greatly with time and location.
• Two of the most important regions of absorption are in several bands between 5.5 and 7.0
μm and above 27.0 μm; absorption in these regions can exceed 80% if the atmosphere
contains appreciable amounts of water vapor.
ATMOSPHERIC WINDOWS
• Earth’s atmosphere selectively transmits energy of certain wavelengths.
• Those wavelengths that are relatively easily transmitted through the atmosphere are referred to as atmospheric
windows.
• Positions, extents, and effectiveness of atmospheric windows are determined by the absorption spectra of
atmospheric gases.
• Atmospheric windows define those wavelengths that can be used for forming images.
• Energy at other wavelengths, not within the windows, is severely attenuated by the atmosphere and therefore
cannot be effective for remote sensing.
ATMOSPHERIC WINDOWS
Major Atmospheric Windows
Ultraviolet and visible: 0.30-0.75 μm, 0.77-0.91 μm
Near infrared: 1.55-1.75 μm, 2.05-2.4 μm
Thermal infrared: 8.0-9.2 μm, 10.2-12.4 μm
Microwave: 7.5-11.5 mm, 20.0+ mm
• The proportions accounted for by each process depend on the nature of the surface, the
wavelength of the energy, and the angle of illumination.
• The radiation budget equation states that the total amount of radiant flux in specific wavelengths (λ) incident to the terrain (ϕiλ) is equal to the amount of radiant flux reflected (ϕrλ), absorbed (ϕαλ), and transmitted (ϕτλ):
ϕiλ = ϕrλ + ϕαλ + ϕτλ
INTERACTIONS WITH
SURFACES- REFLECTION
• Reflection occurs when a ray of light is redirected as it strikes a nontransparent surface.
INTERACTIONS WITH
SURFACES- REFLECTION
• The angle of incidence is equal to the angle of reflection.
• For visible radiation, specular reflection can occur with surfaces such as a
mirror, smooth metal, or a calm water body.
INTERACTIONS WITH
SURFACES- REFLECTION
• If a surface is rough relative to wavelength,
it acts as a diffuse, or isotropic, reflector.
INTERACTIONS WITH
SURFACES- REFLECTION
• Reflection characteristics of a surface are described by the bidirectional reflectance distribution function (BRDF).
Transmittance: t = transmitted radiation / incident radiation
• Fluorescence occurs when an object illuminated with radiation of one wavelength emits
radiation at a different wavelength.
• The most familiar examples are some sulfide minerals, which emit visible radiation when
illuminated with ultraviolet radiation.
• When the sun illuminates a surface at low angles (i.e., the sun is near the
horizon), many surfaces tend to preferentially reflect the horizontally polarized
component of the solar radiation.
Reflectance = observed brightness / irradiance
Radiometric Units
Radiant power: the radiant energy per unit time. Unit = watt (1 W = 1 joule per second).
Radiometric Units
Irradiance: the amount of energy incident on a surface per unit area and per unit time. Irradiance is usually expressed in W m⁻².
Radiance
Radiance (Lλ) is the radiant flux per unit solid angle leaving an extended source in a given direction per unit projected source area in that direction, and is measured in watts per square meter per steradian (W m⁻² sr⁻¹).
Path 1 contains spectral solar irradiance (Eoλ) that was attenuated very little before illuminating the terrain within the IFOV. The amount of irradiance reaching the terrain is a function of the atmospheric transmittance at this angle (Tθo).
This quantity is referred to as the downward reflectance of the atmosphere (Eddλ).
This structure and its interaction with EM energy have a direct impact on how leaves and canopies appear spectrally when recorded using remote sensing instruments.
Visible light interaction with pigments in mesophyll
cells
Structure of leaves is highly variable depending upon species and
environmental conditions during growth.
Chlorophyll b peak absorption is at 0.45 and 0.65 µm. The optimum chlorophyll absorption windows are: 0.45-0.52 µm and 0.63-0.69 µm.
Phycoerythrin absorbs predominantly in the green region, centered at about 0.55 µm, allowing blue and red light to be reflected.
Phycocyanin absorbs primarily in the green and red regions, centered at about 0.62 µm, allowing much of the blue and some of the green light (B + G = cyan) to be reflected.
The reason for this is that if plants absorbed this high-energy infrared radiation, they would become too warm and their proteins would be irreversibly denatured.
A plant watered in such a manner that it cannot hold any more water is called turgid. If the plant is not watered, the amount of water stored is less than what it can potentially hold; this condition is measured as relative turgidity.
Almost all humanity lives on the terrestrial, solid Earth, composed of bedrock and the weathered bedrock called soil.
Remote sensing can play a limited role in the identification, inventory, and mapping of
surficial soils not covered with dense vegetation.
Remote sensing can provide information about the chemical composition of rocks and
minerals that are on the Earth’s surface, and not completely covered by dense vegetation.
Soil Characteristics
Soil is unconsolidated material at the surface of the Earth that serves as a natural
medium for growing plants.
Soil is the weathered material between the atmosphere at the Earth’s surface and
the bedrock below the surface to a maximum depth of approximately 200 cm
(USDA, 1998).
Soil is a mixture of inorganic mineral particles and organic matter of varying size
and composition.
Three of the universally recognized soil grain size classes are: sand, silt, and clay.
As per the U.S. Department of Agriculture, each of these soil classes is defined as:
Sand:
a) Soil particle between 0.05 and 2.0 mm in diameter
b) Sandy soil is composed of a large fraction of sand-sized particles.
c) Because of the large particles, they enhance soil drainage, i.e. water percolates freely through the air spaces.
Silt:
a) Soil particle between 0.002 and 0.05 mm in diameter
Clay:
a) Soil particle < 0.002 mm in diameter
b) Clayey soil is composed of a large fraction of clay-sized particles
c) Clay particles enhance the movement and retention of soil capillary water.
d) Clay carries an electric charge which holds nutrients in the soil and helps to maintain soil fertility.
Soil Texture
It is the relative proportion of sand, silt, and clay in a soil.
The soil texture can be identified using the USDA soil texture triangle, which identifies the percentages of sand, silt, and clay comprising standard soil types.
For example, loam soil, found in the lower center of the diagram, consists of 40% sand, 40% silt, and 20% clay.
Soil Texture
Lt = Lp + Ls + Lv
Surface roughness.
Dry, fine-textured clayey soil should produce a higher spectral response throughout the visible and NIR portions than silt and sand surfaces, provided they contain no moisture, organic content, or iron oxide.
If these constituents are added to a clayey soil, it absorbs more of the incident radiant flux and may appear like silt, or perhaps even sand, which might cause interpretation problems.
Dry sand with well-drained large grains should diffusely scatter the incident visible and NIR energy more than clayey and silt soils.
Lv is the radiance that penetrates the air-water interface, interacts with the
organic/inorganic constituents in the water, and then exits the water
column without encountering the bottom. It is called subsurface volumetric
radiance and provides information about the internal bulk characteristics
of the water column
Lp = atmospheric path radiance
Ls = free-surface layer radiance
Lv = subsurface volumetric radiance
Lb = bottom radiance
Water Penetration
SPOT Band 1 (0.5-0.59 µm) green; SPOT Band 2 (0.61-0.68 µm) red; SPOT Band 3 (0.79-0.89 µm) NIR
Frame Camera
It exposes a square-shaped area of the ground all at one instant of time.
Film is held flat against the focal plane of camera and entire film
format is exposed at one time.
Frame Camera
Panoramic camera
In panoramic cameras the ground areas are covered by
either rotating the camera lens or rotating a prism in
front of the lens
Panoramic camera
It takes a sweeping picture of the ground from right to left.
Cameras with a rotating-prism design contain a fixed lens and a flat film plane. Scanning is accomplished by rotating the prism in front of the lens.
Panoramic camera
Digital Camera
Small-format digital and film cameras have a similar outward
appearance but they are totally different on the inside.
The film in a film camera acts as image capture, display, and storage
medium.
Digital Camera
Each digital detector receives an electronic charge when exposed to
electromagnetic energy, which is then amplified, converted into digital
form, and digitally stored on magnetic disks or a flash memory card.
Digital Camera
CCD detectors are analog chips that store light energy as electrical
charges in the sensors. These charges are then converted to voltage
and subsequently into digital information.
CMOS chips are active pixel sensors that convert light energy directly
to voltage.
CCD – Better resolution but large size
CMOS – Acceptable size but average image quality
Digital Camera
The lens assembly, also called the lens cone, consists of the camera lens, the diaphragm, the shutter, and the filter. The diaphragm and the shutter together control the exposure.
Classification / Focal length / Angle of coverage:
Narrow: 304.8 mm (12 inch), less than 60°
The focal plane contains fiducial marks, which define the fiducial
coordinate system that serves as a reference system for metric
photographs.
The fiducial marks are either located at the corners or in the middle of the
four sides.
The focal plane is a plate perpendicular to the optical axis of the lens; it includes a vacuum system to fix the film to the plate.
The vacuum assures that the film is firmly pressed against the
image plane where it remains flat during exposure.
Components- Magazine
The magazine holds the film, both exposed and unexposed.
A film roll is 120 m long and provides 475 exposures. The magazine is
also called film cassette.
Hard copy display: a film camera produces film prints; a digital camera requires computer printers.
Access time: film takes hours (or days) for processing; digital is instantaneous.
Sensor Classification
Sensor Classification
Sensor type is roughly divided into two: Optical sensor and Microwave sensor.
1. Optical Sensor:
Optical sensors observe visible light and infrared rays (near infrared, intermediate infrared, thermal infrared).
There are two kinds of observation methods using optical sensors: visible/near infrared remote sensing and thermal infrared remote sensing.
a) Visible/Near Infrared Remote Sensing:
iii. Clouds block the reflected sunlight, so areas under clouds cannot be observed with this method.
Sensor Classification
b) Thermal Infrared Remote Sensing :
i. The observation method acquires thermal infrared rays, which are radiated from the land surface heated by sunlight.
ii. Also it can observe the high temperature areas, such as volcanic activities and
fires
iii. This method can observe at night when there is no cloud.
2. Microwave Sensor:
Microwave sensors receive microwaves, which have longer wavelengths than visible light and infrared rays, and observation is not affected by day, night, or weather.
There are two types of observation methods using microwave sensors: active and passive.
Passive sensors can only be used to detect energy when the naturally
occurring energy is available.
For all reflected energy, this can only take place during the time when the
sun is illuminating the Earth. There is no reflected energy available from
the sun at night.
The sensor emits radiation which is directed toward the target to be investigated.
The radiation reflected from that target is detected and measured by the sensor.
Advantages for active sensors include the ability to obtain measurements anytime,
regardless of the time of day or season.
Active sensors can be used for examining wavelengths that are not sufficiently provided by
the sun, such as microwaves, or to better control the way a target is illuminated.
However, active systems require the generation of a fairly large amount of energy to
adequately illuminate targets.
Some examples of active sensors are a laser fluorosensor and a synthetic aperture radar (SAR).
Here a scan mirror directs the sensor to the next pixel in the cross-track direction; through the scan mirror's motion, one cross-track line, one pixel in width, is imaged at a time.
Whiskbroom
Pushbroom
It is composed of:
Pushbroom Scanner
A pushbroom scanner, or linear array sensor, is a scanner without any mechanical scanning mirror but with a linear array of solid-state semiconductor elements (CCD, CMOS), which enables it to record one line of an image simultaneously.
It has an optical lens through which a line image is detected simultaneously, perpendicular to the flight direction.
E.g.: HRV of SPOT, LISS of IRS, MESSR of MOS-1
Pushbroom Scanner
Resolution
Resolution refers to the ability of a remote sensing system to record
and display fine spatial, spectral, and radiometric detail.
Spatial Resolution
Spatial resolution describes how much detail is visible in an image, i.e. the size of the smallest ground features that can be distinguished.
Spatial Resolution
For example, we often speak of Landsat as having “30- meter"
resolution, which means that two objects, thirty meters long or wide,
sitting side by side, can be separated (resolved) on a Landsat image.
Spatial Resolution
Planimetric data – roads, buildings, driveways
Spatial Resolution
80 meter MSS w/ planimetric overlay
Spatial Resolution
30 meter TM w/ planimetric overlay
Spatial Resolution
10 meter SPOT w/ planimetric overlay
Spatial Resolution
1 meter DOQ w/ planimetric overlay
Spatial Resolution
Sub-meter data w/ planimetric overlay
Spatial Resolution
Looking More Closely at Resolution
Spatial Resolution
Looking More Closely at Resolution
Spatial Resolution
Looking More Closely at Resolution
Landsat MSS Satellite: 80 meter resolution grid cell
Spatial Resolution
Looking More Closely at Resolution
Landsat TM Satellite: 30 meter resolution grid cell
Spatial Resolution
Looking More Closely at Resolution
SPOT Satellite: 10 meter resolution grid cell
Spatial Resolution
Looking More Closely at Resolution
IKONOS Satellite: 4 meter resolution grid cell
Spatial Resolution
Looking More Closely at Resolution
IKONOS Satellite: 1 meter resolution grid cell
Spectral Resolution
Number of spectral bands (red, green, blue, NIR, Mid-IR, thermal, etc.)
Certain spectral bands (or combinations) are good for identifying specific
ground features
Spectral Resolution
Spectral Resolution
Temporal Resolution
Remote sensing has the ability to record sequences of images, thereby
representing changes in landscape patterns over time.
Temporal Resolution
Time of day/season image acquisition
Leaf on/leaf off
Tidal stage
Seasonal differences
Shadows
Phenological differences
Relationship to field sampling
[Figure: the same scene imaged in spring and in summer]
Temporal Resolution
Temporal Resolution
Temporal Resolution
Revisit period for satellites – how often can you make a measurement
for the same area
Radiometric Resolution
Radiometric resolution can be defined as the ability of an imaging
system to record many levels of brightness.
A system with fine radiometric resolution records the same scene using many levels of brightness; a coarse system records only a few.
Radiometric Resolution
The maximum number of brightness levels available depends on the
number of bits used in representing the energy recorded. Thus, if a
sensor used 8 bits to record the data, there would be 28=256 digital
values available, ranging from 0 to 255.
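A minimal sketch of this relationship (added here, not from the slides); the quantize helper and its 0..1 input convention are assumptions for illustration:

```python
# Sketch: number of brightness levels for a given bit depth, and quantizing a
# normalized (0..1) signal into an integer digital number (DN).
def n_levels(bits: int) -> int:
    return 2 ** bits

def quantize(signal: float, bits: int) -> int:
    """Map a 0..1 signal to a DN in [0, 2**bits - 1]."""
    return round(signal * (n_levels(bits) - 1))

print(n_levels(8))        # 256 levels, DNs ranging from 0 to 255
print(quantize(0.5, 8))   # 128
print(quantize(0.5, 11))  # 1024 -> finer radiometric resolution at 11 bits
```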
Radiometric Resolution
DIGITAL IMAGE
An image is a picture, photograph or any form of a two-dimensional
representation of objects or a scene.
The information in an image is presented in tones or colours.
A digital image is a two dimensional array of numbers.
Each cell of a digital image is called a pixel and the number
representing the brightness of the pixel is called a digital number
(DN)
DIGITAL IMAGE
A digital image is composed of data in lines and columns.
The position of a pixel is given by the line and column of its DN.
Such regularly arranged data, without x and y coordinates, are usually called raster data.
DN values do not record true brightness (known as radiance) from the scene but rather are scaled values that represent relative brightness within each scene (see the sketch below).
Layers are images of the same scene that contain different information.
Two of the most important technologies for digital imaging are CCD (Charge-Coupled Device) and CMOS (Complementary Metal Oxide Semiconductor).
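A minimal sketch of rescaling DNs back to radiance (not from the slides; the linear gain/offset model is common in sensor metadata, and the gain and offset values here are hypothetical):

```python
# Sketch: converting scaled DNs to at-sensor radiance with a linear rescaling,
# L = gain * DN + offset. Gain and offset are hypothetical example values.
GAIN = 0.05    # hypothetical gain, W m^-2 sr^-1 um^-1 per DN
OFFSET = -0.1  # hypothetical offset, W m^-2 sr^-1 um^-1

def dn_to_radiance(dn: int) -> float:
    return GAIN * dn + OFFSET

print(dn_to_radiance(0))    # darkest DN in the band
print(dn_to_radiance(255))  # brightest DN in an 8-bit band
```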
The images do not change with environmental factors as hard copy pictures and
photographs do.
BSQ (Band Sequential)
BIL (Band Interleaved by Line)
BIP (Band Interleaved by Pixel)
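The three interleaving schemes differ only in how the same values are ordered on disk. A minimal sketch (not from the slides; the 2-band, 3-line, 4-column scene is an assumed example):

```python
# Sketch: BSQ, BIL, and BIP as numpy array layouts for a small scene.
import numpy as np

bands, lines, cols = 2, 3, 4
bsq = np.arange(bands * lines * cols).reshape(bands, lines, cols)  # band sequential

# BIL stores all bands of each line together; BIP stores all band values of
# each pixel together. Transposing reorders the same data accordingly.
bil = bsq.transpose(1, 0, 2)  # (lines, bands, cols)
bip = bsq.transpose(1, 2, 0)  # (lines, cols, bands)

print(bsq.shape, bil.shape, bip.shape)  # (2, 3, 4) (3, 2, 4) (3, 4, 2)
```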
Data Formats
Digital data products are supplied mainly in four Formats:
LGSOWG (Landsat Ground Station Operators Working Group) or Super
Structure Format
Fast Format
GeoTIFF ( Geographic Tagged Image File Format)
HDF(Hierarchical Data Format)
Fast Format
Fast Format is suitable for Level 2 products.
It consists of two files, namely:
Header File: contains header data in ASCII (American Standard Code for Information Interchange) format
Administrative Record : Identifies the product, scene and data specifically needed to
read the imagery
Radiometric Record : Contains coefficients needed to convert DN values to Spectral
radiances
Geometric Record : Contains geographic information of the scene
Fast Format
Image File: image files are written in BSQ format, i.e. each image file contains one band of image data.
GeoTIFF
Formats such as PGM, GIF, BMP, and TIFF cannot store geographic information, so they cannot be used in cartographic applications.
GeoTIFF ties the image to a known model space or map projection.
It is platform independent.
Most IRS data products are supplied in this format.
Raw Data
Different formats are available:
Level 0R (L0R) – data with no corrections applied.
Level 1R (L1R) – radiometrically corrected only; no geometric correction.
Level 1Gs (L1Gs) – radiometrically corrected and resampled for geometric correction, registered to a geographic map projection.
Level 1Gst (L1Gst) – radiometrically corrected and resampled for geometric correction, registered to a geographic map projection; the image is additionally ortho-rectified using a DEM to correct parallax error due to local topographic relief.
Georeferencing
The data generated by a remote sensor are subject to geometric distortions.
Georeferencing
Georeferencing can be carried out with the help of physical models, which require the variations of orbit and platform dynamics with time.
Georeferencing - GCP
It is a feature which can be uniquely identified in the image and whose map
coordinates (or geographic coordinates) are known or can be determined.
GCPs include
Road intersections
Airport runway intersection
River confluence
Prominent coastline features
Coordinates of GCPs can be obtained from existing maps or from GPS surveys.
Georeferencing - GCP
The coordinate system of raw image (ungeoreferenced) is expressed
in terms of pixels and scan lines (column and rows) with its origin at
upper left corner.
Georeferencing
The two coordinate systems can be related by a pair of mapping functions such that
u = f(x, y)
v = g(x, y)
The mapping functions can be polynomials. A first-degree polynomial can model six kinds of distortion: x and y translation, x and y scale change, skew, and rotation. The first-degree equations are as follows:
u = a0 + a1·x + a2·y
v = b0 + b1·x + b2·y
Georeferencing
The equations determine which point in the output (corrected) image corresponds to which point in the input image.
Number of GCPs required = (n + 1)(n + 2) / 2, where n is the order of the polynomial.
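A minimal sketch of fitting the first-degree mapping from GCPs by least squares (not from the slides; the GCP coordinates below are made up for illustration):

```python
# Sketch: solve u = a0 + a1*x + a2*y and v = b0 + b1*x + b2*y from GCPs.
import numpy as np

# (x, y) map coordinates and (u, v) image coordinates of four GCPs (made up)
xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
uv = np.array([[10.0, 20.0], [110.0, 22.0], [12.0, 121.0], [113.0, 124.0]])

# Design matrix [1, x, y]; one least-squares solve per image coordinate.
# A first-degree fit needs at least (1+1)(1+2)/2 = 3 GCPs; 4 give redundancy.
A = np.column_stack([np.ones(len(xy)), xy])
coeff_u, *_ = np.linalg.lstsq(A, uv[:, 0], rcond=None)  # a0, a1, a2
coeff_v, *_ = np.linalg.lstsq(A, uv[:, 1], rcond=None)  # b0, b1, b2
print(coeff_u, coeff_v)
```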
Georeferencing
For moderate distortion over a small area, a linear transformation is sufficient; for a large area, a higher-order polynomial will be required.
But with an increase in order (>3), the transformation can produce errors in areas devoid of GCPs.
Georeferencing
The accuracy of corrected products depends on the quality of GCPs.
The image to map registration is like laying a new rectified image in its
correct orientation on top of the old (distorted) image.
Georeferencing
Original Image
Geometrically corrected and map-projected image
Georeferencing- Resampling
The process of determining what DN value is to be assigned to the
new pixels(or how to estimate the new pixel DN value) is known as
Resampling.
Nearest-neighbor resampling has the advantages of simplicity and the ability to preserve the original values of the unaltered scene, an advantage that may be critical in some applications.
Nearest-neighbor resampling: each estimated value (·) receives its value from the nearest point on the reference grid (O).
Resampling - Bilinear
Bilinear interpolation calculates a value for each output pixel based on a
weighted average of the four nearest input pixels.
In this context, “weighted” means that nearer pixel values are given greater
influence in calculating output values than are more distant pixels.
Since bilinear interpolation creates new pixel values, the brightness values
in the input image are lost. Such changes to digital brightness values may be
significant in later processing steps.
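A minimal sketch of the weighted average described above (not from the slides; the tiny 2×2 image is an assumed example):

```python
# Sketch: bilinear interpolation of a DN at a fractional (row, col) position
# from the four nearest input pixels, weighted by proximity.
import numpy as np

def bilinear(img: np.ndarray, r: float, c: float) -> float:
    r0, c0 = int(r), int(c)  # upper-left neighbor
    dr, dc = r - r0, c - c0  # fractional offsets toward the lower-right
    return ((1 - dr) * (1 - dc) * img[r0, c0]
            + (1 - dr) * dc * img[r0, c0 + 1]
            + dr * (1 - dc) * img[r0 + 1, c0]
            + dr * dc * img[r0 + 1, c0 + 1])

img = np.array([[10.0, 20.0], [30.0, 40.0]])
print(bilinear(img, 0.5, 0.5))  # 25.0, the average of the four neighbors
```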
Resampling - Cubic Convolution
Cubic convolution calculates each output value from a weighted average of a 16-pixel (4 × 4) neighborhood of input pixels. The data are altered more than with nearest-neighbor or bilinear interpolation, the computations are more intensive, and the minimum number of GCPs required is larger.
Pattern Recognition
Pattern refers to the set of radiance measurement obtained in various
wavelength bands for each pixel.
These techniques arose with the motive of replacing the more qualitative approach of visual interpretation with a quantitative approach for automatically identifying features in a scene.
Pattern Recognition
Pattern recognition can be classified into three categories:
Informational Classes
These are the categories of interest to the analyst, such as water bodies, forest, or urban areas.
Spectral Classes
These are groups of pixels that are uniform with respect to their brightnesses in multispectral space.
The relative distance between sensor and target does not affect the results of a spectral library.
Spectral library
[Figure: spectral library reflectance curves for conifer and deciduous vegetation]
Classifier
The term classifier refers loosely to a computer program that
implements a specific procedure for image classification.
Normalized Distance
Dnorm (ND) refers to the absolute value of the difference between the means of two clusters divided by the sum of their standard deviations.
For two classes A and B, with means μA and μB and standard deviations sa and sb, the ND is defined as:
ND = |μA − μB| / (sa + sb)
Normalized Distance
It is applicable to two clusters of pixels rather than individual pixels.
A larger distance indicates a more prominent distinction between the two clusters.
It helps to judge the quality of selected training samples for any two cover types before these samples are used in classification.
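A minimal sketch of this measure (not from the slides; the two training samples are made-up DNs):

```python
# Sketch: normalized distance between two training clusters,
# ND = |mean_A - mean_B| / (std_A + std_B).
import numpy as np

def normalized_distance(a: np.ndarray, b: np.ndarray) -> float:
    return abs(a.mean() - b.mean()) / (a.std() + b.std())

water = np.array([12.0, 14.0, 13.0, 15.0])   # made-up training DNs
forest = np.array([42.0, 45.0, 44.0, 41.0])
print(normalized_distance(water, forest))     # large value -> well separated
```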
Unsupervised Classification
It can be defined as the identification of natural groups, or structures,
within multispectral data.
Unsupervised Classification
No prior information regarding the scene or cover needs to be known.
Post-processing steps link the spectral classes to meaningful ground cover.
Next, all the remaining pixels in the scene are assigned to the nearest class
centroid.
But the classes formed by this initial attempt are unlikely to be the optimal
set of classes and may not meet the constraints specified by the analyst.
Then the entire scene is classified again, with each pixel assigned to the
nearest centroid.
The Euclidean distance between each pixel and every cluster center is calculated, and the pixel is assigned to the cluster with the shortest spectral distance.
Once all the pixels have been assigned to clusters, the sum of squared error (SSE) is calculated from the pixels belonging to their respective clusters:
SSE = Σ (j = 1..k) Σ (i = 1..n) [DN(i, j) − mj]²
Where:
n = number of pixels enclosed in a given cluster (varies from cluster to cluster)
DN(i, j) = value of the ith pixel in the jth cluster
mj = mean of the jth cluster
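A minimal sketch of the assignment step and the SSE formula above (not from the slides; the one-band pixel values and cluster means are made up):

```python
# Sketch: assign pixels to the nearest cluster mean, then compute the sum of
# squared error (SSE) over all clusters.
import numpy as np

pixels = np.array([10.0, 12.0, 11.0, 50.0, 52.0, 49.0])  # made-up DNs
means = np.array([11.0, 50.0])                           # two cluster means

labels = np.argmin(np.abs(pixels[:, None] - means[None, :]), axis=1)
sse = sum(((pixels[labels == j] - means[j]) ** 2).sum() for j in range(len(means)))
print(labels, sse)  # [0 0 0 1 1 1] 7.0
```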
Supervised Classification
Supervised classification is the process of using samples of known identity
(i.e., pixels already assigned to informational classes) to classify pixels of
unknown identity (i.e., to assign unclassified pixels to one of several
informational classes).
Samples of known identity are those pixels located within training areas, or
training fields
[Flowchart: selection of training samples → are the training samples satisfactory? (No: reselect) → classification → are the classification results satisfactory? (No: revise) → post-classification processing → result representation]
Supervised Classification-
Training Data
Supervised Classification-
Key Characteristics of Training Areas
1. Numbers of Pixels
2. Size
3. Location
4. Number
5. Placement
6. Uniformity
The number of training pixels also varies with the image classifier. For example, MLC requires at least 10 to 30 times as many training pixels as features for each class.
All the pixels falling within a box are labelled as that class.
When the boxes overlap, the pixels in the overlap region may be labelled as unclassified (AND) or arbitrarily assigned to one of the classes (OR).
To solve the resulting problem, the analyst may taper the boxes to avoid the overlap by interactively modifying the decision boundary in a stepped fashion.
De = √( Σ (i = 1..n) [DN(i, j) − Cij]² )
For every pixel in the input image this computation is repeated m times, once for each information class.
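A minimal sketch of minimum-distance-to-means classification for one pixel (not from the slides; the class means and pixel values are made up):

```python
# Sketch: compute the Euclidean distance from a pixel to each class mean
# vector and assign the pixel to the nearest class.
import numpy as np

class_means = np.array([[20.0, 60.0],   # class 0 means in bands 1 and 2
                        [80.0, 30.0]])  # class 1 means in bands 1 and 2
pixel = np.array([25.0, 55.0])

distances = np.sqrt(((class_means - pixel) ** 2).sum(axis=1))
print(distances, "->", distances.argmin())  # the nearest class wins
```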
p(Cj|X) = p(X|Cj) × p(Cj) / p(X)        (1)
p(X|Cj) represents the conditional probability of encountering pixel X in class Cj.
p(Cj) stands for the occurrence probability of class Cj in the input image (prior).
p(X) denotes the probability of pixel X occurring in the input image (prior).
p(X) = Σ (j = 1..m) p(X|Cj) × p(Cj)        (2)
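A minimal sketch of equations 1 and 2 in use (not from the slides; the likelihoods and priors are made-up numbers):

```python
# Sketch: combine class-conditional likelihoods p(X|Cj) with priors p(Cj) to
# obtain posteriors p(Cj|X); p(X) is the evidence term of equation 2.
import numpy as np

likelihoods = np.array([0.20, 0.05, 0.01])  # p(X|Cj) for m = 3 classes
priors = np.array([0.5, 0.3, 0.2])          # p(Cj)

evidence = (likelihoods * priors).sum()       # p(X), equation 2
posteriors = likelihoods * priors / evidence  # equation 1
print(posteriors, posteriors.argmax())        # pixel assigned to class 0
```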
Viewing equations 1 and 2 together raises an apparent contradiction: how can the occurrence probability p(Cj) of a specific information class be known before classification?
3. Third, the analyst using supervised classification is not faced with the problem of matching the spectral categories on the final map with the informational categories of interest.