
Introduction to Remote Sensing

Remote Sensing
“The measurement or acquisition of information of some property of an
object or phenomenon, by a recording device that is not in physical or
intimate contact with the object or phenomenon under study”
 Lack of contact with object
 Sensors
o Collection of data about an object/area/phenomena
 Image
 Image interpretation
o Art and science
Simplified Information Flow
Sun -> atmosphere -> target -> atmosphere -> sensor -> image -> interpretation -> application
Remote Sensing (tool)
 Remote sensing is a tool or technique similar to mathematics. Using
sensors to measure the amount of electromagnetic radiation (EMR)
exiting an object or geographic area from a distance and then extracting
valuable information from the data using mathematically and statistically
based algorithms is a scientific activity. It functions in harmony with other
spatial data-collection techniques or tools of the mapping sciences, including cartography and GIS.
In Situ vs. Remote Sensing
 Both attempt to observe and/or measure objects and phenomena
 In-situ
 Physical contact (interaction)
 Instruments for direct measure
 Ground-reference vs. “Ground truth”
Sensing
 Data is collected by sensor not in contact with the object
 Passive-collection of reflected or emitted energy
 Active-Generates signal and collects backscatter from interaction
with surface
Distance
No single standard
 Platforms for sensors operate at multiple levels
Cranes, balloons, aircraft, satellites, UAVs
o Cranes: to look at a surface over time
o UAVs: control over where and when images are taken, with no risk to a pilot
 Permits near-surface to global scale data collection
Data Collection
 Image data
 2-D (Spatial perspective)
You can tell when and where the image was taken
o most satellite images are acquired around 10 a.m., before clouds form from convection
Application Areas
 Land use/land cover mapping
 Photogrammetry: obtaining reliable measurements
 Useful for city planning
 Natural resource inventory and mapping
 Measuring forests
 Water quality monitoring
 Physical/biological oceanographic mapping
 Atmospheric monitoring
 Tropical storm development
 Many others
Advantages
 Different perspective
 Obtain data for large areas
A single acquisition is efficient
 Synoptic
 Systematic
 Obtain data for inaccessible areas
 Military is really interested
 Doesn’t affect/interact with phenomena of interest
Disadvantages
 Accuracy and consistency
 Inconsistent- clouds
 Inaccurate-always moving
 Artifacts (processing errors)
 Scale related
 Image is too coarse/detailed
 Moving between scales: image + in situ data
 High initial outlays for equipment and training
 Not many people with the skill
Remote Sensing Process
1. Statement of the problem
 Identification of data requirements
2. Data collection
3. Data analysis
 Image processing
4. Presentation of information
 Maps, charts, statistics, report, graphs, GIS layers
Problem Solving with Remote Sensing
 User needs assessment
 Remote sensing is a tool, not an end in itself
 What are the aims/purpose/goals of the study?
 Match technology with user needs
o Operational vs. state-of-the-art system
 Established track record
 Data accuracy
 Data availability
User Needs Assessment- Data Requirements
 IF RS is the right approach…
 Space/time scales?
 What type of sensor?
 What type of platform?
What can the user afford?
Scale Considerations
 Spatial/Temporal scales and User needs
 Spatial scale
 Spatial resolution
 Temporal scale
 Frequency of image acquisition- temporal resolution
o How often do you want that image taken?
 What is the application?
 Different processes operate at different scales…
o Pollution, Flooding, Fire
 Want the images quickly
o Urban development, resource management
Spatial Resolution
Color
Human vision perceives three primary colors, so a display can only show three colors at a time
 There are multiple bands, but you can only display three colors
 True color
 Red is red, blue is blue, green is green
 Green- reflects green
Resolution- *Four Components of Resolution*
1. Spatial- the size of the field-of-view, e.g., 10 x 10 m
 images are a series of numbers
2. Temporal- how often the sensor acquires data, e.g., every 30 days
3. Spectral- the number and size of spectral regions the sensor records data in, e.g., blue, green, red, near-infrared, thermal infrared, microwave (radar)
4. Radiometric- the sensitivity of detectors to small differences in electromagnetic energy
Spatial Resolution
 Indication of how well a sensor records spatial detail
 Refers to the size of the smallest possible feature that can be detected as
distinct from its surroundings
 Function of platform altitude and IFOV
 IFOV (Instantaneous Field of View)
 Size of the area that the sensor “sees” at a given moment
 Ground Resolution Element (GRE)
 Smallest area resolvable on the ground
GRE = IFOV × H (a worked sketch follows this list)
o H = platform altitude; IFOV expressed in radians
 Pixel (picture element)
 Image=2-D (square) array of pixel
 Pixel size=spatial resolution?
 no
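As a quick illustration of GRE = IFOV × H, here is a minimal sketch; the IFOV and altitude values below are hypothetical, not those of any particular sensor:

```python
def ground_resolution_element(ifov_rad, altitude_m):
    """GRE = IFOV (in radians) x platform altitude above the terrain."""
    return ifov_rad * altitude_m

# hypothetical example: a 0.1 milliradian IFOV flown at 705,000 m
print(ground_resolution_element(0.1e-3, 705_000))  # -> 70.5 m on the ground
```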
Spatial resolution
 Spatial resolution ‘germane to task’
 Resolution needed for effective detection/analysis of features
observed with sensors
For an object to be detected, its size should be at least twice the pixel size
 Spatial resolution is key to picking out one object from the other
 Examples: Land cover mapping
 Buildings vs. urban areas
 Forest vs. trees
 Grain and Extent
 Grain (spatial resolution): smallest object distinguishable on image
(detail)
o Spatial resolution, similar to pixel size
o High resolution means great detail
 Extent (area covered): area covered by an image
 Trade-off (in general):
o Small grain size=small area covered (small extent)
The more detail you can see, the smaller the extent
o Large grain size=large area covered (large extent)
Temporal Resolution
 The ability to obtain repeat coverage for an area
 Timing is critical for some applications
 Crop cycles (planting, maximum greenness, harvest)
 Catastrophic events
 Aircraft
 Potentially high
 Satellite
 Fixed orbit, systematic collection, pointable sensors
 You can go back in time
 Limited because of clouds
Spectral Resolution
 The number and dimension of the specific EMR wavelength regions to
which sensor is sensitive
 Broadband: few, relatively broad bands
 Hyper-spectral: many, relatively narrow bands
Radiometric Resolution
 Ability of a sensor to distinguish between objects of similar reflectance
 Measured in terms of the number of energy levels discriminated
2^n, where n = number of 'bits' (precision level)
 example: 8-bit data = 2^8 = 256 levels of grey
 256 levels = 0-255 range
o including zero there are 256 levels
 0 = black, 255 = white
 Affects ability to measure properties of objects
Resolution
 Trade-offs
 Impossible to maximize all four elements
 Meteorological satellites acquire image data daily, but at low spatial
and spectral resolutions
Landsat TM & ETM+ satellites acquire imagery at higher spatial and spectral resolutions, but lower temporal resolution
 Newer high spatial resolution sensors
 Choice of each resolution element depends on goals and objectives
Image Interpretation
Image Interpretation
 Act of examining images for the purpose of identifying and measuring
objects and phenomena, and judging their significance
 Art vs. science
o Art- “judging their significance”
o Science- “measuring objects and phenomena”
Image Interpretation-Tasks
 In order of increasing sophistication:
o Detection, Identification, Measurement, Problem-solving
 Not necessarily performed sequentially or in all
 Detection
 Lowest order
 Presence/absence of object or phenomena
Examples: buildings, water, roads, and vegetation
 Identification
 More advanced than detection
 Labeling or typing of the object/phenomena
 Tends to occur simultaneously with detection
 Examples: houses, pond, highway, grass/trees
 Measurement
 Quantification of objects/phenomena
 Direct physical measurement from the imagery
 Examples: inventories (count), Length, area and height of objects
 Problem Solving
 Most complex task
 Uses information acquired in first three tasks to put objects in
assemblages or associations needed for higher-level identification
 With experience, recognition becomes more automatic and tasks
become less distinct
Interpreter Requirements
 Vision
 Stereoscopic acuity
 Color vision
 Knowledge of application and environment
 Pattern recognition skills
 Intelligence and motivation
Performance Tests
 Standardized tests for screening
 Recognition of basic patterns and shape
 See in stereo
 Normal color vision
Interpretative Elements
 Information contained in image data
 Scale dependent and interrelated
 Elements: Tone/color, texture, size, shape, pattern, shadow,
height, context
 Tone/Color
 The most basic elements of interpretation
 Represents amount of energy returned from an object/location
 Amount of energy returned depends on:
o Reflectance of object
o Sensitivity of sensor
 Black and White (panchromatic), tone is greyness level ranging
from
o Black (no return) -> white (high return)
 Color imagery- differences in color are based on energy returned in
specific wavelength bands
o 3 dimensional tone varying in
 intensity- brightness
 Hue-dominant wavelength, “the color”
 Saturation- purity of color relative to grey
 Texture
 Characteristic placement and arrangement of repetitions in tone or
color
o Subtle changes in tone in close proximity
Visual impressions of “roughness” or “smoothness”
o Small tonal changes appear smooth (e.g. calm water)
o Abrupt/coarse tonal changes appear rough (e.g., built up
urban areas)
 Scale dependent
o Aggregate of characteristics too small to be detected
individually (e.g., tree leaves)
o At higher resolution, objects may be discerned and
arrangement detected as pattern
 Size
 Size of object can be clue to identification
 Measured in 2 Ways:
o Relative: compare objects within an image
 e.g., distinguish apartment buildings from single-family
home, distinguish freeways from residential streets
o Absolute: determine actual dimensions of the feature if scale
of imagery is known OR if size of feature is known, the scale
of the image can be determined
 Shape
 Certain features have characteristic shapes
 Top-down view provides different perspective than profile
 Cultural features tend to have regular geometric shapes and distinct
boundaries
Natural features tend to have less regular geometric shapes and fuzzy boundaries
 Pattern
 Arrangement of related spatial objects
 Combines microimage elements of repetition for recognition of
characteristic phenomena
o Creates a feature larger than an individual pixel: groups of pixels arranged together form something recognizable
 Examples: drainage networks, orchards, housing patterns, road
networks
 Shadow
 On imagery indicates absence of direct illumination
 Can aid analysis
o Enhances terrain relief, especially in low contrast scene
o Used to determine height of objects
o Can be used to show shape of objects
 Hinders analysis if object of interest is obscured
 Height
 Similar to size and can be used in both relative and absolute sense
 Oblique image data can provide relative measure of height, but
absolute measurements are difficult because scale varies
 More rigorous determination of height can be made from stereo
imagery
 Context
 Highest level of cognition, integrates all other elements of
interpretation
o Site- location of an object in relation to its environment
o Association- location of object in proximity to other objects
 Combination of individual objects typically found
together that allows an interpreter to make a higher
order assignment
Interpretation Process
 Search procedure and interpretation keys
 Image information
 Location, date, sensor, spatial resolution
 Collateral (ancillary) data
 General to specific
 Overall impression
 Major geographic regions- natural vs. cultural
 Approximate area of land cover/use
 Grid system to describe location of features in image
 Identify specific features
 Examination of evidence within image to converge to interpretation
Human vs. Automated Approaches
 Humans generally utilize higher order elements of interpretation, e.g.
context, size, shape more successfully than computer-qualitative
 Computers can quantify multiple levels of image tone more accurately
than human-quantitative
 Computers are less biased, but may be less accurate because of limited
ability for higher order interpretations
 Training and equipment
 Hybrid approaches

Electro-Magnetic Radiation (EMR) Spectrum
EMR as Information Link
 Link between surface and sensor
Energy Flow-Radiation
 Energy transferred between objects in the form of electromagnetic
waves/particles (light)
 Can occur in a vacuum
 Two models-Wave and Particle
Wave Theory
 Explains EMR transfer as a wave
Waves travel through space at the speed of light (3 × 10^8 m s^-1)
o Moon to Earth: ~1.28 seconds
o Sun to Earth: ~8.3 minutes
Electromagnetic Wave
 Two components or fields
 E=electrical wave
 M=magnetic wave
Wave Properties
 Amplitude
 Height of the wave crest above the undisturbed position
 Related to the amount of energy carried by the wave
 Wavelength (lambda)
 Distance between successive crests or troughs
Generally measured in micrometers (μm, 10^-6 meters) and designated by lambda (λ)

 Frequency
 Number of wave forms passing through a given point per unit time
Measured in cycles per second (cycles s^-1), or Hertz (Hz)
o Low frequency- longer wavelengths
o High frequency- shorter wavelengths

Wavelength & Frequency


c = λν
 c = speed of light (3 × 10^8 m/s)
 λ = wavelength
 ν = frequency
 Relationship between wavelength and frequency is inverse
 ν = c/λ and λ = c/ν
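A minimal sketch of the ν = c/λ relationship; the 0.55 μm wavelength is just an illustrative green-light value:

```python
C = 3.0e8  # speed of light, m/s

def frequency_hz(wavelength_m):
    """nu = c / lambda"""
    return C / wavelength_m

# green light at 0.55 micrometers
print(frequency_hz(0.55e-6))  # ~5.5e14 Hz; shorter wavelengths give higher frequencies
```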
Quantum Theory
 Wave theory doesn’t account for all properties of EMR
 Interaction of EMR with matter
 Absorption
o Takes up energy and changes the state of that object. It
heats up the object and it gives off more energy
 Emission- gives off energy
 EMR is transferred as discrete particles (photon)
 Planck’s Law (energy of photon):
Q = hν
o Q = energy of a quantum in Joules (J)
o h = Planck's constant = 6.626 × 10^-34 J s
o ν = frequency of the EMR wave (cycles s^-1, or Hz)
 Q ∝ ν (a short sketch follows this list)
 Long wavelength & low frequency = low energy
 Red
 Short wavelength & High frequency = high energy
 Purple
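A short sketch of Planck's relation Q = hν, comparing a red and a violet photon; the two wavelengths are only representative values:

```python
H_PLANCK = 6.626e-34  # Planck's constant, J s
C = 3.0e8             # speed of light, m/s

def photon_energy_j(wavelength_m):
    """Q = h * nu = h * c / lambda"""
    return H_PLANCK * C / wavelength_m

print(photon_energy_j(0.7e-6))  # red, ~2.8e-19 J (longer wavelength, lower energy)
print(photon_energy_j(0.4e-6))  # violet, ~5.0e-19 J (shorter wavelength, higher energy)
```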
Energy Levels
All objects above absolute zero (0 K, about -273°C) emit electromagnetic energy
 Energy radiated is a function of temperature
Stefan-Boltzmann (SB) Law
 Describes emitted radiation from a blackbody as a function of temperature: M = σT^4
o M = total emitted radiation (W m^-2)
o σ = SB constant = 5.6697 × 10^-8 W m^-2 K^-4
o T = temperature (K)
o Blackbody = an ideal radiator; absorbs and re-emits all energy
 Total energy emitted is proportional to T^4
o The hotter the object, the more radiation emitted (see the sketch after Wien's law below)
 Wien’s Displacement Law
 Used to identify λ of maximum energy emission (λmax)
 Inversely related to temperature
o λmax = k/T
 k = Wien's constant = 2898 μm·K
 T = temperature (K)
 Sun: T=6000K; λmax =0.48 μm
 Earth: T=300K; λmax =9.66μm
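A small sketch combining the Stefan-Boltzmann and Wien laws for the Sun and Earth temperatures quoted above:

```python
SIGMA = 5.6697e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_K = 2898.0     # Wien's constant, micrometer * Kelvin

def emitted_radiation(temp_k):
    """Stefan-Boltzmann: M = sigma * T^4 (W m^-2)."""
    return SIGMA * temp_k ** 4

def lambda_max_um(temp_k):
    """Wien's displacement: wavelength of peak emission (micrometers)."""
    return WIEN_K / temp_k

for name, t in [("Sun", 6000), ("Earth", 300)]:
    print(name, emitted_radiation(t), lambda_max_um(t))
# Sun: ~7.3e7 W m^-2, peak ~0.48 um; Earth: ~459 W m^-2, peak ~9.66 um
```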
Atmospheric Interactions
 Energy detected by sensor is a function of
 Surface properties and Atmospheric influences
 Atmosphere affects transmission of EMR by:
 Scattering and Absorption
Atmospheric Windows
 Portions of the spectrum that transmit EMR effectively
 Where it can propagate through the atmosphere
Atmospheric Constituents
 Responsible for scattering and absorption
 Water droplets/ice crystals
o Clouds
 Gas molecules
o CO2, water vapor, ozone
 Aerosols
o Particles suspended in the atmosphere
o Smoke, dust, sea salt, chemical pollutants
Scattering
 Re-direction of energy
o Makes the energy move in different direction but it doesn't
change the wave length
 No change in other properties
 3 primary types:
o Rayleigh (molecular)
o Mie (aerosol)
o Non-selective
 Rayleigh (molecular)
 Radiation interacts with molecules/particles whose diameter is
much shorter than the wavelength
o <= to 1/10 the wavelength
 Atmospheric gasses: Oxygen and Nitrogen
Amount of scattering is inversely proportional to the wavelength raised to the 4th power (λ^-4): I ∝ λ^-4
o Blue light (0.4 μm): 1/0.4^4 = 39.06
o Red light (0.7 μm): 1/0.7^4 = 4.16
o 39.06 / 4.16 ≈ 9.4, so scattering of blue light is about 9.4 times greater than that of red light
o Occurs in the upper 4.5 km of the atmosphere
o Accounts for blue sky and red sunsets
 Mie (Aerosol) Scattering
 Particles with diameter about equal to wavelength
 Dust, pollutants, volcanic eruptions, smoke
 Affects longer wavelengths than Rayleigh scattering
 Occurs in lower 4.5km of atmosphere
 Direction/amount of scattering depends on physical characteristics
of the aerosol:
o Shape/size
o Distribution of particles
 Non-Selective Scatter
 Diameter of particles is much larger than λ (~10xλ)
 Cloud water droplets/ice crystals
o Water absorbs so clouds get darker
 Affects all wavelengths, hence “non-selective”
Scattering Effects
 Reduces direct illumination from sun and creates diffuse illumination
 Scatter can occur anywhere in information flow:
 Sun->Surface->sensor
 May add to or reduce signal received by sensor
 Filters may be used to reduce effects of haze/scatter
Path Radiance
 Radiation scattered into the sensor from the atmosphere
 Mostly shorter wavelengths
Contrast Reduction
 Atmospheric effects can reduce image contrasts
Absorption
 Radiant energy is taken in by matter and converted into other forms of
energy
 Atmospheric gases are selective absorbers with reference to wavelength
Scattering redirects energy without changing its wavelength; absorbed energy is converted and later re-emitted at longer wavelengths
 Significant Absorbers: oxygen, nitrogen, ozone, carbon dioxide, water
vapor
 Ozone-ultraviolet region
 Water vapor- specific bands in infrared
 Carbon dioxide-thermal infrared region
Radiant Energy/Radiant Flux
 Radiant Energy (Qλ)
 Capacity of radiation within a spectral band to do work (J)
 Radiant Flux (Φλ) (phi)
Time rate of energy flow onto, off of, or through a surface
 Measured in J/s, i.e., Watts (W)
 Characteristics of Φ and what happens to it as it interacts with Earth’s
surface
 Primary focus of remote sensing
Surface Energy Partitioning
 Radiation at surface is partitioned among:
 Absorption (αλ)
 Transmission (τλ)
 Reflection (ρλ)
 They are all wavelength dependent
Radiation Budget Equation
 Dimensionless ratios of the radiant flux absorbed, transmitted, or reflected to the total incident radiant flux (Φiλ):
o αλ = EA/Φiλ, τλ = ET/Φiλ, ρλ = ER/Φiλ
 EA + ET + ER = Φiλ, so αλ + τλ + ρλ = 1
o EA, ET, and ER vary with target and wavelength
o Energy absorbed + energy transmitted + energy reflected = incident energy; the ratios sum to one because of conservation of energy
 Irradiance: radiant flux coming into the area
 Exitance: radiant flux coming off of the area

Properties Affecting Target Radiance


 EMR
 Wavelength
 Angle of incidence
 Target/surface Type
 Moisture
 Micro-roughness
 Terrain
 Target- sensor angle
 Interaction- determines target radiance detected by the sensor
Spectral Signature Concept
 Describes spectral reflectance of a target at different wavelengths of EMR
 Spectral reflectance curve-graphs reflectance response as a function of
wavelength
 Key to separating and identifying objects
 Selection of optimum wavelength bands
Reflected Energy
 Sensor operation and wavelength regions
Energy reflected: ERλ = Φiλ − (EA + ET)
 Reflectance: ρλ = ERλ / Φiλ
 = energy reflected / incident energy
Percent Reflectance
Percent reflectance = (reflected radiant flux / incident radiant flux) × 100
 Widely used in RS research to describe surface/target reflectance
characteristics
 Time rate of energy flow onto, off of, or through a surface
Surface Type
 Two types of surfaces
 Specular: the incident energy will leave surface at an angle equal
and opposite to the incident energy
o Associated with smooth surfaces; mirror-like reflections
o Object can look different over time because it reflects
differently depending on the angle of the sun.
 Diffuse: uniform reflection of incident energy in all directions
o Associated with surface roughness (height variation) that is large relative to λ
 A sensor measures only the energy reflected toward its viewing angle
Air Photo Geometry and Stereo Viewing
Vertical Air photo
 Taking an image looking straight down out of an airplane
 Oblique
 Taking an image looking out to the side
 Low Oblique
o You cannot see the horizon (the ground is always being
covered)
 High Oblique
o you can always see the horizon
Information that can be obtained
 Vertical Airphotos
 Scale of photography
 Calculate object height
 Calculate object length
 Area of an object
 Perimeter
 Grayscale tone or color of an object
Photographic Elements
 Annotation
 Date/time
 Firm/Series
 Coordinates
 Scale
 Altitude
 Fiducials
 Minimum of 4 markers on a photo
o Placed on center of each side of photo OR
o Placed in photo corners
 Image Principal point (PP) is at the center of the picture where
the lines from the fiducial points intersect.
o For a truly vertical photo, this is also the point where the camera looks straight down (the nadir)

 Nadir
 Point directly below aircraft
 For a true vertical image: nadir=PP
 <=3° from Nadir
Blocks of aerial photography compiled into an uncontrolled photomosaic let you fit the images together into a larger view and ensure that the whole area is covered
 Height (H)
 Altitude of the platform (and camera system) above the terrain
 Focal Length (f)
 Distance from focal point (lens) to film plane
o Together with H, the focal length determines the scale of the image (scale = f/H)
 Optical axis
o Line from the focal point to the center of scene
o For a perfectly vertical photo, the optical axis, principal point,
and nadir point will line up

Scale
 Vertical Air Photo/Flat Terrain
 Similar to map scale
o Length of feature on image : Length of feature on ground
o Representative fraction (RF)
Both quantities must be in the same units, e.g., RF = 1:10,000 or 1/10,000
o Can be determined
 Knowing actual length of feature visible in image OR
 Knowing (H) and (f) AND using the concept of ‘similar
triangle’ (i.e., scale=f/H)
Note About ‘Height’
 H = height above the terrain
o e.g., 200 meters above ground level (AGL)
 If height is stated in terms of height above sea level (H’)
o Must know the height of the terrain (h) and adjust H accordingly
o Height above terrain: H = H’ - h (see the sketch after this section)
 Flat vs. Variable Terrain
 Flat terrain: scale can be determined for entire image
 Variable terrain: scale will vary across image
 Depending on the amount of variation, an average terrain value
may be used
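A minimal sketch of scale = f/H, adjusting a sea-level flying height for terrain elevation; the focal length and heights used here are hypothetical values:

```python
def photo_scale(focal_length_m, flying_height_asl_m, terrain_elev_m):
    """Representative fraction = f / H, with H = H' - h (height above terrain)."""
    h_above_terrain = flying_height_asl_m - terrain_elev_m
    return focal_length_m / h_above_terrain

# hypothetical: f = 152.4 mm camera, flying 3,000 m above sea level, terrain at 500 m
rf = photo_scale(0.1524, 3000, 500)
print(f"1:{1 / rf:,.0f}")  # -> about 1:16,404
```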
Image Displacement
 Orthographic projection-map
 True position
 Everything is same throughout the image
 Central projection-vertical air photo
 Object shifted from true position-displaced
 Nadir point-only true position
 Away from nadir- camera increasingly ‘sees’ object sides
 Relief
 Differences in relative elevation of objects in photo
 Significant source of image displacement
Relief Displacement
 Objects will tend to lean outward, i.e. be radially displaced
 The greater the object is from the principal point, the greater the radial
displacement
The farther an object is from the nadir, the more it leans.
 Example, cooling towers towards the edge of photo show greater radial
displacement
 Relief Displacement: Radial distance between an object’s image position
and its true plan (horizontal) position due to differences in object relief
 Cause
 All objects on a vertical air photo are positioned as though viewed
from the same point
o Camera increasingly ‘sees’ the side of an object the further it
is from nadir- objects appears to ‘lean’
 Magnitude
 Object height, distance from nadir point, and H
 Displacement at nadir=0
 Points higher than datum-lean outward
 Points below datum-lean inward
Photogrammetry
 Art and Science of making accurate measurements by means of aerial
photography
 If scale is known or can be calculated
o You can calculate the length, area, perimeter, and height
 Height of objects
1. Calculate Object Height
 Based on relief displacement (d), radial distance from the principal point (r), and height above ground level (H)
 h/H = d/r, so h = (d × H) / r
o H = height of the aircraft above ground
o h = height of the object (e.g., a building)
o d = relief displacement measured on the image
o r = radial distance from the principal point to the displaced top of the object
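A minimal sketch of h = (d × H) / r with hypothetical measurements (not the values from any class example):

```python
def object_height(d, r, flying_height_m):
    """Relief displacement height: h = (d * H) / r.
    d and r must be measured in the same units on the photo (e.g., mm)."""
    return d * flying_height_m / r

# hypothetical: displacement of 2.5 mm, radial distance 80 mm, H = 1,200 m AGL
print(object_height(2.5, 80.0, 1200.0))  # -> 37.5 m tall
```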
2. Height From Shadow Length
 Object height (h) may be computed by measuring its shadow length
o Length of object’s shadow on a horizontal surface is
proportional to its height
o Shadows must fall on open level ground where they are
undistorted and easily measured
o Scale of imagery is known
o Shadow must be cast from true top of object
 Based on shadow length (L)
o tan(sun elevation angle) = opposite/adjacent = height (h) / shadow length (L)
 h = L × tan(sun elevation angle)
o You need to know the sun angle, scale of imagery, and you
need a shadow
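A small sketch of the shadow method, h = L × tan(sun elevation angle); the shadow length and sun angle below are hypothetical:

```python
import math

def height_from_shadow(shadow_length_m, sun_elevation_deg):
    """h = L * tan(sun elevation angle)."""
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# hypothetical: a 20 m ground shadow (after applying image scale), sun 35 degrees above the horizon
print(height_from_shadow(20.0, 35.0))  # -> about 14 m
```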
Stereoscopic Viewing
 Provides 3rd dimension to air photo interpretation
 Identify 3-D form of an object
 Stereopairs
 Overlapping vertical photographs
 Stereoscopes
 Used to create synthetic visual response by forcing each eye to look
at different views of same terrain
 Gives perception of depth (3-D)
Stereo Viewing
 Parallax
 Apparent change in relative positions of stationary objects
 Caused by change in viewing position
 Example- looking out car window (side)
 Parallax- Air photo
 Caused by taking photographs of the same object from different
positions => relative displacement
 Relative Displacement
 Forms the basis
Photographic Requirements
 Photos must be taken at same altitude, along the same flight line
 Overlap and Sidelap
 Equipment for viewing in stereo
Stereoscopic Alignment
 Alignment of air photos
 Lined up along the flight line in order of PP, CPP, CPP, PP in a
straight line
 Overlap toward center
 Amount of separation varies depending on type of photos
 In northern hemisphere, photos are arranged with shadows towards
the observer (south at top)
Stereoscopic Parallax Principle
 Change in position of an image of an object from one photograph to the
next caused by the aircraft’s motion is the x-parallax.
 All objects in the scene at exactly the same height will have an identical
amount of x parallax
 x-parallax is directly related to the elevation of the point above the
terrain
o greater for high points than for low points
Parallax Height Measurement
 Stereoscopic-based measurement (calculating Height):
 H0 = (H-h) x (dp/(P+dp))

Remote Sensing Platforms
Types of Platforms
 Ground Based
 Hand-held/cranes
 Airborne
 Stationary
o Captive/tethered balloons
 Aircraft
o Manned and unmanned
 Satellite
 Lighter-than-air
 Free floating balloons
o Stable, but restricted by meteorological influences
o Used to acquire meteorological/atmospheric data
 Blimps/dirigibles
o Major role-news media/advertisers
 Helicopters
 Can pin-point locations
 Lack stability (vibration)
 Aircraft
 Platform most often used to acquire aerial imagery
 Requirements:
o Requisite speed
o High rate of climb
o Stability in flight
o Unobstructed view for navigation and identification of
landmarks
o Range commensurate with size of project
o Ceiling higher than highest altitude specified
o Capable of remaining in air long enough to take advantage of
suitable photographic time
o Can accommodate equipment
 Low altitude Aircraft
o Generally operate below 20,000 ft.
o Most widely used are single engine or light twin engine
o Imagery can be obtained by shooting out the window or
placing camera mount on window or base of aircraft
o Suitable for obtaining image data for small areas (large scale)
 High Altitude Aircraft
o Operate above 20,000 ft
o Includes jet aircraft with good rate of climb, maximum speed
and high operating ceiling
o Stable
o Acquire imagery for large areas (smaller scale)
o E.g., NHAP, NAPP, AVIRIS
 Advantages of Aircraft
o Acquire imagery under suitable weather conditions
o Control platform variables such as altitude
o Time of coverage can be controlled
o Easy to mobilize
 Disadvantages of Aircraft
o Expensive- primarily cost of aircraft
o Less stable than spacecraft
 Drift off course
 Motion blurring
 NAPP- National Aerial Photography Program
o Standardized set of cloud-free aerial photographs covering
the conterminous U.S. (not including Hawaii or Alaska)
 Five-to seven year cycles since 1987
 Altitude of 20,000 feet
o Most recent and consistent source of high-quality aerial
photography
o Each photo is of a
 Centered on one-quarter section 7.5 minute USGS
quadrangle
 Covers approximately a 5.5 x 5.5 mile area.
o Available in black and white (B/W) or color infrared (CIR)
 Green, red, infrared because blue scatters, which will
blur the image.
 Only the vegetation looks different
 Spacecraft
 Numerous programs
 Manned and unmanned systems
 Range
o Range for spacecraft is determined by orbit, which is fixed in
altitude and inclination
 Sun synchronous-near polar; cross equator at
approximately same local time each day
 Geostationary-fixed orbit over equator; primarily
meteorological systems
 As the earth rotates, the satellite rotates with it.
 Advantages of Spacecraft
o Stable
o Constant altitude and attitude
Attitude = the platform's angular orientation (pitch, roll, yaw)
o Generally insignificant relief displacement
 Insignificant because it is so high up
o Repetitive coverage
 Useful for looking back in time
 Disadvantages of Spacecraft
o Expensive to launch
o Support programs are expensive
o Cannot control time of imaging
o Atmospheric effects may be greater than for aircraft platform
 It’s higher, so it has to go through more atmosphere
Photographic Concepts
 Exposure: total amount of light striking film and/or Charge Couple
Devices (CCD)
Shutter speed should be fast, while still letting in enough light
 Film/CCD exposure primarily a function of :
 Scene brightness
 Diameter of lens opening
 Exposure time
Relative Aperture
 Diameter of lens opening determined by aperture setting =f/stop
 Defined as ratio of lens focal length (f) to lens opening diameter (d)
 Focal length also affects how much light is coming through
 Important for speed of the lens
 Examples:
 f=50mm, d=12.5mm => f/stop = 50/12.5 = f/4
 f=50mm, d=25mm => f/stop =50/25 = f/2
F/stop, diameter and exposure
 Focal length generally fixed, so f/stop varies as function of the diameter
of the aperture
 Inverse relationship
 smaller f/stop -> wider lens opening and higher exposure
 larger f/stop -> smaller lens opening and less exposure
Shutter Speed
 Interval of time the aperture is open
 Slow shutter speed- blurry
 Fast shutter speed- more clear
 Expressed as, e.g., 1/500 sec.
To maintain constant exposure, trade f/stop against shutter speed: exposure is proportional to (shutter time) / (f-number)^2 (see the sketch below)

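Under the usual assumption that exposure is proportional to shutter time divided by the square of the f-number, a quick sketch of the trade-off:

```python
def relative_exposure(shutter_time_s, f_number):
    # exposure ~ (scene brightness) * t / N^2; scene brightness held constant here
    return shutter_time_s / f_number ** 2

print(relative_exposure(1 / 250, 4))   # f/4 at 1/250 s
print(relative_exposure(1 / 1000, 2))  # f/2 at 1/1000 s -> same exposure, two stops traded each way
```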
Field of View
 Want one fixed field of view
The greater the camera lens angle of view, the greater the amount of terrain photographed at a constant altitude above ground level
Aerial Cameras –Frame Camera
 High geometric image quality
 Characteristics include:
 Film rolls 100-500 ft in length
 Large film format (9 in x 9 in, 23 cm x 23 cm)
 Low f/stop- fast shutter speed
Film can only record up to 3 portions of the electromagnetic spectrum at a time
o Two major types
 True color or color infrared
Remote Sensing Platforms II
Aerial Support Hardware
 Used to improve quality of imagery by
 Reducing effect of platform motion
 Keeping attitude constant
 Image motion compensator
 Moves film in same direction as aircraft at speed proportional to
aircraft velocity
 Gyro Stabilization
 Stabilizes camera within plane to keep it pointing at nadir
 Adjusts orientation of camera if attitude of plane shifts
Aerial Cameras- Digital
 4-chip camera
 Use 4 separate full-frame CCD’s
 Each sensitive to different wavelength
 Single chip camera
 Uses single full-frame CCD
 Filter is placed over each pixel to capture red/green/blue
information
Color Theory
 Primary colors
 Red, green, and blue
o They make cyan, magenta, and yellow
 Combination of all three colors make white

 Color characteristics
 Hue- dominant λ (color)
 Saturation- purity of color
 Lightness- light/dark
 Additive process: based on ‘light’
 TV, Computer Monitors
o Each color gun’s intensity in each pixel is modulated based on
the amount of primary color present in the scene
 Subtractive process:
 Based on ‘pigments/dyes’
 Cyan, magenta, yellow
Filters
 Most aerial photography is collected with filters on the camera lens
 Block certain wavelengths of light from reaching film plane and exposing
film
 Subtracts some of the light reflected from the scene
 Example
 Yellow filter: absorbs blue (haze)
o Passes green and red
Image Contrast Enhancement
Image Contrast Enhancement
 Many materials, both natural and man-made, have similar reflectance
through the electromagnetic spectrum
 Sensors
o Detectors must be able to sense a wide range of values-snow
to dark volcanic basalt- without becoming saturated
o For sensors with high radiometric resolution a decrease in
contrast may be required if the image DN range exceeds that
of the display
 Utilize full range of video display capabilities
o Imagine an 8-bit display: 256 gray scale levels
o Contrast enhancement is done for display purposes only
 Does not affect actual pixel or DN values

Contrast Enhancement
 Selection of a contrast enhancement algorithm depends on:
 Sensor radiometric resolution
 Nature of the original (raw) histogram
 Elements of the scene of greatest interests to user
Linear Contrast Enhancement
 Min-max Contrast Stretch
DNout = ((DNin − DNmin) / (DNmax − DNmin)) × Quant (a code sketch follows at the end of this subsection)
 Makes full use of the range of the output device
 Quant is the range of brightness values that can be displayed (e.g., 255)
 Works best with data that have Gaussian (normal) distribution
 Expands the image DN range to fill the dynamic range of the display
device (e.g., 0-255)
 Sensitive to outliers (single pixels that may be atypical and outside the
normal DN range)
 Percentage Linear Stretch
 Uses specified Min and Max values that lie in a certain percentage
of pixels from the mean of the histogram
 A standard deviation from the histogram mean is often used –
standard deviation stretch
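A minimal numpy sketch of the min-max stretch and the percentage (percentile-based) variant; the array names and percentile cut-offs are illustrative only:

```python
import numpy as np

def minmax_stretch(dn, quant=255):
    """Linear stretch: DNout = (DNin - min) / (max - min) * quant (display only)."""
    dn = dn.astype(float)
    lo, hi = dn.min(), dn.max()
    return ((dn - lo) / (hi - lo) * quant).astype(np.uint8)

def percent_stretch(dn, low_pct=2, high_pct=98, quant=255):
    """Same idea, but clip to percentile cut-offs so outliers don't flatten the stretch."""
    lo, hi = np.percentile(dn, [low_pct, high_pct])
    clipped = np.clip(dn.astype(float), lo, hi)
    return ((clipped - lo) / (hi - lo) * quant).astype(np.uint8)
```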
Nonlinear Contrast Enhancement
Histogram Equalization (a code sketch follows at the end of this section)
 Assign an approximately equal number of pixels to each of user-
specified gray scale cases
 Greatest contrast is applied to most populated range of DN’s. So,
least contrast applied to TAILS of histograms.
o Determine number of output classes and approximate
cumulative percent
o Determine frequency of each unique DN value
o Determine cumulative probability for each DN value
o Using cumulative probability of the DN, assign it to output
whose cumulative percent most closely matches the
cumulative probability
 Gaussian
 Transforms histogram to a Gaussian (bell-shaped) distribution
 High and low ends of the distribution tend to be strongly enhanced
 Intermediate DN values change relatively little
 Log Stretch
 Maximize contrast in dark part of the histogram
 Inverse Log Stretch
 Maximize contrast in brightest part of the histogram
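A compact numpy sketch of histogram equalization for an 8-bit band, following the cumulative-probability steps listed above; it assumes an integer array with values 0-255, and the variable names are mine:

```python
import numpy as np

def equalize_8bit(dn):
    """Map each DN through the cumulative probability of the image histogram."""
    hist = np.bincount(dn.ravel(), minlength=256)   # frequency of each DN value
    cdf = hist.cumsum() / dn.size                   # cumulative probability per DN
    lut = np.round(cdf * 255).astype(np.uint8)      # output level for each input DN
    return lut[dn]                                  # apply the lookup table to the image
```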
Multi Spectral Systems
Terminology
 Pixel
 One array of numbers per band
 Abbreviation of “picture element”
 Smallest 2-D unit of an image
 Value
o “Brightness value” (BV)
o “Digital Number” (DN)
 Location (x,y)
o Column # (x)
o Row # (y)
 Quantization
 Conversion of electrical signal to digital number
 Radiometric resolution of signal
 Typically in range of 8-12 bits
 8-bit data range from 0-255
12-bit data range from 0-4095
ASPRS Guide to Land Imaging Systems
 Civil land imaging satellites- resolution >= 59 meters
Optical: a large number in orbit, from over 50 countries
o Two major resolution groups
 20 high resolution systems (0.5 to 1.8 meter)
 24 mid resolution systems (2.0 to 39 meter)
 Radar: about 10 in orbit, from 18 countries
 Coverage capabilities
 Hi-res swaths- 8 to 28 km
 Mid-res swaths – 70 to 185 km
 A few privately funded systems in orbit
 US and Israeli
o Hi-res military market
o RapidEye of Germany
 Broad area of applications with a 5 micro-satellite
constellation
 Planned European satellites
“Dual Purpose”: data will serve both military and civil users
o How tasking will be shared has not been revealed
Popular Data in the U.S.
 Landsat
 Multi-spectral Scanner (MSS)
 Thematic Mapper (TM)
 Enhanced Thematic Mapper (ETM+)
 Operational Land Imager (OLI)
 SPOT
ASTER - Advanced Spaceborne Thermal Emission and Reflection Radiometer
 MODIS - MODerate resolution Imaging Spectrometer
 Quickbird
 Ikonos
Landsat System
 Landsat Program
 Longest running program
 Multi-spectral image data from space
 Focus on land resource applications
 Satellite
 Weight 5000 lbs.
 Length 14 ft.
 Width 9 ft.
 Orbit
 Sun-synchronous, near polar
 ~705 km altitude
 9:42 equator crossing
o images are acquired in the morning, before convective clouds develop
 Each orbit ~99 minutes
 14 orbits per day
 Repeat coverage-every 16 days
 Landsat Worldwide Reference System
 Location over earth catalogued by WRS path/row
 Each scene covers 185 km (wide) by 170 km (long)
 Sensors
 Return Beam Vidicom (RBV): Landsat 1-3
Multispectral Scanner (MSS): Landsat 1-5
 Thematic Mapper (TM): Landsat 4&5
 Enhanced Thematic Mapper + (ETM+): Landsat 6&7
 OLI-Operational Land Imager – Landsat 8
LDCM (Landsat Data Continuity Mission) launched February 11, 2013
Landsat- Multispectral Scanner (MSS)
 Longest collection of data by Landsat (1-5)
 Optical-mechanical scanning system
 Oscillating mirror scans across scene with width of 185km
 Light from scene is reflected through focus optics
 Filters separate beam into separate wavelength bands
 Spectral/Radiometric Properties
 Spectral resolution
o 0.5-0.6μm green
o 0.6-0.7μm red
o 0.7-0.8μm near infrared
o 0.8-1.1μm near infrared
 Radiometric resolution
o Landsat 1-3: 6 bit
o Landsat 4-5: 8 bit
 IFOV is 79 meters
 Pixel size: 56 x 79 m
 Pixel size and spatial resolution are not the same
Sampling time
 Temporal Resolution
Landsat 1-3: 18 days
 Landsat 4-5: 16 days
 Routine operation ceased in 1992
Landsat- Thematic mapper (TM)
 Introduced on Landsat 4 (1982)
 Improvement over MSS:
 Spectral-extended spectral region- visible, NIR, mid-IR and thermal
o 7 bands vs. 5
 Spatial- 30m vs. 79m (120m for thermal)
 Radiometric- 8-bit vs. 6-bit
 Temporal- 16 day (Landsat 1-3, 18 day)
 Landsat TM 4 & 5
 Sensor characteristics
 Type of detector
 (Different material for different wavelengths)
 VIS-NIR
o Silicon
 MIR
o Indium antimonide
 TIR
o Mercury-cadmium-telluride
 Spectral sensitivity
 1. 0.45-0.52 μm |Blue| water penetration, cultural features, smoke
plumes, atmospheric haze
 2. 0.52-0.63 μm |Green| Measure peak green reflectance, cultural
features
 3. 0.63-0.69 μm |Red| Chlorophyll absorption region, plant species
differentiation, reduced atmospheric effects
 4. 0.76-0.9 μm |NIR| Vegetation type, biomass, land/water
boundary discrimination
 5. 1.55-1.75 μm |MIR| moisture content, now vs. cloud
discrimination
 7. 2.08-2.35 μm |MIR| Vegetation moisture content, rocks and
mineral types
 6. 10.4-12.5 μm |TIR| Thermal mapping and vegetation stress, soil
moisture
Landsat 7- Enhanced Thematic Mapper +
 ETM+ introduced on Landsat 7 (1999)
 Similar to Landsat TM 4 & 5
 Optical bands (1-5 & 7)
 30m spatial resolution (bands 1-5 & 7)
 Temporal resolution (16 day)
 Radiometric resolution – 8 bit
 Thermal band 6 improved from 120m to 60m
 New 15m Panchromatic band added (band 8)
 0.52 to 0.9 μm
Scan line corrector failed on May 31, 2003
Landsat 8- OLI
 Launched February 2013
 Operational Land Imager

 12 Bit Radiometric Resolution


 Pushbroom vs. Whiskbroom sensor
Sources: Landsat Data
 Glovis – USGS
 All of the data are now free
 Free
 AmericaView
 Universities – U. Maryland, Global Landcover Facility
SPOT Satellite System
Satellite Pour l’Observation de la Terre (SPOT)
 French Space Agency and other European countries
History of SPOT

SPOT images cover a smaller area (than Landsat scenes)


SPOT Sensors
 SPOT 1 -3
 Two high resolution Visible (HRV)
 SPOT 4 & 5
 Two HRVIR sensors
o Added SWIR band
 Vegetation sensor
 HRV sensor
 Panchromatic (PAN)
o Spatial resolution: 10m
o Spectral resolution: 0.51 – 0.73 μm
 Multi-spectral (XS)
o Spatial resolution: 20m
o Spectral resolution
 0.50-0.59 μm (green)
 0.61-0.68 μm (red)
 0.79-0.89 μm (NIR)
 1.58-1.75 μm (SWIR band added to Spot 4)
 Makes it HRVIR
 Uses linear arrays
 Two CCD arrays
 Linearly arranged
 Improvements
 No mirror that scans back and forth
 Allows for Longer dwell time
 Pixel size is uniform across the swath width
 SPOT 6 and SPOT 7
 Image product resolution:
 Panchromatic: 1.5 m
 Color merge: 1.5 m
 Multispectral: 8 m
 Spectral bands with simultaneous panchromatic and multispectral
acquisitions:
 Panchromatic (450-745 nm)
 Blue (450-525 nm)
 Green (530-590 nm)
 Red (625-695 nm)
Near-Infrared (760-890 nm)
 Footprint: 60 km x 60 km
SPOT - Pointability
 Increased imaging frequency
 Improves temporal resolution and enables stereoscopic viewing
 Stereoscopic imaging
Source: SPOT data
 SPOT Image Corporation
 Other commercial vendors
 Typical cost: $2000-$7000
 Spatial and spectral resolutions
 New vs. archived data
 Has dropped in recent years - $1200
Hyper-Spectral and High Spatial Resolution Sensors
Hyperspectral
 The simultaneous acquisition of images in many relatively narrow,
contiguous and/or non-contiguous spectral bands throughout the UV,
visible and IR portions of the EMR spectrum
 Advantages
1. Acquisition of data in hundreds of spectral bands simultaneously
2. Many surface materials have diagnostic absorption features
Ex: AVIRIS (Airborne Visible/InfraRed Imaging Spectrometer)
Resolution Attributes
 Spectral Resolution (hundreds of bands)
 10nm width
Spatial Resolution
 20 m at a 20 km flying height
o 5m
 Radiometric Resolution (12 bits)
Hyperspectral Instruments
 Satellite Based
 Hyperion- On board EO-1
Launched Nov. 2000
o Spatial
 30m
o Spectral
 220 bands
 10 nm wide
o Temporal
 16 day
Earth Observing System (EOS)
 Large array of instruments to monitor the earth, ocean, and atmosphere
 Two satellites
 Terra and Aqua-different overpass times
o Many Sensors
MODIS, ASTER, MISR, CERES, MOPITT
 Free data
 Monitor and map a wide array of the environment
 Develop products that people can use
MODIS
 Moderate Resolution Imaging Spectrometer (MODIS)
 On board NASA’s Terra and Aqua satellites
 36 bands- visible- Longwave Infrared
 12-bit
 Spatial resolution: 250m-1000m
 Daily coverage of the entire earth
 Advanced Spaceborne Thermal Emission and Reflection Radiometer
(ASTER) Onboard Terra
 Detailed image
 RGB= 1,4,3
o NIR
High Spatial Resolution Sensors
 Acquisition of data at high spatial resolution (<10 meters)
 Acquires data across few spectral bands
 This is changing
 Trade off between spatial and spectral resolutions is an important concept
Generic Resolution Attributes
 Spatial Resolution: (.46m – 5 m)
 Trade off=small swath (8 km – 20 km)
 Radiometric Resolution: (up to 12-bits)
 Spectral Resolution: currently = VIS & NIR
High Spatial Resolution Instruments
 Airborne Sensors
High Spatial Resolution Sensor (ADAR)
 Airborne Data Acquisition and Registration
 Spatial resolution: 0.6m – 1m (depending on flying height)
 Swath ~ 1 km x 1 km
 Spectral resolution: Blue, Green, Red, and NIR
 Radiometric resolution: 8 bit
High Spatial Resolution Sensor
 Spaceborne Sensor: Ikonos
 Spectral/Spatial resolutions:
o 4 Multispectral bands (4m)
 Blue, green, red, NIR
o 1 panchromatic band (1m)
o swath 11km
 Temporal resolution: < 3days
 Radiometric resolution: 11-bits
 Cost: $25 per square km2
o (ETM is free)
QuickBird
 Spatial/Spectral Resolution
 0.6m, Panchromatic
 2.4m , multi-spectral (B,G,R, NIR)
 16.5 km Swath Width
 Radiometric
 11 bit
 Temporal
 1-3.5 day Revisit Frequency
o Pointable
Others
 Formosat 2
 Taiwan’s National Space Organization
 Cosmos KVR-1000
 2m pan- Russian
 Eros
 Earth Resources Observation Satellite, Israel, May of 2000
 Indian Remote Sensing (IRS)-1995
 5.8m pan, 23m vis-NIR
 Kompsat 2
 KOrea MultiPurpose SATellite
 Panchromatic: 1m
 Multispectral (B,G,R,NIR): 4m
 Worldview 1,2,3
 Geoeye 1,2
 Rapid Eye
 Pleiades-constellation – 2 satellites
Worldview 2 digital globe
Launched October 8, 2009
 1.8m Multispectral
 8 bands
 .46m panchromatic
 11 bit
16.4 km swath width
 1.1 day repeat
WorldView-3- Digital Globe
 Launched August 13, 2014
 Panchromatic: 450-800nm – 0.31m
 8 Multispectral: 400 nm – 1040nm – 1.24 m
 red, red edge, coastal, blue, green, yellow, near-IR1 and near IR2
 8 SWIR: 1195 nm – 2365 nm – 3.70 m
 12 CAVIS Bands: 405 nm – 2245 nm – 30.0m
desert clouds, aerosol-1, aerosol-2, green, water-1, water-2, water-3, NDVI-SWIR, cirrus, snow
 Radiometric
 11-bits per pixel
 Swath Width
 13.2 km
 Temporal
 Less than one day

Image Classification Focus on Supervised
MODIS land Cover Mapper
 Purpose- partitioning image data into thematic categories
 Land cover for the entire world
 Land cover type (i.e. vegetation, urban)
o Generally used to make thematic maps
Image Classification
 Process
1. Identify classes of interests from classification system
o What type of classes do you map?
2. Acquire appropriate data- imagery, ancillary, ground reference
3. Preprocess data-radiometric correction, geometric registration,
transform bands
4. Select classification logic/algorithm
5. Extract training sites
6. Select bands of transformation of bands to classify
7. Extract final training sites
8. Classify image
9. Assess accuracy of classified map
10. Distribute results
Pattern Recognition
 Image classification based on spectral pattern recognition
 Basis of most image classification
 Uses DN values in different bands
o Spectral signature concept
 Based on spectral signatures of the object
Types of Classification
Hard classification (every pixel can belong to only one class)
 Supervised
o Identification and location of land cover types are known a
priori (beforehand)
 Unsupervised
o Land cover types are not known a priori; the computer generates spectrally similar clusters
o The computer finds groups of similar spectral values and a person then labels them (e.g., water, vegetation)
 The human input comes in later
 e.g., building, road, etc.
o Each cluster must be assigned to one, and only one, class
 Fuzzy Classification
 Takes into account the heterogeneous/imprecise nature of real
world
o Each pixel can have a percentage value
 Membership in a category is associated with probability
 i.e.: 30% grass
Supervised Classification Process
 Supervised
1. Select Training sites for Each Information Class
 Class: Water, sand, forest, urban, corn, hay
2. Calculate Statistics for Each Training Site
3. Evaluate “Separability” of training sites and band combinations
4. Apply training sites in decision rule to entire image
5. Evaluate output
Select Training sites for Each Information Class
Areas with the land cover type of interest
Feature extraction/Training sets
 Delineating area
Interactive digitizing of “areas of interest”
 Select seed pixel and region grow
 Use automated clustering
 Summarize pixels within training set
 Distribution
 Look at the mean, standard deviation, variance, min/max
o When the mean, median, and mode line up, the distribution is normal
 Variance-covariance matrix
o Variance varies between the different bands
The more bands you have, the better you can discriminate classes as different
Assessment of Training Sites
Evaluate “Separability” of training sites and band combinations
 Histograms
 Coincident spectral plots
o How things vary in mean and in different bands
 2D scatter plots
 Graphing spectral response for two bands
 Elliptical plots
 Feature Space 2D Plots
o With means and rectangular distributions
 If things overlap, it will be difficult
 Quantitative
1. Transformed Divergence - range 0-2000
2. Jeffries-Matusita (JM)-range 0-1414
o Covariance weighted distance matrix between category
means
o Higher divergence equates to greater “statistical distance”
between means
 Interactive preliminary classification
 Representative subscene classification
Refinement tends to be iterative, trial and error
Decision Rules
 Once training sites have been determined and evaluated, classification
then proceeds.
 Actual classification, that is assignment of a pixel to a class in the entire
image is based on a decision rule
1. Minimum distance to means
2. Parallelepiped
3. Gaussian Maximum Likelihood
o Bayesian
1. Minimum Distance to Means
 Process
Generate the mean value for each training site in each band
 The distance between a pixel's spectral values and each of the category means is measured
o Euclidean distance, computed via the Pythagorean theorem
 Pixel is assigned based on shortest distance
 Threshold can be used to ensure pixels are within some distance of
a mean training site
 Advantages/Disadvantages
 Computationally simple and efficient
 Disadvantages-insensitive to different degrees of variance
 Not widely used if spectral classes are close to one another and
have high variance
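A minimal numpy sketch of the minimum-distance-to-means rule; the training means are supplied as an array, and the optional distance threshold mentioned above is left out for brevity:

```python
import numpy as np

def min_distance_classify(image, class_means):
    """image: (rows, cols, bands); class_means: (n_classes, bands).
    Each pixel gets the label of the closest class mean (Euclidean distance)."""
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(image.shape[:2])
```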
2. Parallelepiped
 Pixel is classified as to whether it falls within specified range
 Defined as rectangles in feature space
 Based on entire distribution (min-max) or plus/minus 2 standard
deviations
 Advantages/Disadvantages
 Includes sensitivity to category variance
 Problems when signatures overlap
 Alleviate with stepped boundaries that parallel the distribution
 Fast and computationally simple
3.Gaussian Maximum Likelihood
 Evaluates both variance and covariance of category spectra response
 Assume normal distribution of pixels in training site which is described by
mean vector and covariance matrix
Parametric statistics
o Assumes a normal distribution (one in which the mean, median, and mode coincide)
 Using these statistics, probability that a given pixel is member of any one
class can be computed
 Calculate the probability
 Pixel assigned to most likely class (highest probability)
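In the usual formulation (a standard statement of the Gaussian maximum likelihood rule, not copied from these notes), each pixel vector x is assigned to the class ω_i that maximizes a discriminant built from the training mean vector μ_i and covariance matrix Σ_i:

$$ g_i(\mathbf{x}) = \ln p(\omega_i) \;-\; \tfrac{1}{2}\ln\lvert\Sigma_i\rvert \;-\; \tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{\mathsf{T}}\,\Sigma_i^{-1}\,(\mathbf{x}-\boldsymbol{\mu}_i) $$

With equal prior probabilities p(ω_i) the first term drops out, which is the plain maximum likelihood case; keeping unequal priors gives the Bayesian extension described next.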
4.Bayesian
 Extension of maximum likelihood
 Determine anticipated likelihood of occurrence for each class in scene (a
priori probability)
 Weight associated with cost of misclassifying a pixel
 Generally assumption is equal probability of all classes unless other info is
available
 Maximum likelihood methods are computationally intensive
 Lookup tables
 Reduce dimensionality
 Stratification
Supervised Classification
 Supervised- Training site issues
 Representative
 Variability
 Signature extension
 Normal/multimodal distribution
 Separability
o Graphical assessment
o Quantitative assessment
 Decision Rule
 Parallelepiped (Non-Parametric)
 Minimum Distance
 Maximum Likelihood (parametric)
Output is either a class or “unknown”
 Distance image
o How far away the pixel is from the place it was assigned to
o How much confidence do you have?
 Is it correct?
Unsupervised Classification
 Similar to supervised classification except the human element occurs later
in the process
 You don’t have the specific classes beforehand
 Similar spectral signatures
 Unsupervised classification steps
 Clustering
 Labeling
 Evaluate output
 Clustering algorithms
 ISODATA
 K-Means
 RGB
ISODATA
 Iterative Self-Organizing Data Analysis Technique
 First iteration arbitrarily determines means of N clusters
 After each iteration, a new mean for each cluster is determined
based on actual spectral values
 Used as means for defining clusters in next iteration
 Process continues until little change between iterations
 Advantages
 Not geographically biased to top or bottom of file
 Successful at finding clusters inherent in the data
 Disadvantages
 Time consuming
 Does not account for pixel spatial homogeneity
Unsupervised Classification
 ISODATA or other clustering algorithm clusters are labeled as a class
 Evaluate the signatures
o Histograms, Feature Space, TD, JM
 Input Signatures in Classification Decision Rule (PP, Minimum
Distance, Maximum Likelihood)
o Outputs a classified image
o Thresholding
 Overlay the classified image pixels on the original image
 If areas cannot be labeled a class
o A mask is created and the non-classified areas are run again
in the clustering algorithm
 Iterative Process
 Outputs are then recoded into the final map
Other Classification Techniques
 Remotely sensed information is imprecise:
 Boundaries between phenomenon are fuzzy
 Heterogeneity within classes
 Non-normally distributed data
 Fuzzy classification
 Pixels are assigned “membership grade” vector
o Describes how close the pixel value is to the mean of training
class vectors
 Proportion of component classes in a pixel can be estimated
 Spectral Mixture modeling
 For every pixel, it would calculate the percentage for tree, soil, and
shadow
 Neural networks
 Regression and Decision Trees
 Segmentation/region-growing
 Tries to find the things you want and classify at segment scale
Accuracy assessment
 Error matrix
 For selected sample of pixels compare “actual” category with
classified category
o Columns=reference data
o Rows = classification generated from remotely sensed data
 Reference data – “ground truth”
 Should not be used for training sites
 Going to the site is expensive
 Errors
 Commission – inclusion
 Omission – exclusion
 Sampling strategy
 Stratify the sample
 Sample size
Evaluation of Error Matrices
 Overall Accuracy
 Total correct pixels divided by total number of pixels in the error
matrix
 Producer’s Accuracy-omission error
 Probability of a reference pixel being correctly classified
 User’s Accuracy-commission error
 Probability that a pixel classified on the map actually represents the
category on the ground
 Kappa
Kappa analysis is a discrete multivariate technique used in accuracy assessment
o Takes into account overall accuracy and user’s and producer’s
accuracy
 Khat statistic- measure of agreement or accuracy
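A small numpy sketch computing overall, producer's, and user's accuracies and Khat from an error matrix, with rows = classified and columns = reference as defined above; function and variable names are mine:

```python
import numpy as np

def accuracy_metrics(error_matrix):
    cm = np.asarray(error_matrix, dtype=float)
    n = cm.sum()
    correct = np.diag(cm)
    overall = correct.sum() / n
    producers = correct / cm.sum(axis=0)   # column (reference) totals: omission-error view
    users = correct / cm.sum(axis=1)       # row (classified) totals: commission-error view
    chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    khat = (overall - chance) / (1 - chance)
    return overall, producers, users, khat
```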
Image Restoration: Radiometric Correction
Image Restoration
 Remove/correct imagery for internally and externally caused distortions
and degradations
Preprocessing to avoid effects of distortions in later processing results
 Two major types
1. Radiometric
o Errors in the digital number values
 Internal and external
2. Geometric (x, y coordinates)
o Location
o Distortions in the Image
 Internal geometric errors
 Nonsystematic
Internal- Radiometric Errors
 Random bad pixels (also called shot noise)
 Line/column drops, striping
o The detector fails to record data for that line
 Systematic noise
Random Noise
 Non-systematic variation, “bit errors”
 DN values spike much higher/lower than surrounding pixels
Line Drop & Striping Radiometric Corrections
 Line drop-out
 No data recorded or spurious values usually due to a detector
failure
 Line striping and banding
 Data is recorded, but values are higher or lower than surrounding
values creates stripes in the image
o In MSS these occur every 6th line
o In TM they occur every 16th line
 Fix: average the data from the rows above and below (see the sketch below)
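A minimal numpy sketch of the fix described above: replace each dropped line with the average of the rows above and below (it assumes the bad line is not the first or last row):

```python
import numpy as np

def repair_line_drops(band, bad_rows):
    """band: 2-D array of DNs; bad_rows: row indices where the line dropped out."""
    fixed = band.astype(float).copy()
    for r in bad_rows:
        fixed[r] = (fixed[r - 1] + fixed[r + 1]) / 2.0  # average of neighboring rows
    return fixed
```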
Patterned Noise Radiometric Corrections
 Patterned Noise – systematic, regular errors in the data
 Fourier Transform or Principal Components Analysis (PCA) may be
used to isolate and remove
 Fourier Transform
 Mathematical technique for separating an image into its various
spatial frequency components
o Sinusoidal curves at varying frequencies
 Low frequencies in the center
 Higher Frequencies outside
Periodic line noise appears as distinct bright points in the transform, which can be masked out
External Errors – Radiometric Processing
 Atmospheric Effects
 Illumination
 Transmittance
 Path radiance
 Attenuation
 Adjacency
 Methods
 Direct/ physically based models
 Normalization
 Terrain Correction
Atmospheric Interaction
 Trying to get the actual energy, so you model the atmosphere
1. Direct illumination
2. Scattered in atmosphere
3. Scattered before illumination
4. Adjacency effect
5. Scattered into IFOV
Illumination
Varies as a function of the cosine of the solar zenith angle
 Solar Zenith Angle is function of time of day and year
Scattering
 Rayleigh and Mie scattering
 Different particle size affects the wavelength
Atmospheric Corrections
 When to correct:
 Accurate measurement of reflectance/radiance
 Comparisons across time/space
 Image classification – spectral library based techniques
 Why NOT?
 No serious atmospheric/radiometric contamination AND
 For visual interpretation
 Relative measurement
 Classification – traditional classification algorithms
Atmospheric Correction Methods
 Diversity of methods
 Two major categories
 Direct (I.e., Radiative transfer based) – try to account for the
atmosphere …
 Indirect (Normalization)- relative correction
 Hybrid methods
Atmospheric Corrections – Direct Models
 Use physically based model to account for and remove atmospheric
effects
 This changes the DN values
 Simplified
o Histogram Shift
o Dark object subtraction and improvements
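A minimal sketch of dark object subtraction: estimate an additive haze offset from the darkest pixels in a band and shift the histogram down; the percentile used here is an arbitrary choice:

```python
import numpy as np

def dark_object_subtraction(band, dark_percentile=0.1):
    """Subtract an additive haze estimate taken from the darkest pixels of the band."""
    haze = np.percentile(band, dark_percentile)
    return np.clip(band - haze, 0, None)
```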
Atmospheric Correction – Direct Models
 Radiative Transfer Based
 RTC Models – Lowtran, Modtran, 6S, Mosart
 Characterization of the Atmosphere
o Absorption
o Scattering
o Angles, Path Length
 Correction Models
Requirements for Radiative Transfer Based
 Atmospheric Inputs
 Measurements (ground-based, radiosonde)
 Atmospheric model
o Average profile conditions (pressure, temperature, water
vapor, etc.)
 Scene-derived estimates
o Aerosol concentrations
o Water vapor
Atmospheric Correction- Indirect Methods
 Atmospheric effects are not isolated
 Computationally simple, no outside data needed
 Correction is relative
 Image to image- Time Series
o Histogram matching
o Regression
 Whole scene
 Invariant Features
Pseudo-invariant Feature Method
 Pseudo- fake
 Invariant- unchanging
 Feature- object
 Find features in the image that are not supposed to change
over time
o Use them as reference points for relative normalization (see the sketch below)
Geometric Corrections
Image Restoration Continued…
Terrain Correction
 Used for mountainous areas
 Transforms
 Band Ratios
 Digital Elevation Model-based
 Adjusts for relative orientation of the pixel towards the sun’s actual
position- the angle of incidence
o From a DEM you can calculate slope and aspect, and therefore the
local illumination of each pixel
 Difficult to do well because DEM quality varies with
resolution
 Cosine (the simplest method; see the sketch below)
 Minnaert
 C-factor
Terrain Correction
 Band Ratio
 Band ratios stay about the same even when the amount of illumination changes
Utility
 ‘Truer’ signals
 Trying to recover the true surface signal, which gives better-looking
imagery and better classification training sites
 Improves visual appearance
 Improves classification accuracy
 No 'false' information classes
Geometric Corrections
 Where does the image belong on the Earth's surface?
Image Restoration
 Remove/correct imagery for internally and externally caused distortions
and degradations
 Preprocessing to avoid effects of distortions in later processing
results
 Two major types
 Radiometric
o Errors in the digital number values
 Internal and external
 Geometric (x,y coordinates)
o Distortions in the image
 Internal geometric errors
 Nonsystematic
Geometric Corrections
 Process of Rectification, georeferencing, and registration
 Removal of systematic distortions
o Requires model of systematic distortions
o Requires platform ephemeris data (altitude and attitude)
 Usually done by vendors
 After Systematic Corrections TM data may be off from 5
to 10 pixels due to topography
o Still not planimetric
 Removal of unsystematic distortions
 Systematic Distortions
 Earth rotation
 Scan skew
 Platform velocity
 Perspective
 In general systematic distortions:
 Corrected using data from platform ephemeris and knowledge of
internal sensor distortion
o Information from the sensor itself
 Commercial vendors
Earth Rotation
 Fixed orbits
 Earth rotates underneath
 Skews the geometry of the imagery collected
o Worst near the equator, where the Earth's surface rotation velocity is greatest
 Sensor time to scan/dwell and during that time earth rotates
 End of frame, the point is further west than when imaging began
 Bottom of images must be offset to west
Systematic Geometric Distortions
 Scan Time Skew
 Time to scan a swath, but sensor is moving forward
o Scan line is advanced compared to beginning of the scan
Unsystematic Distortions
 Altitude
 Topography
 Attitude
 Aircraft/spacecraft movement
o Images get distorted when the aircraft or spacecraft moves
 Requires rectification using ground control points to remove
 Platform Velocity, attitude, or altitude variation
 Velocity and altitude
o Affects scale
 Attitude causes distortion
 Compression
 Expansion
Unsystematic Corrections
 Image to Map rectification
 Ground reference data (GPS or a map)
 Image to Image registration
 Time series
 Corrections of Unsystematic errors
 Steps (i.e., image to map coordinates, image to image)
o Select Ground Control Points (GCP)
 GCP-location on the surface of the Earth (e.g., a road
intersection) that can be identified on the imagery and
located accurately on a map
 Well-defined, spatially small
 Well-distributed across image
 Entire scene
 Number depends on the order of fit used
o Compute and test coordinate transformation
o Generate new output image
Selecting GCP’s
 On the ground
 Have the image in hand
 Differentially Corrected GPS
 Problems/limitations
o Clouds affect GPS
o Expensive to keep people on the ground
o Find the locations (one of the hardest)
 Need the locations to be spread out
Geometric Correction
 Compute and test coordinate transformation
 Order of Fit
o GCP’s matching :
 Where it is actually is and where it thinks it is
 RMSE(Root Mean Square Error)
o Difference between input (source) location of a GCP and
retransformed location for the same GCP
 Typically want less than half a pixel
Linear Transforms (1st Order)
 Create a whole new image
 Create a new grid
Generating New Output Image
 Calculate Pixel Values in New Image
 Intensity Interpolation
1) Nearest Neighbor
2) Bilinear Interpolation
3) Cubic Convolution
1) Nearest Neighbor Resampling
 Retains original values
 Suitable for thematic data
 Easiest
 May create a stair-step appearance and non-continuous linear features
2) Bilinear Interpolation
 Calculates new pixel value by interpolating brightness values in two
orthogonal directions weighted distance
 Smoother, more spatially accurate
 Used when changing cell size
 Cannot be used with thematic data
3) Cubic Convolution
 Similar to bilinear interpolation but uses the 16 closest pixel values
 Best match to original image statistics
o Better to resample just once, because the more often you do it, the
more you alter the image (a nearest-neighbor sketch follows this list)
 Depending on data may sharpen or smooth noise
 Most computationally intensive
GIS-RS Integration
Introduction
 Objective
 To summarize the main components and functional attributes of
GIS
 Explain how remotely sensed data may be integrated in a GIS or
GIS data may be applied to assist with image processing operations
 GIS
 A system to encode, store, manage, manipulate, transform,
analyze, and display spatial data from the real world for a specific
application
o Traditionally handles time (temporal data) poorly
 Inter-relationships
 Remote sensing- GIS- Cartography
GIS Components and Functionality
 GIS data capture
 Data entry: digitizing, scanning, GPS data collection, and derived
products (remote sensing)
 Geographic Data
o X,Y location coordinates
o Z- non-location attributes
o 4 types of data
 points, lines, polygons, and surfaces
 Data structure
o Vector
 Cartesian Coordinates
 Topological
o Raster
 Traditional raster (grid)
 Point-pixel
 Not the size of point, but the size of the
pixel
 Line- string of pixels connected together
 Polygon- contiguous group of the same type of
pixels
Two Ways to Visualize Data of the World
 Raster- Grid
 “Pixels”
 a location and value
 satellite images and aerial photos are already in this format
 Vector-line
 Points, lines, and polygons
 “features” (house, lake, etc.)
o Attributes
 Size, type, length, etc.
GIS Raster datasets
 Traditional grid/matrix based
 Interchangeable with remotely sensed data
o Standard image data structure (.GEOTIFF)
 Image with X,Y locations
 Examples
o USGS LULC (Land-use and land cover) @ 1:250,000
o USGS DEM (Digital Elevation Model)
 Conversions between types
 Vector to Raster
o Most straightforward and repeatable
o If the pixel size is small, the transformation can look very similar to the original
 The bigger the pixels, the poorer the result looks
 Raster to Vector
o Can be problematic
o Grid size
Integration of Remotely Sensed Data and GIS Data
 Data need to be geometrically rectified and registered to a suitable
projection and coordinate system
 Time as an attribute
 Single date, multi-temporal or change image
 Useful in change detection
 Compatible File Formats
 Vector and raster formats
Generating GIS Layers with Remote Sensing
 Generating GIS Layers with RS/IP
 “heads-up digitizing”
o Majority Filter
 Changes the minority color into the majority color
 Can’t have mixed polygons
 Classification results
 Updating GIS layers with RS/IP
 Currency
o Provides coverage over large spatial areas quickly
 Editing
o Use old map and update that with new remotely sensed data
Merging GIS and Image Processing
 GIS data in support of image correction
 Radiometric
o Can be used to find invariant features
 Geometric
o Can use GCP’s from a GIS to register image and GIS layer
 GIS data in support of image classification
 Visual integration via overlay
Incorporating GIS Data into a Classification
 Purpose- to improve classification results
 To aid in cluster busting (i.e., confused classes)
 Types of data
 Elevation
o Slope, aspect
 Geology
 Soils
 Political boundaries
 Vegetation map
Incorporating GIS data in Class
 Possible uses of data-
 Delimiting the study area (e.g., park boundary)
 Potential training sites (i.e., supervised classification)
 Geographical stratification
o Elevation, slope, soil type, vegetation type
 As an additional band
 Reference data set for error analysis
DEM Overlay
 Correct topography
 Differences in height
 Raster (grid)
GIS Operations with Grid based data
 Boolean Model overlay
o Adding numbers together
o Numbers- land cover type
 Class codes must be chosen so the summed values can be interpreted unambiguously
 Area Calculations Overlay
o Count the pixels in each class and multiply by the pixel area (resolution)
 Search Radius Aggregations
GIS Operations with Remotely Sensed Data
 Remotely sensed data processed or analyzed by:
 Attribute Re-coding
o Changing the values of the data (i.e., thematic outputs)
 Changing the number to whatever you want it to be
 For comparing different data
 The number represents a theme or class
 Re-scaling
o Changing the resolution (degrading, MMU)
 MMU- Minimum Mapping Unit
 The smallest feature that will be represented on the map
 Weighting
o Making values more or less important
 GIS Algebra
o Treats map values as variables that can be transformed or
combined to create new layers
 Focal, zonal, regional operations
 Compute each location's new value as a function of the existing
values within a specified distance
 Majority Filters, Clump, Sieve (see the sketch after this list)
o Majority Filter
 The majority (mode) is used because averaging classified data does
not mean anything
o Clump
 Looks in all directions for connected pixels of the same class and
groups them into a clump (e.g., clump 1)
 Starts in the upper-left corner and works row by row, so
clump IDs increase from the top of the image downward
 Even though pixels are the same class (e.g., red), a separate,
unconnected group of them is a different clump
 This is why you recode back to class values afterward
 Better suited for conversion to vector data
o Sieve
 Removes clumps smaller than the MMU (smallest unit you can map)
 Removed pixels are set to unclassified (0)
 They are then incorporated into the surrounding pixels
 Overlay Models
 Compare data sets
Biophysical Remote Sensing Vegetation
Biophysical remote Sensing
 Application of physical principles and methods to biological problems
 Estimating or inferring information about:
 Environmental structures (e.g., plants, landforms);
 Processes (e.g. evapotranspiration, plant-growth, CO2 flux)
o Related to photosynthesis
 9 Fundamental variables
o (x,y) location, topographic-bathymetric (x,y,z) elevation,
color spectral signature of features, soil moisture, surface
temperature, surface roughness, vegetation – chlorophyll
absorption (photosynthesis), biomass (how much vegetation),
and moisture content
Spectral Signature of Vegetation
 70% of the Earth’s land surface is covered by vegetation
 When sunlight strikes plant leaves, chlorophyll strongly absorbs visible
light (from 0.4 to 0.7 μm), which is used in photosynthesis
 The cell structure of the leaves strongly reflects near-infrared light
(from 0.7 to 1.1 μm)
 The more leaves a plant has, the more these wavelengths of light
are affected
 Use this information for modeling
 Yellow and red indicate vegetation stress; red is more stressed
than yellow, and brown means dead
Using this Information
 Variations in Red absorption and NIR reflectance
 Indicator of health of vegetation
 Use this information to characterize vegetation over time and space
 Spectral vegetation indices
o Dimensionless, radiometric measures that function as
indicators of relative abundance and activity of green
vegetation
Spectral Vegetation Indices
 Reduces the spectral signature to a single, quantitative value
 Generally based on red and NIR bands and the difference in their response
 Most ratio-type indices are related (the same information can be derived)
 Simple Ratio (NIR/RED)
 Uses NIR and red because vegetation reflectance is high in the NIR
and strongly absorbed in the red
 High value => lots of vegetation
 Very low Value => water
o Absorbs NIR
 Varies with
o Types of vegetation
o Amount of vegetation
o Radiometric resolution
 NDVI mostly widely used
 NDVI = (NIR - Red) / (NIR + Red), a standardized value from -1 to +1 that is
less sensitive to radiometric resolution (see the sketch below)
 > 0.1 generally indicates vegetation; < 0.1 indicates little or no vegetation
o For vegetation NDVI must be positive; a negative NDVI means red
reflectance is greater than NIR (e.g., water)
 Displayed in black and white because the index is just one band
 Others
Relationship of SVI’s to Biophysical Parameters
 Relative measure of abundance, health, stress
 Veg/no veg masks
 Percent cover
 Empirically related to biophysical variables:
 Biomass
 Leaf-area-index (total leaf area, one side/unit ground area)
o All leaves that project into a column of space above an area
on the ground are measured
 PAR-Photosynthetically Active Radiation
MODIS/Terra Leaf Area Index
 Not very detailed, but images are acquired every day
 Max range
o 0-10
 Related to total biomass of vegetation
 Influenced by the amount of reflecting soil between plants
Using Biophysical Data
 Must be preprocessed
 Geometric and Radiometrically corrected
 Continuous surfaces
 Everything has a temperature, so the data form a continuous surface
 Radiance/Reflectance values used directly as input to biophysical models
 Multiple scales (leaf, plant, forest)
Biophysical Applications
 Global change issues
 Landscape pattern and process
 Distribution and areal extent of terrestrial ecosystems
 Measure biomass density of terrestrial ecosystems
 Vegetation change
 Measure change in distribution, abundance and diversity of
vegetation
 Biogeochemical cycling processes
 Cycling of carbon
 Quantify knowledge of production and decomposition
 *Calculate GPP (Gross Primary Production)*
Temporal Characteristics of Vegetation
 Timing is very important when investigating vegetation
 Phenological (growth) cycle
 Crop cycles
o Different crops grow at different times
Seasonal
 AVHRR NDVI composites
 More generally, seasonal change appears each year with the “greening”
 The leafing of trees results in whole regions becoming dominated by
active vegetation
Change Detection Mapping
Multi-Temporal Analysis
 Time can provide additional information
 Change detection- why useful?
 Allows you to go back in time
 Cities, agriculture, forest, water
 Generally refers to change in land cover over time
 Use it to monitor crops throughout the year
 Can be applied to other phenomena
Steps- Change Detection
1. Define study area
2. Determine temporal scale for change
3. Select appropriate classification system
4. Minimize effect of environmental considerations
5. Acquire image and ancillary data
6. Preprocess data- geometric and radiometric registration
 Important because images have to be similar
 Images aren’t lined up, they show up twice
7. Select change detection algorithm
8. Compute area and type of change
9. Assess accuracy
Change Detection Considerations
 Step 3: Select classification scheme
 Classes compatible with remote sensed data
 Standardized scheme
 Example: USGS Anderson Land Use/land Cover
o USGS – Level 1 Categories
 Suitable for use with coarse resolution satellite imagery-
MODIS, AVHRR
 Step 4: Resolution Characteristics
 Should be consistent across all dates of imagery
o Want to maintain resolutions
o Change detection should measure changes on the ground, not
differences caused by errors or inconsistencies in the data
 Temporal
o Anniversary date imagery
 Limits effects of illumination differences
 Limits effects of seasonal and phenological differences
 Spatial
o Same pixel size
o Accurate spatial registration (RMSE <=0.5 pixel)
 Spectral
o Same bands or best approximation
 Radiometric
o Same radiometric resolution
 Step 5: Minimize environmental impacts:
 Atmosphere:
o Cloud cover
o Haze, thin clouds, humidity can alter spectral signatures
o Anniversary date imagery can minimize seasonal weather
variation
 Soil Moisture
o Conditions should be same for both dates
o May be necessary to review rainfall levels, especially prior to
collection
o Imagery may be stratified to adjust for localized effects
 Phenology
o Terrestrial and aquatic ecosystems
o Man-made development cycles
 Urban-Suburban
o States: undeveloped -> landscaping
o Some classes may be spectrally similar
Change Detection Algorithms
 Band overlay using images from 2 dates
 Place each image in different color plane of image display
 Provides a visual display of changes
o Find areas that have changed
 Limitations
o Not quantitative
o No “from-to” change class information
 Good for calculating the area of change
 Multi-date Composite Image
 Rectify images
 Create single data set (composite)
 Detect change
o Classification of all bands
o Areas of change will form their own spectral classes
 PCA (Principal Component Analysis)
 Regression Line
 1 less than total input bands
 Each component is orthogonal to the previous one
 May be difficult to label change classes
 Can provide quantitative values of area changed
Change Detection Algorithms –Image Algebra
 Band Ratioing
 Ratio the same spectral band for 2 different image dates
o Values near one mean no change; values much smaller or greater
than one mean change
 Image Differencing
 Subtract the same spectral band for 2 different dates of images
o Values near zero mean no change; large positive or negative values
mean change (see the sketch below)
 Both techniques require setting change/no change thresholds
 Doesn’t provide information on type of change
Change Detection Algorithms
 Post- classification comparison
 Classify images from both dates
 Change is determined from classified images
o Calculate the number of pixels that changed and derive the area of
change (see the from-to sketch below)
 Provides “from-to” information
 Dependent on accuracy of classification and any errors will impact
change detection
o Both pre and post
 Ancillary Data as Source for Date 1
 Use existing land cover data in place of remotely sensed image for
one date
 Classify image for second date and compare
 Depends on quality of classification and ancillary data
 Provides “from-to” classification information
 Manual on-screen digitizing
 Use standard photo interpretation techniques
 Use linked images
 Analyst digitizes changes on screen
 Ex: open source mapping
Write Function Memory Insertion
 Qualitative visualization of change: place different image dates in
different display memory planes (e.g., red and blue)
 Values that changed appear red or blue; values with no change
appear purple
 Gives you a visual picture of where change occurred
 Accuracy depends on all the other processing steps
 Does not provide "from-to" information or quantify the area of change
Thermal Infrared (TIR) Remote Sensing
Introduction
 Thermal infrared energy is emitted from all objects that have a
temperature greater than absolute zero
 Everything we encounter emits thermal infrared electromagnetic
radiation
 We sense thermal energy primarily through touch
 Reflective infrared (0.7-3.0 μm)
 Thermal infrared energy (3.0-14 μm)
 Detectors that are sensitive to thermal infrared radiation
 Current detectors - mercury-doped germanium (Ge:Hg), indium
antimonide (InSb)
 Cooling
Kinetic Heat, Temperature, Radiant Energy and Radiant Flux
 Kinetic heat
 Energy of particles of molecular matter in random motion
o Measured using a thermometer (in contact with the object)
 Object’s internal kinetic heat is also converted to radiant energy
 Radiant flux ()
 Electromagnetic radiation exiting an object
 Concentration of the amount of radiant flux exiting (emitted from)
an object is its radiant temperature (Trad)
 Generally there is a high, positive correlation between the true kinetic
temperature of an object (Tkin) and the amount of radiant flux radiated
from the object (Trad)
 Radiometers placed some distance from the object to measure its
radiant temperature
 This is the basis of thermal infrared remote sensing
 The relationship is not perfect, Trad always being slightly less than
the Tkin of the object.
o How different they are depends on the emissivity
 Different objects have different emissivity
Emissivity
 Emissivity, , is the ratio between the radiant flux exiting a real-world
selective radiating body (Fr) and a blackbody at the same temperature
(Fb):
 =Fr/Fb
o what it actually does over the black body
 Blackbody is theoretical, so emissivity is from 0 to 1
 All selectively radiating bodies have emissivities ranging from 0 to <1
 Values can fluctuate depending upon the wavelengths of energy
being considered
o Graybody- outputs a constant emissivity that is less than one
at all wavelengths
 Because emissivity is constant it is classed a graybody
 The Emissivity of an object may be influenced by a number factors,
including:
 Color
 Surface roughness
 Moisture content
 Compaction
 Field-of-view
 Wavelength
 Viewing angle
 Metals objects tend to have lower emissivity and have very different
radiometric and kinetic temperatures
Radiation Properties of a Surface
 From the principle of the conservation of energy
 Reflectance
 Absorptance
 Transmittance
 Reflectance + Absorptance + transmittance = 1
Kirchoff’s Radiation Law
 In the infrared portion of the spectrum the spectral emissivity of an
object generally equals its spectral absorptance, i.e. ε(λ) ≈ α(λ)
 "Good absorbers are good emitters and good reflectors are poor
emitters"
 Most real-world materials are opaque to thermal radiation
(i.e., no radiant flux exits from the other side of the object)
 Therefore, we may assume transmittance τ(λ) = 0. Substituting emissivity
for absorptance and removing transmittance from the equation yields:
 1 = r(λ) + ε(λ)
Using this simple relationship
 Because the terrain does not lose any incident energy to transmittance
 All of the energy leaving the object must be accounted for by the
inverse relationship between reflectance r(λ) and emissivity
ε(λ)
 If reflectivity increases then emissivity must decrease. If emissivity
increases then reflectivity must decrease
Relationship of Kinetic and Radiant Temperature
 Using Kirchoff’s Radiation Law and the Stephen Boltzmann Law
 Thermal infrared remote sensing systems generally record the apparent
radiant temperature, Td of the terrain rather than the true kinetic
temperature, Tkin

 Therefore, the radiant temperature of an object recorded by a remote


sensor is related to its true kinetic temperature and emissivity by the
following relationship:

Influence of Emissivity
TIR Radiance At Sensor
 Signal Received Impacted by
 Energy Emitted by the surface
o Amount of energy emitted is a function of emissivity and kinetic
temperature
 Energy Reflected off the surface
 Energy Emitted by the atmosphere (path)
 Atmospheric Transmittance
 Sensor field-of-view
Atmospheric Effects
 Gases and Aerosols
 Can emit or absorb energy
 Thermal sensors can be biased by as much as 2 °C even when acquired at
altitudes as low as 300 m
 Cloud Effects
 Generally blocks thermal radiation
o Large, continuous clouds
o Thin clouds (cirrus)
o Patch clouds (cumulus elements)
Thermal Infrared Atmospheric Windows
 Reflective infrared region from 0.7-3.0 μm and the thermal infrared
region from 3-14 μm
 Regions that pass energy are called atmospheric windows. Regions that
absorb most of the infrared energy are called absorption bands
Basics of TIR Image Interpretation
 In general, warmer objects appear brighter
 Except meteorological Satellites
 Remember-Skin Temperature
 Contact with the object
 Can sense at night or during the day
 Does not require solar energy
 Generally, night preferred
 Daytime shading
o Topography
o Vegetation
Sensors
 Examples of current satellite sensors aloft
 NOAA AVHRR- June 1979 to present
o LAC (Local Area Coverage) - 1.1 x 1.1 km; GAC (Global Area
Coverage) - 4.4 km
o Thermal Bands - 3.55-3.93 μm, 10.3-11.3 μm, 11.5-12.5 μm
 NOAA Geostationary Operational Environmental satellite (GOES)
o 8 km every 30 minutes
o monitors weather
 Moderate Resolution Imaging Spectroradiometer (MODIS)
o 1km spatial resolution
o 17 bands in the mid to thermal infrared
o Daily coverage
 Advanced Spaceborne Thermal Emission and Reflection Radiometer
(ASTER)
o 30-90 m resolution; 6 MIR bands, 5 TIR bands
o surface temperature product
Interpreting TIR Imagery
 Thermal Properties
 Timing of image acquisition is important
o Things warm up and cool down throughout the day;
temperature changes constantly
 Some materials respond to changes in temperature more rapidly
than others
o Water vs. rocks
 Water doesn't heat up or cool down as quickly as rock
 Night time and daytime collection allows for determining different
surface materials
Applications of TIR Image Data
 Marine Science/Climatology
 Sea surface temperature mapping
 Sea surface circulation
 Fisheries
 Fire Mapping
 Smoke plumes consist of ash particles and other combustion
products so fine that they are penetrated by the relatively long TIR
wavelengths
Fractional Vegetation Cover
 Cold background due to permafrost with a warmer vegetation
canopy
 Plant is warmer because it is trying to warm itself
Radar
Radar
 Radar
 Radio Detection and Ranging
o Active microwave
 Operating in radio waves
 Long wavelengths carry little natural energy, so the sensor has
to create its own energy (active system)
 Active Systems
 Create their own electromagnetic energy
o Long wavelength (3-25 cm)
o Transmit from the sensor to the terrain
o Interacts with the terrain producing a backscatter of energy
 Most radar wavelengths pass through cloud; some shorter
wavelengths are affected by cloud and rain, which is
exploited by weather radars
 Migration of birds
o Energy received back at the sensor is recorded by the remote
sensing receiver
 All weather capable
RADAR-Wavelengths
 Pulse of electromagnetic radiation sent out by the transmitter through the
antenna is of a specific wavelength and duration (i.e., it has a pulse
length measured in microseconds, μsec)
 Much longer than visible, near-infrared, mid-infrared, or thermal infrared
energy usually measured in centimeters rather than micrometers
 Radar wavelength names (e.g., K, Ka, Ku, X, C, S, L, and P)
Wavelength Frequency Relationship
 Longer wavelengths can penetrate materials that scatter or attenuate
shorter wavelengths
RADAR Systems
 Doppler Radar
 Weather
 Doppler frequency shifts are a function of the relative velocities of a
sensing system and a reflector
 Plan-Position Indicator Radar (PPI)
 Air traffic control
o WWII airplanes
 Side Looking Airborne Radar (SLAR)
 1950’s
 Fly alongside a country and “see in”
 Synthetic Aperture Radar (SAR)
 Aperture means antenna
 Most common today
 By synthetically creating a long antenna it improves the spatial
resolution
 Uses
 Declassified in the 1970’s
o Still learning about it
 Mapping cloud covered areas- Panama, Amazon
 Ocean-wind, ice, wave Land-minerals, floods, snowmelt
o Water has big impact on radars because it changes wave
properties
o Ice changes density
Side Looking Airborne radar (SLAR)
 Use SLAR as an introduction
 Very similar to Satellite and aircraft SAR systems
 Components of a SLAR system
o Pulse Generator
o Transmitter
o Duplexer
 Sends and receives the energy
o Antenna
o Receiver
o Hard Drive for data storage
Radar Nomenclature
 Nadir
 Azimuth flight direction
 Range or look direction
o Where it is sending its beam
 Range (near and far)
o Point closest and farthest
 Depression angle
 Incidence angle
o Angle at which it hits something
o Depends on terrain and changes the incidence angle
 Altitude above-ground-level
Radar Terms
 Radar- a lot of variables and changes throughout the scene
 Azimuth direction
 Line of flight of the aircraft
 Range direction
 Direction of radar illumination that is at right angles to the direction
of the aircraft/spacecraft
o Significant impact on feature interpretation
 Depression Angle
 The angle between a horizontal plane extending out from the
aircraft fuselage and the electromagnetic pulse of energy from the
antenna to a specific point on the ground
o Near Range depression angle
 Measured to the point closest to the aircraft
o Far Range depression angle
 Furthest distance that the beam is sent away from the
aircraft
 Incident Angle
 The angle between the radar pulse of electromagnetic energy and a
line perpendicular to the Earth’s surface where it makes contact
o Flat Terrain- incident angle=complement of the depression
angle (sum of both angles equals 90 degrees)
Radar Logic
 Sends out a beam of energy
 Short burst lasting microseconds (10^-6 seconds)
 Interacts with object
 Receives energy back
 By electronically measuring the return time of signal echoes, the
range, or distance between the transmitter and the objects may be
determined
o SR = ct/2 (see the sketch below)
 SR - slant range (direct distance between transmitter and
object)
 c = speed of light (3 x 10^8 m/sec)
 t = time between pulse transmission and echo reception
 The factor of 2 accounts for the two-way path (send and receive)
 Time to return is used to construct the image-distance
 Strength of signal received back related to microwave reflectivity
Spatial Resolution
 Ground Resolution cell size
 Control two different resolutions
o Range resolution (across track direction)
 Ground range resolution is finer (better) toward the far range
o Azimuth Resolution (along track direction)
 Range resolution is proportional to the length of the microwave
pulse
o The shorter the pulse length, the finer the range resolution
o Pulse length is a function of the speed of light (c) multiplied
by the duration of the transmission (t).
o Also a function of the depression angle (varies with how far
away from the aircraft the object is)
Range Resolution
 Range resolution (Rr) at any point between the near and far-range of the
illuminated strip can be computed if the depression angle (γ) of the
sensor at that location and the pulse length (τ) are known:
 Rr = (τ x c) / (2 cos γ)
Azimuth Resolution
 Azimuth resolution (Ra) is determined by computing the width of the
terrain strip that is illuminated by the radar beam
 Real aperture active microwave radars produce a lobe-shaped beam
which is narrower in the near-range and spreads out in the far-
range
 The beam width is inversely proportional to antenna length (L).
 Longer the radar antenna, the narrower the beam width and the
higher the azimuth resolution
 The azimuth resolution (Ra) can be calculated using the wavelength,
antenna length, and the slant range distance (see the sketch below):
 Ra = (S x λ) / L
o Where S is the slant range distance (i.e., distance from the
aircraft)
o L is the antenna length
o And λ is the wavelength
Influences on the RADAR return
 Terrain
 Polarization Effects
 Surface Roughness Characteristics
 Diffuse Reflector
 Specular Reflector
 Corner Reflector
 Electrical Characteristics
 Water
 Radar System characteristics
 Features
 Vegetation, soils, water, ice, urban
Terrain Surface Influences on Radar Return
 Geometric Characteristics
 Radar relief displacement is caused by changes in elevation
 Higher an object is the closer it is to the radar antenna
o Energy striking this object is received back at the sensor
sooner
o Leads to distortions in the imagery
 Foreshortening
 Layover
 Shadowing
Foreshortening
 All terrain that has a slope inclined toward the radar will appear
compressed or foreshortened relative to slopes inclined away from the
sensor
 Look shorter than it actually is
 Affected by:
 Object height: greater the height of the object-the greater the
foreshortening
 Depression angle: greater the depression angle the greater the
foreshortening
 Location of objects in the across-track range: features in the near-
range portion of the swath are generally foreshortened more than
identical features in the far-range
 In the near range, features appear to have steeper slopes than
they actually do
 In the far range, slopes appear shallower than they actually are
 Foreshortening both shortens and brightens the affected slope
Layover
 Image layover is an extreme case of image foreshortening. It occurs
when the incident angle is smaller than the foreslope
 Can’t see it
 This distortion cannot be corrected even when the surface topography is
known
Shadowing
 Radar Shadow
 Backslope is a shadow when its slope angle is steeper than the
depression angle
 Grazing illumination: the backslope angle equals the depression angle,
so the backslope is just barely illuminated by the incident energy
 The backslope is fully illuminated when it is less than the depression
angle
 The terrain features (e.g., mountains) with identical heights and fore-and
backslopes may be recorded with entirely different shadows, depending
upon where they are in the across-track.
 Feature that casts an extensive shadow in the far-range might have
its backslope completely illuminated in the near-range
 Radar shadows occur only in the across-track dimension. Therefore, the
orientation of shadows in a radar image provides information about the
look direction and the location of the near-and far-range.
Polarized Energy
 Can send and receive in the same or different polarizations (i.e. like
polarized HH, Cross-polarized HV)
 Objects on the ground modify the polarized energy they reflect
 Usually terrain returns energy in the same polarization
 Vegetation-multiple reflections “volume scattering” of incident energy is
depolarized-comes back varied
Polarization
 Cinder Cone
 Basalt flow
 Imaged at the same time with two different polarizations
 Differences are due to the direct reflection of blocks that are large relative
to the wavelength
 Different objects show up differently
Surface Roughness
 Terrain property that most strongly influences the strength of the radar
backscatter
 Surface texture characteristics
 Microscale roughness is usually measured in centimeters (i.e., the
height of stones, size of leaves, or length of branches in a tree)
o Rough
o Intermediate
o Smooth
 The area with smooth surface roughness sends back very little
backscatter toward the antenna- dark in the image
 Intermediate surface
 0.17 to 0.96 cm
o Grey in the image
 Rough surface
 H>0.96 cm
o Brighter in the image
 Impacts vary with wavelength and depression angle
Reflectors
 Diffuse Reflector
 Rough surfaces
o Scatter incident energy in all directions and return a
significant portion of the energy to the radar antenna
 Specular Reflector
 Smooth surfaces
o Reflect most of the energy away from the sensor, resulting in
a very low return signal
 Corner Reflector
 Very bright response
o Returns energy incident upon it and the surrounding area
o Buildings, bridges, metal objects
o Used for geometric rectification of radar imagery
 Bright and very obvious in the image
Electrical Characteristics
 Different terrain types conduct the electrical energy of the incident
microwave radiation to different degrees
 Complex dielectric constant
 Ability of a material to conduct electrical energy
 Dry surfaces (soil, rock)
o Dielectric constant of 3 to 8 in the microwave portion of the
spectrum
 Water
o Dielectric constant of approximately 80
 Most significant parameter affecting the materials’
dielectric constant is its moisture content
 Moist soils reflect more radar energy than dry soils
 Bare ground soil moisture
 Vegetation influences
 Ocean high dielectric constant- most energy
reflected
 Penetration
 Moist surfaces - penetration of only a few centimeters; energy is mostly reflected
 Dry surfaces- about equal to the wavelength of the radar system
Vegetation Response
 Plant canopies can be thought of as a seasonally dynamic three
dimensional water bearing structure consisting of foliage components
(leaves) and woody components (stems, trunk, stalk, branches)
 Active microwave can penetrate the canopy to varying depths
 Frequency
 Polarization
 Incident angle
 Received signal varies
 How far the signal penetrates the canopy-crown, trunk
 Depolarized?
 Interacts with the soil surface?
o Surface and canopy
 Types of active microwave surface and volume scattering that take place
in a hypothetical pine forest stand
 Surface Scattering
o Energy interacts with the leaves and stems
 Volume Scattering:
o Scattering from leaves, trunks, branches, etc.
o Depolarizes signal
 Ground surface scattering
o Interactions with the ground surface
o Moisture content
Canopy Penetration
 Longer the wavelength- the greater the penetration into a canopy
 Short wavelengths
 Surface scattering
o X band
 Longer wavelengths
 Surface and volume scattering
o C-Band
 Longest Wavelengths
 Surface, volume, and ground
o L-band
o P-band, greatest penetration
Urban Area Response
 Urban areas are typically light toned in active microwave energy because
of their many corner reflectors
 Cardinal effect
o Reflections from urban areas, often laid out according to the
cardinal directions of a compass, cause significantly larger
returns when features are illuminated at an angle
orthogonal to their orientation
Water and Ice Response to Active Microwave Remote Sensing
 Smooth open water areas act as specular reflectors
 Yield little or no return (i.e., the radar signal bounces away from the sensor)
 Rough water surfaces have varying responses depending on wave
height, surface roughness, and sensor wavelength
 Oil increases specular reflection (oil spill detection)
Digital Elevation Models: Interferometric Radar
 Based on analysis of the phase of the radar signals as received by two
antennas located at different positions in space
 Radar signals returning from a point on the earth’s surface will
travel two different slant ranges to the two antennas
 Difference of the lengths leads to a different portion of the phase to
be captured by each antenna
 If the geometry of the interferometric baseline (i.e. the difference
between A1 and A2) is known with a high degree of accuracy, this
phase difference can be used to compute an interferogram
Digital Elevation Models: Interferometric Radar
 Output is a interferogram
 Displays the phase difference values for each pixel as acquired by
the two antennas
 These series of stripes or fringes represent differences in surface
height and sensor position
 Once sensor position is removed, each fringe corresponds to a
particular elevation range
 Using this a DEM is developed
Shuttle Radar Topography Mission (SRTM)
 Joint project-NASA and NIMA-National Imagery and Mapping Agency
(now NGA)
 Single shuttle mission in February 2000-11 days
 Covered 99% of the land area between 60° North and 56° South
 About 80% of the Earth's total land area, home to 95% of the people
 Used two antennas
 60 m apart (on a mast)
 The primary antenna in the shuttle bay transmitted and received; the
secondary antenna only received
 Collected about 12 terabytes of data (~15,000 CD-ROMs)
 Spatial Resolution
 30 m within the US (publicly released)
 90 m outside the US
 Horizontal and Vertical Accuracy
 Horizontal ~20 m, Vertical ~16 m
 Available for free from CGIAR
Final Exam 09/02/2015
 Given a problem, explain how you would solve it: what data to collect
 Land cover
 How you would solve the problem using remote sensing
 What data, how you would process them, analyze them, and what
products you would have at the end
 Primary focus:
 Data acquisition - which data to choose
 What image processing steps, with descriptions of how you would
do them
 Problem solving using remote sensing
 Change detection - resolutions of the sensors, years they were
available, type of change detection you could do, which bands you can
use, what you need to measure, how to classify the image
 Can use image algebra
 What classes you need, what classifications, etc.
 Example:
 In 1985 a bridge was built over a river in Bangladesh; what is the
economic impact of building that bridge?
 Background
 Use Landsat data
 Make sure it is geometrically rectified
 Classify - decide what classes
 Change detection over time
Last third of the class
 Equations to solve
 Sensors, image rectification, change detection, classification