
GPS SURVEYING

• In 1973 the U.S. DOD (Department of Defense)
decided to establish, develop, test, acquire, and
deploy a spaceborne Global Positioning System
(GPS), resulting in NAVSTAR GPS
(NAVigation Satellite Timing And Ranging
Global Positioning System).
• It is an all-weather, space-based navigation
system developed by the U.S. DOD to satisfy
the requirements of the military forces to
accurately determine their position, velocity, and
time in a common reference system, anywhere
on or near the Earth on a continuous basis.
• The Global Positioning System (GPS) is a space-
based satellite navigation system that provides
location and time information in all weather
conditions, anywhere on or near the Earth where
there is a clear line of sight to four or more GPS
satellites.
• GPS was developed by the US Department of Defense
to provide simple, accurate navigation.
• GPS uses satellites and computers to compute positions
anywhere on earth.
• 24 active satellites in space provide the
geographical information.
• A 3-dimensional fix (latitude, longitude and altitude)
requires information from at least 4 satellites.
• Developed by the US DOD
• Provides
– Accurate Navigation
• 10 - 20 m
– Worldwide Coverage
– 24 hour access
– Common Coordinate System
• Designed to replace existing
navigation systems
• Accessible by Civil and Military
• Development cost estimated at ~$12 billion
• Annual operating cost ~$400 million
• 3 Segments:
– Space: Satellites
– User: Receivers
– Control: Monitor & Control stations
• Prime Space Segment contractor: Rockwell International (whose
space division is now part of Boeing)
• Coordinate Reference: WGS-84
• Operated by US Air Force Space Command (AFSC)
– Mission control center operations at Schriever (formerly
Falcon) AFB, Colorado Springs
The three segments:
• Space Segment – the satellite constellation
• User Segment – GPS receivers
• Control Segment – Master Control Station, monitor stations, and
ground antennas (linked via the AFSCN)

[Figure: world map of the Control Segment – Master Control Station (MCS)
at Colorado Springs; Alternate Master Control Station (AMCS) at
Vandenberg AFB; ground antennas (GA) and monitor stations (MS) at Cape
Canaveral, Ascension, Diego Garcia and Kwajalein; USNO (Washington D.C.);
and National Geospatial-Intelligence Agency (NGA) tracking stations at
Fairbanks, England, Hawaii, Ecuador, Argentina, South Africa, Bahrain,
South Korea, Tahiti, and New Zealand.]
• 24 satellites – 4 satellites in each of 6 orbital
planes inclined at 55 degrees
• 12-hour orbits – each satellite in view for 4-5 hours
• 20,200 km above the Earth
• Designed to last 7.5 years
• Different classifications – Block 1, 2, 2A, 2R & 2F

[Figure: the orbital planes are inclined at 55° to the equator. The
original Control Segment comprises the Master Control Station at
Colorado Springs, with monitor stations and ground antennas at Hawaii,
Ascension Islands, Diego Garcia, and Kwajalein.]
• Master Control Station
– Responsible for collecting tracking data from the
monitor stations and calculating satellite orbits and
clock parameters

• 5 Monitor Stations
– Responsible for measuring pseudorange data. This
orbital tracking network is used to determine the
broadcast ephemeris and satellite clock models.

• Ground Control Stations
– Responsible for uploading information to the SVs (space vehicles)
 GPS receivers
 track L1 and/or L2 frequencies
 track the C/A code for at least 4 satellites, and demodulate it
 synchronize time (quartz clocks in the receivers)
 decode satellite data from the code observations (orbit, etc.)
 receive the P(Y) code (US military)
 compute the pseudorange to each satellite
 compute the time offset (receiver clock error)
 compute the position.
• Topo and Locations
• Mapping
• Monitoring
• Volumes
• Photo control
• Construction Control and Stakeout
• Boundaries
• Seismic Stakeout
• Profiles
• Establishing Portable Control Stations (sharing with Total Stations)
• Agriculture - Slope Staking
• Tracking of people, vehicles
• Plate movements
• Sports (boating, hiking,…)
• Archeology
• Public Transport
• Emergency services
Triangulation is used by
surveyors to map objects and
works on the following principle:
suppose you measure the distance
from one satellite and find it
to be 21,000 km.

Given that the satellite has
only a certain range or view
of the earth (rather as we can
see only part of the moon's surface
at any one time), this narrows
our possible location down to a
sphere of radius 21,000 km centred
on the satellite.

We now determine the
distance to a second
satellite and find it to
be 22,600 km. This satellite
also has only a certain
footprint on the earth, and
the intersection of these two
footprints narrows down our
position on the earth.

Taking a measurement
from a third satellite, which
might be 23,400 km away,
narrows our position down
even farther, to the two points
where the 23,400 km sphere
cuts through the circle formed
by the intersection of the first
two spheres. Consequently we
can now determine that we are
at one of the two points where
these spheres intersect.
 Each satellite broadcasts its orbital position in "pseudo
code"

 The receiver on the ground calculates the time the
signal (pseudo code) took to get from the satellite to the
ground, and turns these time units into distance based
on the speed at which light travels ("pseudorange")

 Using information from 3 to 4 satellites allows
triangulation of the GPS receiver's position.
Observation Equation:

c·Δtᵢ = √[(xᵢ − xP)² + (yᵢ − yP)² + (zᵢ − zP)²] − c·ΔtP

Four unknowns – solve for xP, yP, zP, tP
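The four-unknown solution can be sketched numerically. A minimal illustration (not from the source): it assumes the standard pseudorange model ρᵢ = geometric range + c·tP and at least four satellites, and the function and variable names are illustrative.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sats, pseudoranges, iters=20):
    """Solve the four unknowns (xP, yP, zP, tP) from four or more
    pseudoranges by iterating a linearised least-squares (Gauss-Newton) fit."""
    sats = np.asarray(sats, dtype=float)         # satellite positions, (n, 3)
    pr = np.asarray(pseudoranges, dtype=float)   # measured pseudoranges, (n,)
    x = np.zeros(4)  # initial guess: Earth's centre, zero clock error
    for _ in range(iters):
        pos, ctP = x[:3], x[3]
        diffs = sats - pos                       # vectors receiver -> satellite
        ranges = np.linalg.norm(diffs, axis=1)   # geometric distances
        residuals = pr - (ranges + ctP)          # observed minus modelled
        # Jacobian of the modelled pseudorange w.r.t. (xP, yP, zP, c*tP)
        J = np.hstack([-diffs / ranges[:, None], np.ones((len(sats), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x[:3], x[3] / C  # position (m) and receiver clock offset (s)
```

With more than four satellites the same fit is over-determined, which is how real receivers average out measurement noise.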


 Traversing
 Triangulation

Base - Rover Methods


• Atmospheric Refraction
1. Ionospheric Refraction – depends on:
– the Sun's activity,
– ionospheric thickness & the proportion/concentration of ionized
particles,
– season,
– the actual path of the signal (i.e. the relative position of
satellite & receiver).
– Pseudorange errors vary from 0 to 15 m at zenithal incidence to
as much as 45 m for low-incidence signals.

2. Tropospheric Refraction
– The delay depends on temperature, pressure, humidity, and the
elevation of the satellite.
REMOTE SENSING
• Remote Sensing is an interesting and exploratory
science, as it provides images of areas in a fast
and cost-efficient manner, and attempts to
show "what is happening right now" in
a study area.

• Remote: because observation is done at a distance,
without physical contact with the object of
interest

• Sensing: detection of energy, such as light or
another form of electromagnetic energy

• "The science and art of obtaining information
about an object, area, or phenomenon through the
analysis of data acquired by a device that is not in
contact with the object, area, or phenomenon
under investigation" (Lillesand & Kiefer, 1994).

• The term Remote Sensing means the sensing of
the Earth's surface from space by making use of
the properties of electromagnetic waves emitted or
reflected by the sensed objects, for the purpose of
improving natural resources management, land
use and the protection of the environment. (UN,
1999)
• 1609 - Galileo introduced the telescope to astronomy
• 1827 - first photograph
• 1858 - first aerial photograph from a hot air balloon
• 1861-1865 - balloon photography used in the American
Civil War
• 1888 - 'rocket' cameras
• 1903 - pigeon-mounted camera patented
• 1906 - photograph from a kite
• 1908 - first photos from an airplane
• 1909 - Dresden International Photographic Exhibition
• 1914-1945 - plane-mounted cameras in WWI and WWII
[Figures: photographs from pigeon-mounted cameras; the first aerial
(balloon) photo, of Paris, 1858; San Francisco from a kite, 1906]
• 1957 - Sputnik-1
• 1960 - first meteorological satellite, 'TIROS-1',
launched
• 1967 - NASA 'Earth Resource Technology
Satellite' programme
• 1972 - ERTS (Landsat) 1 launched
• 1970-1980 - rapid advances in digital image
processing
• 1986 - SPOT, the French Earth observation satellite
• 1980s - development of hyperspectral sensors
• 1990s - global remote sensing systems
• Systematic data collection
• Information about three dimensions of real
objects
• Repeatability
• Global coverage
• The only solution sometimes for the otherwise
inaccessible areas
• Multipurpose information
• Energy Source or Illumination (A) - the first
requirement for remote sensing is to have an energy
source which illuminates or provides electromagnetic
energy to the target of interest.
• Radiation and the Atmosphere (B) - as the energy
travels from its source to the target, it will come in
contact with and interact with the atmosphere it
passes through. This interaction may take place a
second time as the energy travels from the target to
the sensor.
• Interaction with the Target (C) - once the energy
makes its way to the target through the atmosphere, it
interacts with the target depending on the properties
of both the target and the radiation.
• Recording of Energy by the Sensor (D) - after the
energy has been scattered by, or emitted from the
target, we require a sensor (remote - not in contact
with the target) to collect and record the
electromagnetic radiation.

• Transmission, Reception, and Processing (E) - the
energy recorded by the sensor has to be transmitted,
often in electronic form, to a receiving and processing
station where the data are processed into an image
(hardcopy and/or digital).
• Interpretation and Analysis (F) - the processed
image is interpreted, visually and/or digitally, to
extract information about the target which was
illuminated.

• Application (G) - the final element of the remote
sensing process is achieved when we apply the
information we have been able to extract from the
imagery about the target in order to better
understand it, reveal some new information, or
assist in solving a particular problem.
• Land Use and Land Cover
• Geologic and soil
• Agriculture
• Forestry
• Water/snow Resources
• Urban and Regional Planning
• Wildlife Ecology
• Archaeological
• Environment Assessment
• Natural Disaster
• Ocean and weather Monitoring
Passive sensors -
Passive systems record energy
reflected or emitted by a target
illuminated by the sun.
e.g. normal photography, most optical
satellite sensors

Active sensors -
Active systems illuminate the target with
their own energy and measure the reflection.
e.g. radar sensors

Passive Remote Sensing
Does not employ its own source of energy.
 Measures either reflected radiation from Sun (can be
operated only during daytime) or the emitted radiation from
the surface (day/night operation).
 Suffers from variable illumination conditions of Sun and
influence of atmospheric conditions

Active Remote Sensing
 Has its own source of energy
 Active sensors emit a controlled beam of energy to the
surface and measure the amount of energy reflected back to
the sensor.
 Controlled illumination signal
 Day/night operation
Passive sensors: collect electromagnetic radiation
in the visible and infra-red parts of the spectrum:
• Aerial photographs
• Low resolution: Landsat, SPOT, IRS
• High resolution: QuickBird, IKONOS

Active sensors: generate their own radiation:
• Airborne RADAR
• Spaceborne RADAR: ERS-1/2, Radarsat
• LiDAR (laser scanner)
 Remote Sensing relies on the measurement of
ElectroMagnetic (EM) energy.
 The most important source of EM energy is the
sun. Some sensors detect energy emitted by the
Earth itself or provide their own energy (radar).
 All matter reflects, transmits, absorbs and emits
EMR in a unique way; this is called its spectral
characteristics.
 Two characteristics of electromagnetic radiation
are particularly important for understanding
remote sensing: wavelength and frequency.
The wavelength is the length of one
wave cycle, which can be measured as
the distance between successive wave
crests.

Wavelength is usually represented by
the Greek letter lambda (λ).

Wavelength is measured in metres (m)
or some fraction of a metre such as
nanometres (nm), micrometres (µm) or
centimetres (cm).

Frequency refers to the number of cycles of a wave passing a fixed
point per unit of time. Frequency is normally measured in hertz (Hz),
equivalent to one cycle per second, and various multiples of hertz.
• Frequency, ν = c/λ
where, λ = wavelength
c = speed of light = 3.00 × 10⁸ m/s
Therefore, the two are inversely related to each other: the
shorter the wavelength, the higher the frequency; the longer the
wavelength, the lower the frequency.
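The inverse relation above can be checked numerically; a trivial sketch (function names are illustrative):

```python
C = 3.00e8  # speed of light, m/s

def frequency_hz(wavelength_in_m):
    """Frequency from wavelength: nu = c / lambda."""
    return C / wavelength_in_m

def wavelength_in_m(frequency_in_hz):
    """Wavelength from frequency: lambda = c / nu."""
    return C / frequency_in_hz

# e.g. green light at 0.5 micrometres (0.5e-6 m) has a frequency of
# about 6.0e14 Hz; doubling the wavelength halves the frequency.
```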

Electro-magnetic spectrum (EMS)
• The electro-magnetic spectrum (EMS) is the array
of all EMR, which moves at the velocity of light and
is characterised by its wavelength and frequency.
• The electromagnetic spectrum ranges from the
shorter wavelengths (including gamma and x-
rays) to the longer wavelengths (including
microwaves and broadcast radio waves)
• There are several regions of the
electromagnetic spectrum which are useful for
remote sensing
[Figure: the electromagnetic spectrum, from cosmic rays and gamma rays
through U-V, the visible spectrum (0.4-0.7 µm, with green and red
marked), infrared, micro-waves, TV and radio waves, to electric power;
the 'optical range' spans roughly 0.3 µm to 15 µm.]
Ultraviolet (UV) Range
• This radiation is just beyond the violet portion of the
visible wavelengths.
• Some Earth surface materials, primarily rocks and
minerals, emit visible light when illuminated by UV
radiation.
Visible Range
• The light which our eyes - our "remote sensors" - can
detect is part of the visible spectrum.
• The visible wavelengths cover a range from
approximately 0.4 to 0.7 µm. The longest visible
wavelength is red and the shortest is violet .This is the
only portion of the spectrum we can associate with the
concept of colours.
Violet: 0.4 - 0.446 µm, Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm , Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm , Red: 0.620 - 0.7 µm

► Visible Blue Band (.45-.52 micrometers)
► Visible Green Band (.52-.60 micrometers)
► Visible Red Band (.63-.69 micrometers)
► Panchromatic Band (.50-.90 micrometers)

Panchromatic Band (.50-.90 micrometers)
► Wide range of sensitivity
► Visible to near IR
► Higher spatial resolution
► Can be combined with other multi-spectral bands.
Visible Blue Band (.45-.52 micrometers)
 Greatest water penetration
 Greatest atmospheric scattering
 Greatest absorption
 Used for:
water depth, water characteristics,
detection of subsurface features, soil and
vegetation discrimination

Visible Green Band (.52-.60 micrometers)
► Vegetation discrimination
► Urban infrastructure
► Less affected by atmospheric scattering
► Sediment and chlorophyll concentration
Visible Red Band (.63-.69 micrometers)
Chlorophyll absorption band of healthy green
vegetation.
Vegetation type
Plant condition
Least affected by atmospheric scattering
Less water penetration, but good near-surface
information, i.e. water quality, sediment, and chlorophyll.
 The infrared region spans approximately 0.7 µm to 100 µm - more
than 100 times as wide as the visible portion!
 It is divided into two categories based on their radiation
properties - the reflected IR, and the emitted or thermal IR.
 The reflected IR region is used for remote sensing purposes in
ways very similar to radiation in the visible portion.
 The reflected IR covers wavelengths from approximately
0.7 µm to 3.0 µm.
 The thermal IR region is quite different from the visible and
reflected IR portions, as this energy is essentially the
radiation that is emitted from the Earth's surface in the form
of heat.
 The thermal IR covers wavelengths from approximately 3.0
µm to 100 µm.
• The portion of the spectrum of more recent interest to remote
sensing is the microwave region from about 1 mm to 1 m.
• This covers the longest wavelengths used for remote sensing.
The shorter wavelengths have properties similar to the thermal
infrared region while the longer wavelengths approach the
wavelengths used for radio broadcasts
• Because of their long wavelengths, compared to the visible
and infrared, microwaves have special properties that are
important for remote sensing.
• Longer-wavelength microwave radiation can penetrate through
cloud, fog, haze etc., as the longer wavelengths are not
susceptible to the atmospheric scattering which affects shorter
optical wavelengths.
Interaction with target
• There are three forms of interaction that can
take place when energy strikes a target:
• Absorption (A): radiation is absorbed by the
target
• Transmission (T): radiation passes through the
target
• Reflection (R): radiation "bounces" off the
target and is redirected

• The proportions of each interaction depend on
the wavelength of the energy and the material
and condition of the feature.
[Figure: interactions of solar radiation with the atmosphere and
surface - selective scattering (bluish optical images), non-selective
scattering (white clouds), atmospheric absorption and emission,
transmitted and reflected radiation, and thermal emission from the
Earth, all contributing to the signal reaching the remote sensing
instrument.]
• Scattering occurs when particles or gaseous molecules
present in the atmosphere cause the EM waves to be
redirected from their original path.
• Rayleigh scattering: atmospheric particles smaller
than the wavelength of the incoming radiation
• Mie scattering: atmospheric particles about the same size
as the wavelength of the incoming radiation
• Non-selective scattering: atmospheric particles larger
than the wavelength of the incoming radiation
• Atmospheric windows are those portions of the
electromagnetic spectrum that can be transmitted
through the atmosphere without significant distortion or
absorption: light in these wavelength regions can
penetrate the atmosphere well.
• These areas of the spectrum, which are not
severely influenced by atmospheric absorption
and thus are useful to remote sensors, are called
atmospheric windows.
Sensor - the device that actually gathers the
remotely sensed data
Platform - the device to which the sensor is
attached

The vehicles or carriers for remote sensing are called
platforms. Typical platforms are satellites and aircraft,
but they can also be balloons or kites. Based on altitude
above the earth's surface, platforms may be classified as:
1) Ground-borne
2) Air-borne
3) Space-borne
Airborne and spaceborne platforms have been in use in
remote sensing of earth resources. Ground-based remote
sensing systems for earth resources studies are mainly
used for collecting ground truth or for laboratory
simulation studies.
Orbits
The path followed by a satellite is called its orbit.
Orbits may be polar, inclined, or equatorial.

Rows
The lines joining the corresponding scene centres of
different paths, parallel to the equator, are called
'rows'.
Altitude
• The distance (in km) from the satellite to the mean
surface level of the earth.
Inclination angle
• The angle (in degrees) between the orbit and the
equator.
Period
• The time (in minutes) required to complete one
full orbit. A polar satellite orbiting at an altitude of
800 km has a period of about 90 minutes.
Repeat Cycle
• It is the time (in days) between two successive
identical orbits
Swath
As a satellite revolves around the Earth, the sensor sees a
certain portion of the Earth's surface. This area is known
as the swath.

Perigee & Apogee
• Perigee: the point in the orbit where an earth satellite
is closest to the earth.
• Apogee: the point in the orbit where an earth satellite
is farthest from the earth.
• Near-polar satellites travel northward on one side of the
earth (ascending pass) and towards the South Pole on the
other half of the orbit (descending pass).
• The ascending pass is on the shadowed side while the
descending pass is on the sunlit side.
• Optical sensors image the surface on a descending pass,
while active sensors, and sensors of emitted thermal and
microwave radiation, can also image the surface on an
ascending pass.

[Figure: orbit geometry showing the ascending and descending nodes,
inclination angle, ground track, equator and South Pole.]
These satellites, also known as geosynchronous
satellites, orbit the earth at an altitude of around
36,000 km above the equator and make one revolution
in 24 hours, synchronous with the earth's rotation.
These platforms cover the same place continuously and
give near-hemispheric coverage over the same area day
and night. Their coverage is limited to 70° N to 70° S
latitude. They are mainly used for communication and
meteorological applications, e.g.
GOES (U.S.A.), METEOR (U.S.S.R.), GMS (Japan)
Uses of Geostationary Orbits
Weather satellites : (GOES, METEOSAT, INSAT)
Communication : Telephone and television relay
satellites
Limited spatial coverage
• These are earth satellites in which the orbital
plane is near-polar and whose altitude (mostly 800-900
km) is such that the satellite passes over all
places on earth having the same latitude twice
in each orbit at the same local time. Through
these satellites the entire globe is covered on a
regular basis, giving repetitive coverage on a
periodic basis. All remote sensing resource
satellites may be grouped in this category.

LANDSAT, SPOT, IRS

Resolution is defined as the ability of the system to
render information at the smallest discretely
separable quantity in terms of distance (spatial),
wavelength band of EMR (spectral), time (temporal) and
radiation quantity (radiometric).

Spatial Resolution
Spatial resolution is the projection of a detector
element or a slit onto the ground. In other words, a
scanner's spatial resolution is the ground segment
sensed at any instant. It is also called the ground
resolution element.
Ground Resolution = H × IFOV
The spatial resolution at which data are acquired has two
effects - the ability to identify various features, and the
ability to quantify their extent.
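The ground resolution formula above is a simple small-angle product; a minimal sketch (the 705 km / 0.0425 mrad figures in the comment are approximate values for a Landsat-class sensor, and the function name is illustrative):

```python
def ground_resolution(altitude_m, ifov_rad):
    """Ground resolution element = H * IFOV (small-angle approximation).

    altitude_m : platform height H above the surface, in metres
    ifov_rad   : instantaneous field of view, in radians
    """
    return altitude_m * ifov_rad

# e.g. a sensor at 705 km altitude with an IFOV of 0.0425 mrad
# resolves a ground element of roughly 30 m.
```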
Spectral Resolution describes the ability of the
sensor to define fine wavelength intervals, i.e.
sampling the spatially segmented image in
different spectral intervals, thereby allowing the
spectral irradiance of the image to be determined.
Landsat TM imagery:

band 1 = blue
band 2 = green
band 3 = red
band 4 = near-IR
band 5 = mid-IR
band 6 = thermal-IR

[Figure: single-band (3,3,3) display compared with multi-band 4,5,3
and 4,3,2 composites]

• This is a measure of the sensor's ability to differentiate the
smallest change in spectral reflectance between various
targets. The digitisation is referred to as quantisation and is
expressed as n binary bits. Thus 7-bit digitisation implies
2⁷ or 128 discrete levels (0-127).

[Figure: low radiometric resolution vs high radiometric resolution imagery]
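The level count follows directly from the bit depth; a trivial sketch of the 2ⁿ relation above (function name is illustrative):

```python
def quantisation_levels(bits):
    """Number of discrete DN levels for n-bit quantisation:
    2**n levels, coded 0 .. 2**n - 1."""
    return 2 ** bits

# 7 bits -> 128 levels (DN 0-127); 8 bits -> 256 levels (DN 0-255)
```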
Temporal Resolution
Temporal resolution is also called the repetitivity of the
satellite. It is the capability of the satellite to image the
exact same area, at the same viewing angle, at different
periods of time.
• Remote sensing data (in raw form) as received
from imaging sensors mounted on satellites
contain flaws or deficiencies.
• The correction of deficiencies and removal of
flaws present in the data is termed pre-
processing.
• Image pre-processing can be classified into three
functional categories:

• Radiometric corrections
• Atmospheric corrections
• Geometric correction
Radiometric errors
An error that influences the radiance or radiometric
values of a scene element (pixel), i.e. changes the value
(Digital Number, DN) stored in an image.
System errors - minimized by cosmetic corrections
Atmospheric errors - minimized by atmospheric
corrections

Geometric errors
An error that is related to the spatial location of pixels,
i.e. changes the position of a DN value.
Minimized by geometric correction.
• Image interpretation is defined as the act of examining
images to identify objects and judge their significance.
An interpreter studies remotely sensed data and attempts,
through a logical process, to detect, identify, measure and
evaluate the significance of environmental and cultural
objects, patterns and spatial relationships. It is an
information extraction process.
Methods: on hard copy, or on a digital image

• Why is image interpretation difficult compared to everyday
visual interpretation?
– loss of sense of depth
– different viewing perspective
– different scale
 Visual
1. Visual image interpretation on a hardcopy
image/photograph
2. Visual image interpretation on a digital image
 Digital image processing
 Qualitative
 Quantitative
Shape
 General form/structure of objects: regular/irregular
Numerous components of the environment can be
identified with reasonable certainty merely by their shape.
This is true of both natural features and man-made objects.
Size
 FUNCTION OF SCALE: relative size is important
In many cases, the length, breadth, height, area and/or
volume of an object can be significant, whether these are
surface features (e.g. different tree species). The approximate
size of many objects can be judged by comparison with
familiar features (e.g. roads) in the same scene.
Tone
 RELATIVE BRIGHTNESS OR COLOUR
We have seen how different objects emit or reflect different
wavelengths and intensities of radiant energy. Such differences
may be recorded as variations of picture tone, colour or density,
which enable discrimination of many spatial variables - for
example, different crop types on land, or water bodies of
contrasting depths or temperatures at sea. The terms 'light',
'medium' and 'dark' are used to describe variations in tone.
Pattern
 Spatial arrangement of visibly discernible objects
Repetitive patterns of both natural and cultural features are quite
common, which is fortunate because much image interpretation is
aimed at the mapping and analysis of relatively complex features
rather than the more basic units of which they may be composed. Such
features include agricultural complexes (e.g. farms and orchards) and
terrain features (e.g. alluvial river valleys and coastal plains).
Texture
 Arrangement and frequency of tonal variation (closely associated
with tone): smooth, irregular
 The same tone but different textures are possible!
Texture is an important image characteristic closely
associated with tone, in the sense that it is a quality that
permits two areas of the same overall tone to be
differentiated on the basis of microtonal patterns. Common
image textures include smooth, rippled, mottled, lineated
and irregular. Unfortunately, texture analysis tends to be
rather subjective, since different interpreters may use the
same terms in slightly different ways. Texture is rarely the
only criterion of identification or correlation employed in
interpretation. More often it is invoked as the basis for a
subdivision of categories already established using more
fundamental criteria: for example, two rock units may have
the same tone but different textures.
Association
• Relationship with other recognizable features in proximity to
the target of interest.
• At an advanced stage in image interpretation, the location of an
object with respect to terrain features or other objects may be
helpful in refining the identification and classification of certain
picture contents. For example, some tree species are found more
commonly in one topographic situation than in others, while in
industrial areas the association of several clustered, identifiable
structures may help us determine the precise nature of the local
enterprise. For example, the combination of one or two tall
chimneys, a large central building, conveyors, cooling towers
and solid fuel piles points to the correct identification of a thermal
power station.
Resolution
 Capability to distinguish two closely spaced objects
The resolution of a sensor system may be defined as its capability to
discriminate two closely spaced objects from each other. More than
most other picture characteristics, resolution depends on aspects of the
remote sensing system itself, including its nature, design and
performance, as well as the ambient conditions during the sensing
programme and the subsequent processing of the acquired data. An
interpreter must have knowledge of the resolution of the various
remote sensing data products.
• An IMAGE is a pictorial representation of an object or a
scene. It may be analog or digital.

What is a Digital Image?
• Produced by electro-optical sensors
• Composed of tiny equal areas, or picture elements, abbreviated
as pixels (or pels), arranged in a rectangular array
• With each pixel is associated a number known as the Digital
Number (DN), Brightness Value (BV) or grey level, which is
a record of the variation in radiant energy in discrete form.
• An object reflecting more energy records a higher number for
itself on the digital image, and vice versa.
• Digital images of an area are captured in different spectral ranges
(bands) by sensors onboard a remote sensing satellite.
• A pixel is referred to by its column, row and band number.
• Digital image processing can be defined as the
computer manipulation of the digital values contained in
an image for the purposes of image correction, image
enhancement and feature extraction.

• A digital image processing system consists of the
computer hardware and image processing software
necessary to analyze digital image data.
• DIGITAL IMAGE PROCESSING SYSTEM FUNCTIONS

Preprocessing (Radiometric and Geometric)
• {Compensates for data errors, noise and geometric
distortions introduced in the images during
acquisition and recording}
Image Enhancement
• {Alters the visual impact of the image on the
interpreter to improve the information content}
Information Extraction
• {Utilizes the decision making capability of
computers to recognize and classify pixels on the
basis of their signatures, Hyperspectral image
analysis }
Others
• {Photogrammetric Information Extraction ,
Metadata and Image/Map Lineage
Documentation , Image and Map Cartographic
Composition, Geographic Information Systems
(GIS), Integrated Image Processing and GIS ,
Utilities}
•ERDAS IMAGINE
•Leica Photogrammetry Suite
•ENVI
•IDRISI
•ER Mapper
•PCI Geomatica
•eCognition
•MATLAB
•Intergraph
RECTIFICATION
• is the process of geometrically correcting an image so
that it can be represented on a planar surface,
conform to other images, or conform to a map, i.e. it
is the process by which the geometry of an image is
made planimetric.
• It is necessary when accurate area, distance and
direction measurements are required to be made
from the imagery.
• It is achieved by transforming the data from one grid
system into another grid system using a geometric
transformation.
• In other words, it is the process of establishing a mathematical
relationship between the addresses of pixels in an
image and the corresponding coordinates of those
pixels on another image, map or the ground.
• Two basic operations must be performed to
geometrically rectify a remotely sensed image to a
map coordinate system:
1. Spatial Interpolation: The geometric relationship
between input pixel locations (row & column) and the
associated map coordinates of the same points (x, y)
is identified.
• This establishes the nature of the geometric
coordinate transformation parameters that must be
applied to rectify the original input image (x, y) to its
proper position in the rectified output image (X, Y).
• It involves selecting Ground Control Points (GCPs)
and fitting polynomial equations using the least
squares technique.
2. Intensity Interpolation: The DN value for each output
pixel location is determined by resampling the input
image (e.g. nearest-neighbour).
• GROUND CONTROL POINT (GCP) is a
location on the surface of the Earth (e.g., a
road intersection) that can be identified on the
imagery and located accurately on a map.
• There are two distinct sets of coordinates
associated with each GCP: source or image
coordinates specified in i rows and j columns, and
Reference or map coordinates (e.g., x, y measured
in degrees of latitude and longitude, or meters in a
Universal Transverse Mercator projection).
• The paired coordinates (i, j and x, y) from many
GCPs can be modeled to derive geometric
transformation coefficients.
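The least-squares modelling of the paired GCP coordinates can be sketched as follows. This is an illustration, not the source's procedure: a first-order (affine) polynomial is assumed, higher-order polynomials work the same way with more columns in the design matrix, and the function names are illustrative.

```python
import numpy as np

def fit_affine(image_ij, map_xy):
    """Fit a first-order polynomial  x = a0 + a1*col + a2*row  (and
    likewise for y) mapping image (column, row) coordinates to map
    (x, y) coordinates, by least squares over the paired GCPs."""
    image_ij = np.asarray(image_ij, dtype=float)  # (n, 2): col, row per GCP
    map_xy = np.asarray(map_xy, dtype=float)      # (n, 2): x, y per GCP
    # design matrix: one row [1, col, row] per GCP
    A = np.hstack([np.ones((len(image_ij), 1)), image_ij])
    coeffs, *_ = np.linalg.lstsq(A, map_xy, rcond=None)
    return coeffs  # shape (3, 2): columns hold the x- and y-coefficients

def to_map(coeffs, image_ij):
    """Apply the fitted transformation to image coordinates."""
    image_ij = np.asarray(image_ij, dtype=float)
    A = np.hstack([np.ones((len(image_ij), 1)), image_ij])
    return A @ coeffs
```

An affine fit needs at least 3 GCPs; using more, well dispersed across the image, lets the least-squares step average out the measurement error in individual points.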
• These coefficients may be used to geometrically rectify
the remote sensor data to a standard datum and map
projection
• Accurate GCPs are essential for accurate rectification
• Sufficiently large number of GCPs should be selected
• Well dispersed GCPs result in more reliable rectification
• GCPs for large scale imagery – road intersections,
airport runways, towers, buildings etc.
• For small scale imagery – larger features like urban areas
or geological features can be used.
• NOTE : landmarks that can vary (like lakes, other water
bodies, vegetation etc) should not be used.
• GCPs should be spread across the image
• A minimum number of GCPs is required, depending on the
type of transformation
• Image enhancement is useful since many satellite images give
inadequate information for image interpretation. Contrast
stretching, density slicing, edge enhancement, and spatial
filtering are the more commonly used techniques.
• Image enhancement is attempted after the image is
corrected for geometric and radiometric distortions.
RADIOMETRIC ENHANCEMENT
Modification of brightness values of each pixel in an
image data set independently (Point operations).
SPECTRAL ENHANCEMENT
Enhancing images by transforming the values of each
pixel on a multiband basis
SPATIAL ENHANCEMENT
Modification of pixel values based on the values of
surrounding pixels. (Local operations)
