
Himalayan College, Puhana, Roorkee

Subject Teacher: Dr. Sachin Srivastava


Subject: Soil Taxonomy, Survey and Remote sensing Technology
Lecture No. 26
Date & Time: 21/12/2022, 11:20 AM-12:10 PM
Remote sensing introduction, definition, concept, principles, importance, scope, types,
merits and demerits and its application in agriculture and soil classification.
Remote Sensing is the collection of information about objects without being in physical contact with them. Our eyes and ears are thus remote sensors, and the same is true of cameras, microphones and many other instruments used for all kinds of applications.
Or, said another way:
Remote sensing is the process of acquiring data or information about objects or substances not in direct contact with the sensor, by gathering inputs from the electromagnetic radiation or acoustic waves that emanate from the targets of interest. An aerial photograph is a common example of a remotely sensed product (acquired by camera and film, or now digitally).
In order for an observing sensor to acquire knowledge about a remote object, there must be a flow of information between the object and the observer, and hence a carrier of that information. In our case, the carrier is electromagnetic radiation (EMR).

Figure 1: Electromagnetic radiation


Hence, the main elements in the process of data collection in remote sensing are the object to
be studied, the observer or sensor, the EMR that passes between these two, and the source of
the EMR.
The process of remote sensing involves an interaction between incident radiation and the targets of interest. The figure above shows an imaging system in which seven elements are involved (described under Principles and Process of Remote Sensing below). Note, however, that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors.
Introduction
The sun is a source of energy or radiation, which provides a very convenient source of energy
for remote sensing. The sun's energy is either reflected, as it is for visible wavelengths, or
absorbed and then reemitted, as it is for thermal infrared wavelengths.
There are two main types of remote sensing:
 Passive remote sensing
 Active remote sensing.

1-Passive sensors detect natural radiation that is emitted or reflected by the object or
surrounding area being observed. Reflected sunlight is the most common source of radiation
measured by passive sensors. Examples of passive remote sensors include film photography, infrared sensors and radiometers.
2-Active remote sensing, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor detects and measures the radiation that is reflected or backscattered from the target. RADAR is an example of active remote sensing, where the time delay between emission and return is measured to establish the location, height, speed and direction of an object.
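The time-delay-to-range calculation described above can be sketched in a few lines of Python (a simplified illustration; real radar processing also accounts for pulse shape, noise and antenna geometry):

```python
# Speed of light in a vacuum, metres per second.
C = 299_792_458.0

def echo_range(delay_s: float) -> float:
    """Range to a target from the round-trip delay of a radar pulse.

    The pulse travels out and back, so the one-way distance is half
    of (speed of light x delay).
    """
    return C * delay_s / 2.0

# A pulse returning after 1 millisecond places the target ~150 km away.
print(echo_range(1e-3))  # ~149896 m
```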
Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote
sensing applications include monitoring deforestation in areas such as the Amazon Basin, the
effects of climate change on glaciers and Arctic and Antarctic regions, and depth sounding of
coastal and ocean depths. Military collection during the Cold War made use of stand-off
collection of data about dangerous border areas. Remote sensing also replaces costly and
slow data collection on the ground, ensuring in the process that areas or objects are not
disturbed.
A variety of different parts of the electromagnetic spectrum can be used including:
 Visible wavelengths and reflected infrared (imaging spectrometer): Remote
sensing in the visible and near infrared (VNIR) wavelengths usually falls into the
passive category. Here the sun is the source of the irradiance on the object being
observed. The sensor collects the solar radiation which is reflected by the object.
Active remote sensing occurs at these wavelengths only in the rare case where an
aircraft carries a laser as the source of illumination.
 Blue, green, and red are the primary colours or wavelengths of the visible
spectrum. They are defined as such because no single primary colour can be created
from the other two, but all other colours can be formed by combining blue, green and
red in various proportions.
 Thermal sensors: Remote sensing in the thermal-infrared wavelengths usually falls
into the passive category, but in this case, the source of the radiation is the object
itself. There is no irradiance and the sensor detects radiation which has been emitted
by the object.
 Microwaves (radar): These are used in the active remote sensing systems. The
satellite or aircraft carries an antenna which emits a microwave signal. This signal is
reflected by the ground and the return signal is detected again by the antenna.
The figure below shows the sections of the electromagnetic spectrum most commonly used in remote sensing.

Figure 3: The electromagnetic spectrum


Applications of Remote Sensing: There are probably hundreds of applications - these are
typical:
 Meteorology - Study of atmospheric temperature, pressure, water vapour, and wind
velocity.
 Oceanography: Measuring sea surface temperature, mapping ocean currents, and
wave energy spectra and depth sounding of coastal and ocean depths.
 Glaciology- Measuring ice cap volumes, ice stream velocity, and sea ice distribution.
 Geology- Identification of rock type, mapping faults and structure.
 Geodesy- Measuring the figure of the Earth and its gravity field.
 Topography and cartography - Improving digital elevation models.
 Agriculture and forestry - Monitoring the biomass and health of crops and land vegetation, mapping soil moisture, and forecasting crop yields.
 Hydrology- Assessing water resources from snow, rainfall and underground aquifers.
 Disaster warning and assessment - Monitoring of floods and landslides, monitoring
volcanic activity, assessing damage zones from natural disasters.
 Planning applications - Mapping ecological zones, monitoring deforestation,
monitoring urban land use.
 Oil and mineral exploration- Locating natural oil seeps and slicks, mapping
geological structures, monitoring oil field subsidence.
 Military- developing precise maps for planning, monitoring military infrastructure,
monitoring ship and troop movements.
 Urban planning - Monitoring urban land use and growth.
 Climate - Assessing the effects of climate change on glaciers and the Arctic and Antarctic regions.
 Flood monitoring - Mapping the extent of flooding.
 Space programmes - Remote sensing is a backbone of space-based Earth observation.
 Seismology - Providing ground-motion data that may aid early warning.
Principles and Process of Remote Sensing
Remote sensing is actually done from satellites such as Landsat (the Landsat program is the longest-running enterprise for the acquisition of satellite imagery of Earth; it is a joint NASA/USGS program whose first spacecraft, the Earth Resources Technology Satellite, was launched on July 23, 1972 and eventually renamed Landsat), from aircraft, or on the ground. To repeat the essence of the definition above, remote sensing uses instruments that house sensors to view the spectral, spatial and radiometric relations of observable objects and materials at a distance. Most sensing modes are based on sampling photons at corresponding frequencies in the electromagnetic (EM) spectrum.
In much of remote sensing, the process involves an interaction between incident radiation
and the targets of interest. This is exemplified by the use of imaging systems where the
following seven elements are involved. Note, however, that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors.

i. Energy Source or Illumination (A) - The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.
ii. Radiation and the Atmosphere (B) - As the energy travels from its source to the target, it
will come in contact and interact with the atmosphere it passes through. This interaction may
take place a second time as the energy travels from the target to the sensor.
iii. Interaction with the Target (C) - Once the energy makes its way to the target through
the atmosphere, it interacts with the target depending on the properties of both the target and
the radiation.
iv. Recording of Energy by the Sensor (D) - After the energy has been scattered by, or
emitted from the target, we require a sensor (remote - not in contact with the target) to collect
and record the electromagnetic radiation.
v. Transmission, Reception, and Processing (E) - The energy recorded by the sensor has to
be transmitted, often in electronic form, to a receiving and processing station where the data
are processed.
vi. Interpretation and Analysis (F) - The processed image is interpreted, visually and/or
digitally or electronically, to extract information about the target, which was illuminated.
vii. Application (G) - The final element of the remote sensing process is achieved when we
apply the information we have been able to extract from the imagery about the target in order
to better understand it, reveal some new information, or assist in solving a particular problem.
Types of Remote Sensing System
1- Visual remote sensing system: The human visual system is an example of a remote
sensing system in the general sense.
 The sensors in this example are the two types of photosensitive cells, known as the
cones and the rods, at the retina of the eyes. The cones are responsible for colour
vision. There are three types of cones, each being sensitive to one of the red, green
and blue regions of the visible spectrum. Thus, it is not coincidental that the modern
computer display monitors make use of the same three primary colours to generate a
multitude of colours for displaying colour images. The cones are insensitive under
low light illumination condition, when their jobs are taken over by the rods. The rods
are sensitive only to the total light intensity. Hence, everything appears in shades of
grey when there is insufficient light.

 As the objects/events being observed are located far away from the eyes, the
information needs a carrier to travel from the object to the eyes. In this case, the
information carrier is the visible light, a part of the electromagnetic spectrum. The
objects reflect/scatter the ambient light falling onto them. Part of the scattered light is
intercepted by the eyes, forming an image on the retina after passing through the
optical system of the eyes. The signals generated at the retina are carried via the nerve
fibres to the brain, the central processing unit (CPU) of the visual system. These
signals are processed and interpreted at the brain, with the aid of previous
experiences. The visual system is an example of a "Passive Remote Sensing" system
which depends on an external source of energy to operate. We all know that this
system won't work in darkness.

2- Optical Remote Sensing: In Optical Remote Sensing, optical sensors detect solar
radiation reflected or scattered from the earth, forming images resembling photographs taken
by a camera high up in space.
 The wavelength region usually extends from the visible and near infrared VNIR to the
short-wave infrared SWIR. Different materials such as water, soil, vegetation,
buildings and roads reflect visible and infrared light in different ways.
 They have different colours and brightness when seen under the sun. The
interpretation of optical images requires the knowledge of the spectral reflectance
signatures of the various materials (natural or man-made) covering the surface of the
earth.
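A minimal sketch of signature-based interpretation: compare a pixel's reflectance in a few bands against reference signatures and pick the closest. The reflectance values below are illustrative assumptions, not calibrated measurements:

```python
# Toy spectral-signature matching; reflectances (0..1) in
# (green, red, near-infrared) bands are illustrative only.
SIGNATURES = {
    "water":      (0.05, 0.03, 0.02),
    "soil":       (0.15, 0.20, 0.30),
    "vegetation": (0.12, 0.08, 0.50),
}

def classify(pixel):
    """Return the material whose reference signature is closest
    (squared Euclidean distance) to the observed reflectances."""
    def dist(sig):
        return sum((p - s) ** 2 for p, s in zip(pixel, sig))
    return min(SIGNATURES, key=lambda name: dist(SIGNATURES[name]))

print(classify((0.10, 0.07, 0.45)))  # vegetation
```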
3-Infrared Remote Sensing
 Infrared remote sensing makes use of infrared sensors to detect infrared radiation
emitted from the Earth's surface. The middle-wave infrared (MWIR) and long-wave
infrared (LWIR) are within the thermal infrared region.
 These radiations are emitted from warm objects such as the Earth's surface. They are
used in satellite remote sensing for measurements of the earth's land and sea surface
temperature. Thermal infrared remote sensing is also often used for detection of forest
fires, volcanoes and oil fires.
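The reason thermal sensors work is that the Earth's surface, at roughly 288 K, emits most strongly in the thermal infrared. Wien's displacement law gives the peak emission wavelength; a quick check (standard physical constant, illustrative temperatures):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of peak blackbody emission, in micrometres."""
    return WIEN_B / temp_k * 1e6

print(peak_wavelength_um(288))   # Earth's surface: ~10.1 um (thermal IR)
print(peak_wavelength_um(5800))  # Sun: ~0.50 um (visible light)
```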

4-Microwave Remote Sensing


 There are some remote sensing satellites which carry passive or active microwave
sensors. The active sensors emit pulses of microwave radiation to illuminate the areas
to be imaged. Images of the earth surface are formed by measuring the microwave
energy scattered by the ground or sea back to the sensors. These satellites carry their
own "flashlight" emitting microwaves to illuminate their targets. The images can thus
be acquired day and night.
 Microwaves have an additional advantage as they can penetrate clouds. Images can
be acquired even when there are clouds covering the earth surface. A microwave
imaging system which can produce high resolution image of the Earth is the synthetic
aperture radar (SAR). Electromagnetic radiation in the microwave wavelength
region is used in remote sensing to provide useful information about the Earth's
atmosphere, land and ocean. When microwaves strike a surface, the proportion of
energy scattered back to the sensor depends on many factors:
 Physical factors such as the dielectric constant (a quantity measuring the ability of a
substance to store electrical energy in an electric field) of the surface materials, which
also depends strongly on the moisture content;
 Geometric factors such as surface roughness, slopes, orientation of the objects
relative to the radar beam direction;
 The types of land cover (soil, vegetation or man-made objects).
 Microwave frequency
 Polarisation of the microwave - polarisation refers to the direction in which the electric field of the wave oscillates. If the travelling waves all have their electric field oscillating up and down, they are said to be vertically polarised. A microwave transmitter and receiver may, for example, transmit and detect vertically polarised microwaves.
 Incident angle of the microwaves on the surface - backscatter depends on the incident angle of the radar pulse on the Earth's surface. An incident angle of 23° for the ERS SAR is optimal for detecting ocean waves and other ocean surface features.
5-Radar Remote Sensing

 Using radar, geographers can effectively map out the terrain of a territory. Radar
works by sending out radio signals, and then waiting for them to bounce off the
ground and return.
 By measuring the amount of time it takes for the signals to return, it is possible to
create a very accurate topographic map.
 An important advantage to using radar is that it can penetrate thick clouds and
moisture. This allows scientists to accurately map areas such as rain forests, which
are otherwise too hidden by clouds and rain.
 Imaging radar systems are versatile sources of remotely sensed images, providing
day-night, all-weather imaging capability. Radar images are used to map landforms
and geologic structure, soil types, vegetation and crops, ice and oil slicks on the ocean
surface.
Synthetic Aperture Radar (SAR)
 In synthetic aperture radar (SAR) imaging, microwave pulses are transmitted by an
antenna towards the earth surface. The microwave energy scattered back to the
spacecraft is measured.
 The SAR makes use of the radar principle to form an image by utilising the time delay
of the backscattered signals. In real aperture radar imaging, the ground resolution is
limited by the size of the microwave beam sent out from the antenna.
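The resolution gain from synthetic-aperture processing can be illustrated with the standard textbook approximations: real-aperture azimuth resolution is roughly wavelength x range / antenna length, while the idealised SAR azimuth resolution is half the antenna length, independent of range. The orbit and antenna values below are assumptions for illustration:

```python
def real_aperture_res_m(wavelength_m, range_m, antenna_len_m):
    """Azimuth resolution of a real-aperture radar: the beamwidth
    (wavelength / antenna length) projected to the ground at range R."""
    return wavelength_m * range_m / antenna_len_m

def sar_azimuth_res_m(antenna_len_m):
    """Idealised SAR azimuth resolution: half the antenna length,
    independent of range, thanks to synthetic-aperture processing."""
    return antenna_len_m / 2.0

# C-band (5.6 cm wavelength) from 850 km range with a 10 m antenna:
print(real_aperture_res_m(0.056, 850e3, 10.0))  # 4760.0 m
print(sar_azimuth_res_m(10.0))                  # 5.0 m
```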
6-Satellite Remote Sensing

 Satellite remote sensing uses images acquired by earth observation satellites. These remote sensing satellites are equipped with sensors looking down at the earth.
 They are the "eyes in the sky" constantly observing the earth as they go round in
predictable orbits. Orbital platforms collect and transmit data from different parts of
the electromagnetic spectrum, which in conjunction with larger scale aerial or ground-
based sensing and analysis provides researchers with enough information to monitor
trends.
 Other uses include different areas of the earth sciences such as natural resource
management, agricultural fields such as land usage and conservation, and national
security and overhead, ground-based and stand-off collection on border areas.

Satellites Acquiring Images Mechanism


Satellite sensors record the intensity of electromagnetic radiation (sunlight) reflected from the
earth at different wavelengths. Energy that is not reflected by an object is absorbed. Each
object has its own unique 'spectrum', some of which are shown in the diagram below.

 Remote sensing relies on the fact that particular features of the landscape such as
bush, crop, salt-affected land and water reflect light differently in different
wavelengths.
 Grass looks green, for example, because it reflects green light and absorbs other
visible wavelengths. This can be seen as a peak in the green band in the reflectance
spectrum for green grass above. The spectrum also shows that grass reflects even
more strongly in the infrared part of the spectrum. While this can't be detected by the
human eye, it can be detected by an infrared sensor.
 Instruments mounted on satellites detect and record the energy that has been reflected.
The detectors are sensitive to particular ranges of wavelengths, called 'bands'. The
satellite systems are characterised by the bands at which they measure the reflected
energy.
 The Landsat TM satellite, which provides the data used in this project, has bands at
the blue, green and red wavelengths in the visible part of the spectrum and at three
bands in the near and mid infrared part of the spectrum and one band in the thermal
infrared part of the spectrum. The satellite detectors measure the intensity of the
reflected energy and record it.
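The contrast noted above, strong near-infrared but weak red reflectance of green vegetation, is the basis of the widely used Normalised Difference Vegetation Index (NDVI), computable from two of these bands. A minimal sketch with illustrative reflectance values:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index.

    Healthy vegetation reflects strongly in the near infrared and
    weakly in red, pushing NDVI towards +1; water and bare surfaces
    give values near zero or below.
    """
    return (nir - red) / (nir + red)

print(ndvi(0.50, 0.08))  # green grass: ~0.72
print(ndvi(0.02, 0.03))  # water: -0.2
```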

7- Airborne Remote Sensing


 In airborne remote sensing, downward or sideward looking sensors are mounted on an
aircraft to obtain images of the earth's surface. An advantage of airborne remote
sensing, compared to satellite remote sensing, is the capability of offering very high
spatial resolution images (20 cm or less).
 The disadvantages are low coverage area and high cost per unit area of ground
coverage. It is not cost-effective to map a large area using an airborne remote sensing
system. Airborne remote sensing missions are often carried out as one-time
operations, whereas earth observation satellites offer the possibility of continuous
monitoring of the earth.
8-Acoustic and near-acoustic remote sensing

 Sonar: Sonar (sound navigation ranging) is a technique that uses sound propagation
(usually underwater, as in submarine navigation) to navigate, communicate with or
detect objects on or under the surface of the water, such as other vessels. Sonar, short
for Sound Navigation and Ranging, is helpful for exploring and mapping the ocean
because sound waves travel farther in the water than do radar and light waves. NOAA
(National Oceanic and Atmospheric Administration) scientists primarily use sonar to
develop nautical charts, locate underwater hazards to navigation, search for and map
objects on the seafloor such as shipwrecks, and map the seafloor itself. There are two
types of sonar—active and passive.
Active sonar transducers emit an acoustic signal or pulse of sound into the water. If an object
is in the path of the sound pulse, the sound bounces off the object and returns an “echo” to the
sonar transducer. If the transducer is equipped with the ability to receive signals, it measures
the strength of the signal. By determining the time between the emission of the sound pulse
and its reception, the transducer can determine the range and orientation of the object.
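The active-sonar range calculation mirrors the radar case, with the speed of sound in seawater (about 1500 m/s, varying with temperature, salinity and depth) in place of the speed of light:

```python
SOUND_SPEED_SEAWATER = 1500.0  # m/s; a typical value, not a constant

def sonar_range_m(echo_delay_s: float) -> float:
    """One-way distance to an object from the round-trip echo time."""
    return SOUND_SPEED_SEAWATER * echo_delay_s / 2.0

print(sonar_range_m(2.0))  # echo after 2 s -> object 1500 m away
```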
Passive sonar systems are used primarily to detect noise from marine objects (such as
submarines or ships) and marine animals like whales. Unlike active sonar, passive sonar does
not emit its own signal, which is an advantage for military vessels that do not want to be
found or for scientific missions that concentrate on quietly “listening” to the ocean. Rather, it
only detects sound waves coming towards it. Passive sonar cannot measure the range of an
object unless it is used in conjunction with other passive listening devices. Multiple passive
sonar devices may allow for triangulation of a sound source.
 Seismographs are instruments used to record the motion of the ground during an
earthquake. They are installed in the ground throughout the world and operated as part
of a seismographic network. (The earliest such instruments, seismoscopes, did not record earthquakes; they only indicated that one was occurring.) Seismograms taken at different locations can locate and measure earthquakes after they occur by comparing the relative intensity and precise timing of the recorded signals.
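One classic use of that timing: the lag between P- and S-wave arrivals at a single station gives a rough epicentral distance. The wave speeds below are typical crustal values assumed for illustration; real networks use travel-time tables and many stations:

```python
VP = 6.0  # P-wave speed, km/s (assumed typical crustal value)
VS = 3.5  # S-wave speed, km/s (assumed typical crustal value)

def epicentral_distance_km(ps_lag_s: float) -> float:
    """Distance from the P-minus-S arrival lag: both waves start
    together, so the lag grows linearly with distance travelled."""
    return ps_lag_s / (1.0 / VS - 1.0 / VP)

print(epicentral_distance_km(10.0))  # 10 s lag -> ~84 km away
```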
Advantages of Remote Sensing: The basic advantages of remote sensing are listed below:
 A relatively cheap and rapid method of acquiring up-to-date information over a large
geographical area.
 It is the only practical way to obtain data from inaccessible regions, e.g., Antarctica,
Amazonia.
 At small scales, regional phenomena which are invisible from the ground are clearly
visible (e.g., beyond man's visibility); for example, faults and other geological
structures.
 Cheap and rapid method of constructing base maps in the absence of detailed land
surveys.
 Easy to manipulate with the computer and combine with other geographic coverages in a GIS (Geographic Information System).
Disadvantages of Remote Sensing: The basic disadvantages of remote sensing are given
below:
 Remote sensing does not take direct samples of the phenomenon, so the measurements must be calibrated against ground reality. This calibration is never exact; a classification error of 10% is excellent.
 They must be corrected geometrically and geo-referenced in order to be useful as
maps, not only as pictures.
 Distinct phenomena can be confused if they look the same to the sensor, leading to
classification error — for example, artificial and natural grass in green light.
 Phenomena which were not meant to be measured can interfere with the image and
must be accounted for.
 Resolution of satellite imagery is too coarse for detailed mapping and for
distinguishing small contrasting areas.
Scope
 Multi-spectral and Hyper-spectral remote sensing:
 Multi-spectral imagery is produced by sensors that measure reflected energy
within several specific sections (also called bands) of the electromagnetic
spectrum.
 Multi-spectral sensors usually have between 3 and 10 different band
measurements in each pixel of the images they produce. Examples of bands in
these sensors typically include visible green, visible red, near infrared light
etc. Landsat, QuickBird and SPOT are well-known satellites that carry multi-spectral sensors.
 Hyper-spectral sensors measure energy in narrower and more numerous bands
than multispectral sensors. Hyper-spectral images can contain as many as 200
(or more) adjacent spectral bands.
 The numerous narrow bands of hyper-spectral sensors provide a continuous
spectral measurement across the entire electromagnetic spectrum and therefore
are more sensitive to slight variations in reflected energy.
 Images produced from hyper-spectral sensors contain much more data than
images from multi-spectral sensors and have a greater potential to detect
differences among land and water features. For example, multispectral
imagery can be used to map forested areas, while hyper-spectral imagery can
be used to map tree species within the forest.
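The data-volume consequence of having many narrow bands is easy to quantify; a back-of-envelope sketch (the scene size and bit depth below are assumptions):

```python
def scene_megabytes(width_px, height_px, bands, bytes_per_sample=2):
    """Uncompressed size of one scene in megabytes (decimal MB)."""
    return width_px * height_px * bands * bytes_per_sample / 1e6

# Same 1000 x 1000 pixel scene, 16-bit samples per band:
print(scene_megabytes(1000, 1000, 7))    # multi-spectral, 7 bands: 14.0 MB
print(scene_megabytes(1000, 1000, 200))  # hyper-spectral, 200 bands: 400.0 MB
```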
 Active and passive microwave remote sensing:
 Microwave sensing encompasses both active and passive forms of remote
sensing. The microwave portion of the spectrum covers the range from
approximately 1cm to 1m in wavelength. Because of their long wavelengths,
compared to the visible and infrared, microwaves have special properties that
are important for remote sensing.
 Longer wavelength microwave radiation can penetrate through cloud
cover, haze, dust and all but the heaviest rainfall as the longer wavelengths
are not susceptible to atmospheric scattering which affects shorter optical
wavelengths.

 Passive microwave sensing is similar in concept to thermal remote sensing. All


objects emit microwave energy of some magnitude, but the amounts are generally
very small. A passive microwave sensor detects the naturally emitted microwave
energy within its field of view.
 This emitted energy is related to the temperature and moisture properties of the
emitting object or surface. Passive microwave sensors are typically radiometers or
scanners that operate with an antenna to detect and record the emitted microwave energy.
 Active microwave sensors provide their own source of microwave radiation to
illuminate the target. Active microwave sensors are generally divided into two distinct
categories: imaging and non-imaging. The most common form of imaging active
microwave sensors is RADAR.
 RADAR is an acronym for RAdio Detection And Ranging, which essentially
characterizes the function and operation of a radar sensor. The sensor transmits a
microwave (radio) signal towards the target and detects the backscattered portion of
the signal. The strength of the backscattered signal is measured to discriminate between different targets, and the time delay between the transmitted and reflected
signals determines the distance (or range) to the target.
 Lidar and laser scanning: Lidar, which stands for Light Detection and Ranging, is
a remote sensing method that uses light in the form of a pulsed laser to measure
ranges (variable distances) to the Earth.
These light pulses—combined with other data recorded by the airborne system — generate
precise, three-dimensional information about the shape of the Earth and its surface
characteristics.
A lidar instrument principally consists of a laser, a scanner, and a specialized GPS receiver.
Airplanes and helicopters are the most commonly used platforms for acquiring lidar data over
broad areas. Two types of lidar are topographic and bathymetric. Topographic lidar typically
uses a near-infrared laser to map the land, while bathymetric lidar uses water-penetrating
green light to also measure seafloor and riverbed elevations.
 The acronyms stand for what each system uses: RADAR for RAdio Detection And Ranging and LIDAR for LIght Detection And Ranging. The major differences are the wavelength of the signal and the divergence of the signal beam.
 LIDAR is typically a collimated light beam with minimal divergence over
long distances from the transmitter, whereas RADAR emits cone-shaped signals from the source. Both calculate distance by comparing the time it
takes for the outgoing wave or pulse to return to the source.
 In laser scanners and LIDAR systems, the environment is usually scanned
with a pulsed laser beam and the reflection time of the signal from the object
back to the detector is measured. The Time-of-Flight (TOF) reflection time
measurement can be used over distances ranging from one meter up to several
kilometres. To increase the range of the systems, very short laser pulses in the
invisible NIR range are used. These enable a higher laser power compared to
continuous wave lasers while still complying with eye safety requirements.
 During the scanning process, the systems gather individual distance points
within an aggregate of points, from which three-dimensional images of the
environment can be computed. The laser scanners deflect the laser beam using
deflecting mirrors, which enables them to achieve very wide fields of vision.
Some LIDAR systems also rotate around their own axis and offer 360° all-
round visibility. Modern devices achieve very high data rates with over one
million distance points per second.
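The point-gathering step can be sketched as a polar-to-Cartesian conversion: each pulse returns a range at a known deflection angle, and the scanner accumulates the resulting points. A minimal single-scan illustration (real systems add mirror geometry, pulse timing and motion compensation):

```python
import math

def scan_to_points(ranges_m, angles_deg, sensor_height_m=0.0):
    """Convert one horizontal scan (range + deflection angle per
    pulse) into (x, y, z) points in the scanner's frame."""
    points = []
    for r, a in zip(ranges_m, angles_deg):
        theta = math.radians(a)
        points.append((r * math.cos(theta),
                       r * math.sin(theta),
                       sensor_height_m))
    return points

pts = scan_to_points([10.0, 10.0], [0.0, 90.0])
print(pts)  # first point at (10, 0, 0), second at (~0, 10, 0)
```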
 Geometric reconstruction:
 Remotely sensed images usually contain geometric distortions (Geometric
distortion is an error on an image, between the actual image coordinates and
the ideal image coordinates which would be projected theoretically with an
ideal sensor and under ideal conditions) so significant that they cannot be used
directly with base map products in a geographic information system (GIS).
 Consequently, multi-source data integration (raster and vector) for
cartographic applications, such as forestry, requires geometric and
radiometric processing adapted to the nature and characteristics of the data in
order to keep the best information from each image in the composite ortho-
rectified image.
 Physical modelling and signatures: Signatures from five remote sensing domains—
spectral, spatial, angular, temporal and polarization—provide the basis for the
description and discrimination of Earth surfaces and their variability. These signatures
have been used for a wide range of terrestrial applications.
 Change detection:
 In the context of remote sensing, change detection refers to the process of
identifying differences in the state of land features by observing them at
different times. This process can be accomplished either manually (i.e., by
hand) or with the aid of remote sensing software.
 Manual interpretation of change from satellite images or aerial photos involves
an observer or analyst defining areas of interest and comparing them between
images from two dates. This may be accomplished either on-screen (such as in
a GIS) or on paper. When analyzing aerial photographs, a stereoscope which
allows for two spatially-overlapping photos to be displayed in 3D, can aid
photo interpretation.
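Automated change detection at its simplest is pixel-wise differencing of two co-registered images of the same area from two dates; the values here are illustrative (e.g. a vegetation index per pixel), and the threshold is an assumption:

```python
def changed_pixels(before, after, threshold=0.2):
    """Flag pixels whose value changed by more than the threshold
    between the two acquisition dates."""
    return [abs(b - a) > threshold for b, a in zip(before, after)]

before = [0.70, 0.65, 0.10, 0.68]  # e.g. NDVI values, date 1
after  = [0.71, 0.20, 0.12, 0.66]  # date 2: second pixel cleared
print(changed_pixels(before, after))  # [False, True, False, False]
```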
 Image processing and pattern recognition: Image pattern recognition is the
problem of exploring how to recognize image patterns. An image pattern recognition
system generally consists of four parts: a camera that acquires the image samples to
be classified, an image pre-processor that improves the qualities of images, a feature
extraction mechanism that gains discriminative features from images for recognition
and a classification scheme that classifies the image samples based on the extracted
features.
 Data fusion and data assimilation:
 Data fusion is the process of integrating multiple data sources to produce
more consistent, accurate and useful information than that provided by any
individual data source.
 Data fusion processes are often categorized as low, intermediate or high,
depending on the processing stage at which fusion takes place. Low-level data
fusion combines several sources of raw data to produce new raw data.
 The expectation is that fused data is more informative and synthetic than the
original inputs. Sensor fusion, for example, is also known as (multi-sensor) data fusion and is a subset of information fusion.
 Dedicated satellite missions:
 ESA (European Space Agency) is developing a series of next-generation Earth
observation missions, on behalf of the joint ESA/European Commission
initiative GMES (Global Monitoring for Environment and Security).
 The goal of the Sentinel program is to replace the current older Earth
observation missions which have reached retirement, such as the ERS mission,
or are currently nearing the end of their operational life span. This will ensure
a continuity of data so that there are no gaps in ongoing studies.
 Each mission will focus on a different aspect of Earth observation;
Atmospheric, Oceanic and Land monitoring. The data will be of use in many
applications.
 Operational processing facilities:
 Facilities operations include the management of all the processes, people, tools
and assets that are required for a facility to fully perform the remote sensing process as it is supposed to.
 Facility operations typically include the day-to-day operations of the facility,
as well as getting ready for and executing future maintenance and
improvement needs.
 The operational procedures and requirements of each facility will vary
depending on the industry in question. Facility operations may also be referred
to as "facility management" or "facilities management."
 Space borne, airborne and terrestrial platforms:
 Ground based -- A wide variety of ground based platforms are used in remote
sensing. Some of the more common ones are hand held devices, tripods, towers
and cranes. Instruments that are ground-based are often used to measure the
quantity and quality of light coming from the sun or for close range
characterization of objects. For example, to study properties of a single plant or a
small patch of grass, it would make sense to use a ground based instrument.
 Airborne -- Airborne platforms were the sole non-ground-based platforms for
early remote sensing work. The first aerial images were acquired with a camera
carried aloft by a balloon in 1859. Balloons are rarely used today because they are
not very stable and the course of flight is not always predictable, although small
balloons carrying expendable probes are still used for some meteorological
research. At present, airplanes are the most common airborne platform. Nearly the
whole spectrum of civilian and military aircraft is used for remote sensing
applications. When altitude and stability requirements for a sensor are not too
demanding, simple, low-cost aircraft can be used as platforms. However, as
requirements for greater instrument stability or higher altitudes become necessary,
more sophisticated aircraft must be used.
 Space borne: The most stable platform aloft is a satellite, which is space borne.
The first remote sensing satellite was launched in 1960 for meteorological purposes.
Now, over a hundred remote sensing satellites have been launched and more are
being launched every year. The Space Shuttle is a unique spacecraft that functions
as a remote sensing satellite and can be reused for a number of missions. Satellites
can be classified by their orbital geometry and timing. Three orbits commonly
used for remote sensing satellites are geostationary, equatorial and Sun
synchronous. A geostationary satellite has a period of revolution equal to Earth's
rotation period (about 24 hours), so the satellite always stays over the same location on Earth.
Communications and weather satellites often use geostationary orbits with many
of them located over the equator. In an equatorial orbit, a satellite circles Earth at
a low inclination (the angle between the orbital plane and the equatorial plane).
The Space Shuttle uses an equatorial orbit with an inclination of 57 degrees.
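The geostationary orbit described above follows directly from Kepler's third law: a satellite whose orbital period matches Earth's rotation must sit at one particular radius. The sketch below derives that radius using standard values for Earth's gravitational parameter and the sidereal day.

```python
import math

# Derive the geostationary orbital radius from Kepler's third law:
# r = (mu * T^2 / (4 * pi^2))^(1/3).
MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86164.1              # sidereal day (Earth's rotation period), s
R_EARTH = 6378.137e3     # Earth's equatorial radius, m

radius = (MU * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (radius - R_EARTH) / 1000.0
print(round(altitude_km))  # roughly 35786 km above the equator
```

This is why communications and weather satellites in geostationary orbit all cluster at nearly the same altitude over the equator.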
 Remote sensing applications
Remote Sensing applications in Agriculture
Introduction
 Agricultural resources are among the most important renewable, dynamic natural
resources. Comprehensive, reliable and timely information on agricultural resources is
very much necessary for a country like India whose mainstay of the economy is
agriculture.
 Agriculture surveys are presently conducted throughout the nation in order to gather
information and associated statistics on crops, rangeland, livestock and other related
agricultural resources. This information is of utmost importance for the
implementation of effective management decisions at local, panchayat and district
levels. In fact, agricultural survey is a backbone of planning and allocation of the
limited resources to different sectors of the economy.
 With increasing population pressure throughout the nation and the concomitant need
for increased agricultural production (food and fibre crops as well as livestock) there
is a definite need for improved management of the nation's agricultural resources. In
order to accomplish this, it is first necessary to obtain reliable data on not only the
types, but also the quality, quantity and location of these resources.
Remote sensing and its Importance in Agricultural survey
Remote sensing is nothing but a means to obtain reliable information about an object without
being in physical contact with it. It is based on the observation of an object by a device
separated from it by some distance, utilizing the characteristic response of different objects to
emissions in the electromagnetic spectrum.
Present system of generating agricultural data and its problems: At present, agricultural
data are collected through ground surveys conducted throughout the nation.
 The main responsibility for conducting agricultural surveys lies with the Director of Land
Records, the Director of Agriculture and the District Statistical Office under the
Ministry of Agriculture.
 These data are collected not only at the local level but also, to some extent, at the
district and state levels. Agricultural surveys cover crops (crop production, type of crop and
crop yield), rangeland (condition of range, forest type, water quality, types of
irrigation system and soil characteristics) and livestock (livestock population, sex of
animals, types of farm and distribution of animals).
The basic problems in survey are;
Reliability of data
Cost and benefits
Timeliness
Incomplete sample frame and sample size
Methods of selection
Measurement of area
Non sampling errors
Gap in geographical coverage
Non availability of statistics at disaggregated level.
Remote sensing data may provide a solution to these particular problems of agricultural
surveys.
Advantages of Remote Sensing techniques in Agricultural survey
With the primary aim of improving the present means of generating agricultural data, a
number of specific advantages may result from the use of remote sensing techniques.
1. Vantage point: Because the agricultural landscape depends upon the sun as a source of
energy, it is exposed to the aerial view and, consequently, is ideally suited for remote sensing
techniques.
2. Coverage: With the use of high-altitude sensor platforms, it is now possible to record
extensive areas on a single image. With the advent of high-flying aircraft and satellites, a
single high-quality image can cover thousands of square miles.
3. Permanent record: After an image is obtained, it serves as a permanent record of the
landscape at a point in time, from which agricultural changes can be monitored and evaluated.
4. Mapping Base: Certain types of remote sensing imagery are, in essence, pictorial maps of
the landscape and, after rectification (if needed), allow precise measurements (such as field
acreages) to be made on the imagery, obviating time-consuming on-the-ground surveys.
These images may also aid ground data sampling by serving as a base map for locating
agricultural features while in the field, and as a base for the selection of ground sampling
points or areas.
5. Cost savings: The costs are relatively small when compared with the benefits, which can
be obtained from interpretation of satellite imagery.
6. Real-time capability: Imagery can be obtained and interpreted quickly, which may help to
eliminate the lack of timeliness that hampers so many agricultural surveys.
Other advantages of Remote Sensing
 Easy data acquisition over inaccessible area.
 Data acquisition at different scales and resolutions.
 The images are analyzed in the laboratory, thus reducing the amount of fieldwork.
 Colour composites can be produced from three individual band images, which provide
better details of the area than a single band image or aerial photograph.
 Stereo-satellite data may be used for three-dimensional studies. At present, all the
advantages listed above have been demonstrated either operationally or
experimentally.
Application of Remote sensing techniques for Agricultural survey
The specific applications of remote sensing techniques include:
i) Detection: Remote sensing technology has long been used to detect and map crop
diseases. Airborne and satellite imagery acquired during growing seasons can be used
not only for early detection and within-season management of some crop diseases, but
also for the control of recurring diseases in future seasons. With variable rate
technology in precision agriculture, site-specific fungicide application can be made to
infested areas if the disease is stable, although traditional uniform application is more
appropriate for diseases that can spread rapidly across the field.
ii) Identification of Crops: Government agencies and agricultural managers require
information on the spatial distribution and area of cultivated crops for planning
purposes. Agencies can more adequately plan the import and export of food products
based on such information. Although some ministries of agriculture and food security
annually commission their staff to map different crop types, these ground surveys are
expensive and yet cover only a sample of farms. Remote sensing data, together with
ancillary information, enable the determination of the spatial distribution of crops at
varying spatial scales with relatively little financial resources. This can, however, be
achieved with various degrees of uncertainty, as some crops are spectrally similar and
pixel sizes sometimes bias the estimation of crop acreages. Knowledge of the growth
cycle or calendar of crops is essential for accurate interpretation of RS data. The
cropping calendar covers the period from land preparation, planting, growth,
pollination, senescence to harvesting. The spectral reflectance of crops at each of
these growth stages is called the temporal profile of the crop. This information is
extracted from satellite images based on surveyed locations (boundaries) of dominant
crop types in the area of interest. Dominant crop types are surveyed during field
campaigns that ideally coincide with the timing of satellite image acquisition.
iii) Measurement: The unprecedented availability of high-resolution (spatial, spectral
and temporal) satellite images has promoted the use of remote sensing in many
precision agriculture (PA) applications, including crop monitoring, irrigation
management, nutrient application, disease and pest management, and yield prediction.
iv) Monitoring of agricultural phenomena: For monitoring agricultural processes, a
single tabular dataset or map is not sufficient. Remote sensing can provide
combined information drawn from existing maps and tabular data.
 Remote sensing techniques using various platforms have proven their utility in
agricultural surveys.
 Satellite data provide a synoptic view of a large area at a time, which is not
possible with conventional survey methods.
 The process of data acquisition and analysis is very fast through Geographic
Information System (GIS) as compared to conventional methods.
 Remote sensing techniques have a unique capability of recording data in the visible
as well as invisible (ultraviolet, reflected infrared, thermal infrared and microwave)
parts of the electromagnetic spectrum.
 Therefore certain phenomena which cannot be seen by the human eye can be observed
through remote sensing techniques; for example, trees affected by disease or insect
attack can be detected by remote sensing much before the human eye can see them.
Area of specific applications
a) Applicable to crop survey
1. Crop identification: Advances in satellite imagery now allow economies of scale in the
calculation of paddock position, area measurement and mapping, when compared with
aircraft-based photography.
 Crop recognition, growth and health monitoring are now also considered achievable
goals using satellite imagery.
2. Crop acreage: Crop acreage and yield prediction can be done either by using the RS data
alone or by incorporating RS data in crop growth and development simulation models.
 The NDVI (Normalized Difference Vegetation Index) is a strong indicator of the total
green biomass at any time.
As shown below, the Normalized Difference Vegetation Index (NDVI) uses the NIR and red
channels in its formula:
NDVI = (NIR − Red) / (NIR + Red)
Healthy vegetation (chlorophyll) reflects more near-infrared (NIR) and green light compared
to other wavelengths. But it absorbs more red and blue light.
This is why our eyes see vegetation as the color green. If you could see near-infrared, then it
would be strong for vegetation too. Satellite sensors like Landsat and Sentinel-2 both have
the necessary bands with NIR and red.
The result of this formula generates a value between -1 and +1. If you have low reflectance
(or low values) in the red channel and high reflectance in the NIR channel, this will yield a
high NDVI value and vice versa.
Overall, NDVI is a standardized way to measure healthy vegetation. When you have high
NDVI values, you have healthier vegetation.
When you have low NDVI, you have less or no vegetation. Generally, if you want to see
vegetation change over time, then you will have to perform an atmospheric correction.
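The NDVI calculation described above is straightforward to apply per pixel. The sketch below uses made-up reflectance values; in practice the NIR and red arrays would come from image bands such as those of Landsat or Sentinel-2.

```python
import numpy as np

# Minimal per-pixel NDVI sketch: NDVI = (NIR - Red) / (NIR + Red).
# Reflectance values here are illustrative, not from a real scene.
nir = np.array([[0.50, 0.45],
                [0.30, 0.05]])
red = np.array([[0.08, 0.10],
                [0.20, 0.04]])

ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))
# Dense green vegetation (high NIR, low red) gives values approaching +1;
# bare soil or water gives values near 0 or below.
```

The top-left pixel (strong NIR, weak red) yields the highest NDVI, matching the rule of thumb above: high NDVI means healthier, denser vegetation.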
In India, with the launch of IRS-1A in 1988, a national-level program called Crop
Acreage and Production Estimation (CAPE) was initiated, under which area and
production estimation of major crops at district and state level was carried out.
 Based on experience gained under the CAPE project, the Ministry of Agriculture
launched the “FASAL” program in 2007. The use of satellite RS and GIS data has
proved to be more cost-effective, reliable, timely, and faster than the conventional
ground-based surveys of agricultural resources.
 Satellite-based crop yield estimation still needs more work. There are some issues
with the assessment of crops, including the non-availability of data for Kharif season
and limited accuracy in yield estimates, especially for horticultural crops.
 Crop simulation models do not represent spatial features well because of the large spatial
variability in soil, rainfall and management conditions. With the availability of very
high-resolution satellite data with high temporal frequency, newer techniques such as
Machine Learning, AI, and Big data Analytics and higher availability of UAV based
observation, it is possible that in near future satellite remote sensing will play the
major role in crop production estimation.
3. Crop vigour: The fundamental aim of applying remote sensing in cultivation is to
determine vegetation characteristics such as a crop vigour index.
4. Crop density: Plant stand density, or plant population, is an important crop growth
parameter.
 Aerial hyper-spectral remote sensing imagery can be used to estimate crop density
during the growing season.
 The imagery had a spatial resolution of 1 m and a spectral resolution of 3 nm between
498 nm and 855 nm.
 A machine vision system for early-season measurement of plant stand density was
also used to map every row of plants within the different field areas, and a complete
inventory of plants was generated as a rich ground-reference dataset.
5. Crop maturity
6. Growth rates
7. Yield forecasting
8. Actual yield
9. Soil fertility
10. Effects of fertilizers
11. Soil toxicity
12. Soil moisture
13. Water quality
14. Irrigation requirement
15. Insect infestations
16. Disease infestations
17. Water availability
18. Location of canals
b) Applicable to range survey
1. Delineation of forest types
2. Condition of range
3. Carrying capacity
4. Forage
5. Time of seasonal change
6. Location of water
7. Water quality
8. Soil fertility
9. Soil moisture
10. Insect infestations
11. Wildlife inventory
c) Applicable to livestock survey
1. Cattle population
2. Sheep population
3. Pig population
4. Poultry Population
5. Age sex distribution
6. Distribution of animals
7. Animal behaviour
8. Disease identification
9. Types of farm buildings
Applications of LIDAR
1-Agriculture
 Agricultural Research Service scientists have developed a way to incorporate LIDAR
with yield rates on agricultural fields. This technology will help farmers direct their
resources toward the high-yield sections of their land.
 LIDAR also can be used to help farmers determine which areas of their fields to apply
costly fertilizer to. LIDAR can create a topographic map of the fields and reveal the
slopes and sun exposure of the farmland.
 Researchers at the Agricultural Research Service blended this topographic information
with the farmland's yield results from previous years. This technology is valuable to
farmers because it indicates which areas to apply the expensive fertilizers to achieve
the highest crop yield.
3-Biology and conservation
 LIDAR has also found many applications in forestry. Canopy heights, biomass
measurements, and leaf area can all be studied using airborne LIDAR systems.
 Similarly, LIDAR is also used by many industries, including the energy and railroad
industries, and by departments of transportation as a faster way of surveying.
 Topographic maps can also be generated readily from LIDAR, including for
recreational use such as in the production of orienteering maps.
 In oceanography, LIDAR is used for estimation of phytoplankton fluorescence and
generally biomass in the surface layers of the ocean.
 Another application is airborne lidar bathymetry of sea areas too shallow for
hydrographic vessels.
4-Geology and soil science
 High-resolution digital elevation maps generated by airborne and stationary LIDAR
have led to significant advances in geomorphology, the branch of geoscience
concerned with the origin and evolution of Earth's surface topography.
 LIDAR's abilities to detect subtle topographic features such as river terraces and river
channel banks, measure the land surface elevation beneath the vegetation canopy,
better resolve spatial derivatives of elevation, and detect elevation changes between
repeat surveys have enabled many novel studies of the physical and chemical
processes that shape landscapes.
 Aircraft-based LIDAR and GPS have evolved into an important tool for detecting faults and
measuring uplift. The output of the two technologies can produce extremely accurate
elevation models for terrain that can even measure ground elevation through trees.
 5-Hydrology: LIDAR provides a wealth of information to the aquatic sciences.
 High-resolution Digital elevation maps generated by airborne and stationary LIDAR
have led to significant advances in the field of Hydrology.
6-Meteorology and atmospheric environment
 The first LIDAR systems were used for studies of atmospheric composition,
structure, clouds, and aerosols. Initially based on ruby lasers, LIDAR for
meteorological applications was constructed shortly after the invention of the laser
and represents one of the first applications of laser technology.
 Elastic backscatter LIDAR is the simplest type of lidar and is typically used for
studies of aerosols and clouds. The backscattered wavelength is identical to the
transmitted wavelength, and the magnitude of the received signal at a given range
depends on the backscatter coefficient of scatterers at that range and the extinction
coefficients of the scatterers along the path to that range. The extinction coefficient is
typically the quantity of interest.
 Differential Absorption LIDAR (DIAL) is used for range-resolved measurements of a
particular gas in the atmosphere, such as ozone, carbon dioxide, or water vapour. The
LIDAR transmits two wavelengths: an "on-line" wavelength that is absorbed by the
gas of interest and an off-line wavelength that is not absorbed. The differential
absorption between the two wavelengths is a measure of the concentration of the gas
as a function of range. DIAL LIDARs are essentially dual-wavelength backscatter
LIDARS.
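The DIAL retrieval described above can be written down directly: the gas number density over a range cell follows from the ratio of on-line and off-line returns at the two ends of the cell. The sketch below uses the standard two-wavelength, two-range form of the DIAL equation; all instrument numbers are illustrative.

```python
import math

# DIAL sketch: gas number density over a range cell [R1, R2] from the
# on-line and off-line backscatter returns at R1 and R2.
# n = ln[(P_off(R2) * P_on(R1)) / (P_on(R2) * P_off(R1))] / (2 * dsigma * dR)
def dial_density(p_on_r1, p_on_r2, p_off_r1, p_off_r2, delta_sigma, delta_r):
    """delta_sigma: differential absorption cross-section (m^2);
    delta_r: range-cell length (m). Returns number density (m^-3)."""
    ratio = (p_off_r2 * p_on_r1) / (p_on_r2 * p_off_r1)
    return math.log(ratio) / (2.0 * delta_sigma * delta_r)

# Illustrative returns: the on-line wavelength is attenuated more strongly
# across the cell than the off-line wavelength, so density is positive.
n = dial_density(1.00, 0.60, 1.00, 0.80, delta_sigma=5e-27, delta_r=100.0)
print(f"{n:.3e} molecules per m^3")
```

If both wavelengths were attenuated equally, the log ratio would be zero and the retrieved density would be zero, which is the sense in which the differential absorption alone carries the gas signal.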
 Raman LIDAR is also used for measuring the concentration of atmospheric gases, but
can also be used to retrieve aerosol parameters as well. Raman LIDAR exploits
inelastic scattering to single out the gas of interest from all other atmospheric
constituents. A small portion of the energy of the transmitted light is deposited in the
gas during the scattering process, which shifts the scattered light to a longer
wavelength by an amount that is unique to the species of interest.
 Doppler LIDAR is used to measure wind speed along the beam by measuring the
frequency shift of the backscattered light. Scanning LIDARs have been used to
measure atmospheric wind velocity in a large three-dimensional cone.
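The Doppler relation is simple enough to show directly: for backscattered light the radial wind speed is v = λ·Δf / 2, since the shift is accumulated on both the outgoing and return paths. The values below are illustrative.

```python
# Doppler lidar sketch: radial wind speed from the measured frequency
# shift of the backscattered light, v = (lambda * delta_f) / 2.
wavelength = 1.55e-6      # m, a common fibre-laser lidar wavelength
delta_f = 12.9e6          # Hz, illustrative measured Doppler shift

v_radial = wavelength * delta_f / 2.0
print(round(v_radial, 2), "m/s along the beam")
```

A scanning system repeats this measurement along many beam directions and reconstructs the three-dimensional wind vector from the set of radial components.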
Remote Sensing of Soil: Over the past few decades, the Earth’s surface has witnessed major
changes in land use. These changes are likely to continue, driven by demographic pressure or
by climate change.
 In this context, monitoring tools are needed for maintaining a sustainable ecological
status, improving soil conservation and water resource management.
 Floods, excess runoff, soil erosion, and related contamination and disequilibrium of
the water and carbon cycles are, among others, key issues that are controlled and
influenced by soil surface characteristics.
 The implementation of sustainable agricultural, hydrological, and environmental
management requires an improved understanding of the soil, at increasingly finer
scales.
 Conventional soil sampling and laboratory analyses cannot efficiently provide this
information, because they are slow and expensive and cannot capture all the temporal
and spatial variability.
 Remote sensing has shown high potential for retrieving soil characteristics over the
last three decades. Different methodologies have been proposed for the estimation of soil
parameters, based on different remote sensing sensors and techniques (passive and
active).
 For passive remote sensing, we can consider four principal types of sensors:
(i) Optical remote sensing with a limited number of bands (e.g., SPOT, ASTER,
LANDSAT), particularly adapted for vegetation cover description and land use
analysis.
(ii) Optical remote sensing based on hyper-spectral sensors, particularly adapted for
soil texture description.
(iii) Optical remote sensing with thermal infrared band, adapted for soil temperature
estimation.
(iv) Passive microwave remote sensing adapted to soil moisture and vegetation
estimation.
For active remote sensing, different studies have shown a considerable potential for the
characterization of different soil parameters: moisture, roughness and texture.
 Active remote sensing is particularly based on two types of sensors: synthetic aperture
radar (SAR), with high spatial resolution adapted to local and regional studies, and
scatterometer sensors, more adapted to global estimation of soil parameters.
 Three types of methodologies are generally used for soil parameters estimation:
empirical models based only on satellite and ground databases, semi-empirical models
based on a mixture between physical modelling and real data, and finally physical
models based only on the description of radiative transfer physics to analyze
relationship between remote sensing signals and soil parameters.
 These remote sensing studies concern particularly four soil parameters (moisture,
roughness, temperature, and texture).
(i) Soil moisture is a key parameter, influencing the manner in which rainwater is
shared between the phenomena of evapo-transpiration, infiltration, and runoff.
(ii) Soil surface roughness is involved in the separation of water flow into
infiltration and runoff. Moreover, monitoring the evolution of surface
roughness is a way to estimate erosion risk particularly in agricultural areas.
(iii) Soil texture is one of the most important soil properties influencing most
physical, chemical, and biological soil processes. Hence, it is a key property
for soil management.
(iv) Soil temperature is a key parameter in the description of evapo-transpiration
and surface-atmosphere interface processes.
Based on this high potential of remote sensing to retrieve surface parameters, a large number
of sensors have been launched in recent years to improve the methodologies proposed for
retrieving surface parameters operationally.
 A technique has been proposed for estimating soil moisture based on the support
vector regression algorithm and the integration of ancillary data, using active remote
sensing (RADARSAT-2 SAR data).
 The impact of soil moisture on gross primary production (GPP), chlorophyll content,
and canopy water content has been studied through remotely sensed vegetation indices
(VIs) in an open grassland and an oak savanna in California.
 Using SAR (RADARSAT-1) remote sensing, a classification of frozen/unfrozen soils
can be provided; the method was developed to produce frozen-soil maps under snow
cover.
 A method combining Landsat ETM+ imagery and high-resolution (5 m) terrain
(IFSAR) data has been used to map soils and to characterize the soil classes mapped
with this semi-automated technique. The method distinguished spectrally distinct soil
classes that differed in subsurface rather than surface properties.
 AVIRIS (the Airborne Visible/Infrared Imaging Spectrometer) has been used for
mapping and quantifying mineralogical components of soils; it proved possible to map
and quantify the weathering degree of the studied soils.
 Airborne hyper-spectral imagery has been used to map properties of tilled agricultural
fields. Soil hyper-spectral reflectance imagery was obtained using an airborne imaging
spectrometer (400–2450 nm, ~10 nm spectral resolution, 2.5 m spatial resolution). The
resulting raster maps showed variation associated with topographic factors, indicating
the effect of soil redistribution and moisture regime on in-field spatial variability.
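The "empirical model" category described earlier (models based only on satellite and ground databases) can be illustrated with a minimal sketch: fit a linear relation between radar backscatter and field-sampled soil moisture, then invert it for new pixels. All numbers below are synthetic, not from any real campaign.

```python
import numpy as np

# Simplest empirical soil-moisture model: linear regression of volumetric
# moisture against SAR backscatter (sigma0, in dB) at calibration sites.
backscatter_db = np.array([-14.0, -12.5, -11.0, -9.5, -8.0])   # SAR sigma0 (dB)
moisture_vol = np.array([0.10, 0.15, 0.20, 0.25, 0.30])        # m^3/m^3, field samples

slope, intercept = np.polyfit(backscatter_db, moisture_vol, 1)

def estimate_moisture(sigma0_db):
    """Invert the fitted relation for a new backscatter observation."""
    return slope * sigma0_db + intercept

print(round(estimate_moisture(-10.0), 3))  # predicted moisture for a new pixel
```

Semi-empirical and physical models replace this fitted line with radiative-transfer relations, but the workflow (calibrate against ground data, then invert satellite signals) is the same.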
Remote Sensing Components
1. A Geographic Information System (GIS) integrates hardware, software, and data
for capturing, managing, analyzing, and displaying all forms of geographically
referenced information. GIS also allows the integration of these data sets for deriving
meaningful information and outputting the information derivatives in map format or
tabular format.
OR
Geographic Information Systems (GIS) store, analyze, and visualize data for
geographic positions on Earth's surface. GIS is a computer-based tool that examines
spatial relationships, patterns, and trends. By connecting geography with data, GIS
helps users understand data in a geographic context.
The 4 main ideas of Geographic Information Systems (GIS) are:
 Create geographic data.
 Manage it in a database.
 Analyze and find patterns.
 Visualize it on a map.
Three Views of a GIS
A GIS can be viewed in three ways:
1) The Database View: A GIS is a unique kind of database of the world—a geographic
database (geo database). It is an "Information System for Geography." Fundamentally, a GIS
is based on a structured database that describes the world in geographic terms.
2) The Map View: A GIS is a set of intelligent maps and other views that show features and
feature relationships on the earth's surface. Maps of the underlying geographic information
can be constructed and used as "windows into the database" to support queries, analysis, and
editing of the information.
3) The Model View: A GIS is a set of information transformation tools that derive new
geographic datasets from existing datasets. These geo-processing functions take information
from existing datasets, apply analytic functions, and write results into new derived datasets.
By combining data and applying some analytic rules, we can create a model that helps answer
the question you have posed.
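The database and model views above can be illustrated with a toy example: features stored with coordinates and attributes (the database view), and a geoprocessing function that derives a new dataset from an existing one (the model view). All names and coordinates below are hypothetical.

```python
# Database view: features stored with geographic coordinates and attributes.
wells = [
    {"name": "W1", "lon": 77.90, "lat": 29.85, "depth_m": 40},
    {"name": "W2", "lon": 77.95, "lat": 29.90, "depth_m": 55},
    {"name": "W3", "lon": 78.20, "lat": 30.10, "depth_m": 30},
]

def within_bbox(features, lon_min, lat_min, lon_max, lat_max):
    """Model view: a geoprocessing step that derives a new dataset
    from an existing one via a spatial (bounding-box) query."""
    return [f for f in features
            if lon_min <= f["lon"] <= lon_max and lat_min <= f["lat"] <= lat_max]

study_area = within_bbox(wells, 77.80, 29.80, 78.00, 30.00)
print([f["name"] for f in study_area])  # -> ['W1', 'W2']
```

The map view would then render `study_area` as symbols over a basemap; real GIS software performs the same select-derive-display cycle with indexed spatial databases rather than plain lists.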
2. Global Positioning System GPS: The Global Positioning System (GPS) is a space-based
global navigation satellite system (GNSS) that provides reliable location and time
information in all weather, at all times, anywhere on or near the Earth where
there is an unobstructed line of sight to four or more GPS satellites. It is maintained by the
United States government and is freely accessible by anyone with a GPS receiver.
GPS was created and realized by the U.S. Department of Defence (USDOD) and was
originally run with 24 satellites. It was established in 1973 to overcome the limitations of
previous navigation systems.
Basic concept of GPS
 A GPS receiver calculates its position by precisely timing the signals sent by GPS
satellites high above the Earth. Each satellite continually transmits messages that
include:
 the time the message was transmitted,
 precise orbital information (the ephemeris), and
 the general system health and rough orbits of all GPS satellites (the almanac).
 The receiver uses the messages it receives to determine the transit time of each
message and computes the distance to each satellite. These distances along with the
satellites' locations are used with the possible aid of trilateration, depending on which
algorithm is used, to compute the position of the receiver. This position is then
displayed, perhaps with a moving map display or latitude and longitude; elevation
information may be included.
 Many GPS units show derived information such as direction and speed, calculated
from position changes. Three satellites might seem enough to solve for position, since
space has three dimensions and a position near the Earth's surface can be assumed.
However, even a very small clock error, multiplied by the very large speed of light
(the speed at which satellite signals propagate), results in a large positional error.
 Therefore receivers use four or more satellites to solve for the receiver's location and
time. The very accurately computed time is effectively hidden by most GPS
applications, which use only the location.
 A few specialized GPS applications do however use the time; these include time
transfer, traffic signal timing, and synchronization of cell phone base stations.
Although four satellites are required for normal operation, fewer may suffice in special
cases. If one variable is already known, a receiver can determine its position using
only three satellites. For example, a ship or aircraft may have a known elevation.
 Some GPS receivers may use additional clues or assumptions (such as reusing the last
known altitude, dead reckoning, inertial navigation, or including information from the
vehicle computer) to give a less accurate (degraded) position when fewer than four
satellites are visible.
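The clock-error argument above is easy to quantify: even a one-microsecond receiver clock bias, multiplied by the speed of light, shifts every measured pseudorange by roughly 300 m, which is why the receiver solves for time as a fourth unknown using a fourth satellite.

```python
# Why GPS needs a fourth satellite: a tiny receiver-clock error, scaled by
# the speed of light, becomes a large error in every measured range.
C = 299_792_458.0          # speed of light, m/s
clock_error_s = 1e-6       # 1 microsecond of receiver clock bias

range_error_m = C * clock_error_s
print(round(range_error_m, 1), "m of pseudorange error")  # ~299.8 m
```

Since satellite clocks are atomic but receiver clocks are cheap quartz, this bias is common to all pseudoranges, and adding one more satellite equation lets the receiver estimate and remove it.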
AERIAL PHOTOGRAPHS
An aerial photograph is any photograph taken from an airborne vehicle (aircraft, drones,
balloons, satellites, and so forth). The aerial photograph has many uses in military
operations; however, for the purpose of this manual, it will be considered primarily as a map
supplement or map substitute.
COMPARISON WITH MAPS: A topographic map may be obsolete because it was
compiled many years ago. A recent aerial photograph shows any changes that have taken
place since the map was made. For this reason, maps and aerial photographs complement
each other. More information can be gained by using the two together than by using either
alone.
a. Advantages. An aerial photograph has the following advantages over a map:
(1) It provides a current pictorial view of the ground that no map can equal.
(2) It is more readily obtained. The photograph may be in the hands of the
user within a few hours after it is taken; a map may take months to prepare.
(3) It may be made for places that are inaccessible to ground soldiers.
(4) It shows military features that do not appear on maps.
(5) It can provide a day-to-day comparison of selected areas, permitting
evaluations to be made of enemy activity.
(6) It provides a permanent and objective record of the day-to-day changes
within the area.
b. Disadvantages. The aerial photograph has the following disadvantages as
compared to a map:
(1) Ground features are difficult to identify or interpret without symbols and
are often obscured by other ground detail as, for example, buildings in wooded
areas.
(2) Position location and scale are only approximate.
(3) Detailed variations in the terrain features are not readily apparent without
overlapping photography and a stereoscopic viewing instrument.
(4) Because of a lack of contrasting colors and tone, a photograph is difficult
to use in poor light.
(5) It lacks marginal data.
(6) It requires more training to interpret than a map.
TYPES OF AERIAL PHOTOGRAPHS: Aerial photography most commonly used by
military personnel may be divided into two major types, the vertical and the oblique. Each
type depends upon the attitude of the camera with respect to the earth's surface when the
photograph is taken.
a. Vertical. A vertical photograph is taken with the camera pointed as straight
down as possible (Figures 8-1 and 8-2). Allowable tolerance is usually ±3° from
the perpendicular (plumb) line to the camera axis. A vertical photograph has the
following characteristics:
Figure 8-1. Relationship of the vertical aerial photograph with the ground.
Figure 8-2. Vertical photograph.
General Characteristics of Vertical Aerial Photograph
(1) The lens axis is perpendicular to the surface of the earth.
(2) It covers a relatively small area.
(3) The shape of the ground area covered on a single vertical photo closely
approximates a square or rectangle.
(4) Being a view from above, it gives an unfamiliar view of the ground.
(5) Distance and directions may approach the accuracy of maps if taken over
flat terrain.
(6) Relief is not readily apparent.
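Characteristic (5) rests on the standard photogrammetric scale relation for a vertical photograph over flat terrain: scale = f / H, where f is the lens focal length and H is the flying height above the ground. A minimal sketch, with illustrative example values:

```python
def photo_scale(focal_length_m, flying_height_m):
    """Scale denominator of a vertical photo over flat terrain.

    Scale = f / H, so the scale denominator is H / f.
    """
    return flying_height_m / focal_length_m

# A 152 mm mapping-camera lens flown 7,600 m above flat ground:
print(f"Scale ~ 1:{photo_scale(0.152, 7600):,.0f}")  # prints "Scale ~ 1:50,000"
```

Over hilly terrain this relation breaks down, because H varies from point to point; that is why the text restricts the map-like accuracy of vertical photographs to flat terrain.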
b. Low Oblique. This is a photograph taken with the camera inclined about 30°
from the vertical (Figure 8-3, and Figure 8-4). It is used to study an area before an
attack, to substitute for a reconnaissance, to substitute for a map, or to supplement a
map. A low oblique has the following characteristics:

Figure 8-3. Relationship of low oblique photograph to the ground.


Figure 8-4. Low oblique photograph.

General Characteristics of Low oblique Photograph


(1) It covers a relatively small area.
(2) The ground area covered is a trapezoid, although the photo is square or
rectangular.
(3) The objects have a more familiar view, comparable to viewing from the
top of a high hill or tall building.
(4) No scale is applicable to the entire photograph, and distance cannot be
measured. Parallel lines on the ground are not parallel on this photograph;
therefore, direction (azimuth) cannot be measured.
(5) Relief is discernible but distorted.
(6) It does not show the horizon.
c. High Oblique. The high oblique is a photograph taken with the camera inclined
about 60° from the vertical (Figures 8-5 and 8-6). It has a limited military application;
it is used primarily in the making of aeronautical charts. However, it may be the only
photography available. A high oblique has the following characteristics:

Figure 8-5. Relationship of high oblique photograph to the ground.


Figure 8-6. High oblique photograph.
General Characteristics of High oblique Photograph
(1) It covers a very large area (not all usable).
(2) The ground area covered is a trapezoid, but the photograph is square or
rectangular.
(3) The view varies from the very familiar to unfamiliar, depending on the
height at which the photograph is taken.
(4) Distances and directions are not measured on this photograph for the
same reasons that they are not measured on the low oblique.
(5) Relief may be quite discernible but distorted as in any oblique view. The
relief is not apparent in a high altitude, high oblique.
(6) The horizon is always visible.
d. Trimetrogon. This is an assemblage of three photographs taken at the same time,
one vertical and two high obliques, taken at right angles to the line of flight.
The obliques, taken at an angle of 60° from the vertical, side lap the vertical
photography, producing composites from horizon to horizon (Figure 8-7).
Figure 8-7. Relationship of cameras to ground for trimetrogon photography (three
cameras).
e. Multiple Lens Photography. These are composite photographs taken with one
camera having two or more lenses, or by two or more cameras. The photographs are
combinations of two, four, or eight obliques around a vertical. The obliques are
rectified to permit assembly as verticals on a common plane.
f. Convergent Photography. These are taken with a single twin-lens, wide-angle
camera, or with two single-lens, wide-angle cameras coupled rigidly in the same
mount so that each camera axis converges, when intentionally tilted a prescribed
amount (usually 15 or 20°) from the vertical. Again, the cameras are exposed at the
same time. For precision mapping, the optical axes of the cameras are parallel to the
line of flight, and for reconnaissance photography, the camera axes are at high angles
to the line of flight.
g. Panoramic. The development and increasing use of panoramic photography in
aerial reconnaissance has resulted from the need to cover in greater detail more and
more areas of the world.
(1) To cover the large areas involved, and to resolve the desired ground
detail, present-day reconnaissance systems must operate at extremely high-
resolution levels. Unfortunately, high-resolution levels and wide-angular
coverage are fundamentally contradictory requirements.
(2) A panoramic camera is a scanning type of camera that sweeps the terrain
of interest from side to side across the direction of flight. This permits the
panoramic camera to record a much wider area of ground than either frame or
strip cameras. As in the case of the frame cameras, continuous cover is
obtained by properly spaced exposures timed to give sufficient overlap
between frames. Panoramic cameras are most advantageous for applications
requiring the resolution of small ground detail from high altitudes.
Measure the distance between two distinct features
• The two features must appear on both the map and the aerial photograph.
• Use man-made features, e.g. road junctions or buildings.
• A greater distance between the two features will produce a more accurate answer.
• Try to measure a distance on the photo that is a whole number.
Example: Map scale is 1:50,000
1. Measure the direct distance between the same two points on the aerial photo, e.g. 10 cm.
2. Measure the direct distance between the two points on the map, e.g. 6.5 cm.
3. Ratio of scales = ratio of distances, so the photo-scale denominator is
50,000 × 6.5 ÷ 10 = 32,500, i.e. the photo scale is about 1:32,500.
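Step 3 can be carried out numerically: the photo-scale denominator equals the map denominator multiplied by (map distance ÷ photo distance). A minimal Python sketch; the function name is illustrative:

```python
def photo_scale_from_map(map_scale_denom, photo_dist_cm, map_dist_cm):
    """Ratio of scales = ratio of distances:
    photo denominator = map denominator * (map distance / photo distance)."""
    return map_scale_denom * map_dist_cm / photo_dist_cm

# Example figures from the text: 1:50,000 map, 10 cm on the photo, 6.5 cm on the map.
print(f"Photo scale ~ 1:{photo_scale_from_map(50000, 10, 6.5):,.0f}")  # 1:32,500
```

Because the photo distance (10 cm) is larger than the map distance (6.5 cm) for the same ground features, the photo is at a larger scale than the map, consistent with the scale rule below.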
LARGE SCALE photos or maps show a SMALLER AREA but in MORE DETAIL, e.g. 1:10,000.
SMALL SCALE photos or maps show a LARGER AREA but in LESS DETAIL, e.g. 1:500,000.
Indian Satellites used in Agriculture: Production forecasting of important
agricultural crops using satellite remote sensing data was initiated under the project
“Crop Acreage and Production Estimation” by the Space Applications Centre and carried
out over a period of two decades at the request of the Ministry of Agriculture, Govt. of
India (MOA).
 FASAL (Forecasting Agricultural output using Space, Agro-meteorology and
Land based observations) was initiated in 2007-08, and SAC (Space
Applications Centre) was entrusted with the responsibility of implementing space
technology based production forecasts of crops and upgrading the procedures as
new data become available. Procedures for multiple forecasts of six crops, viz.
jute, kharif rice, winter potato, rapeseed/mustard, wheat and rabi rice, have been
developed.
 Realising the need to integrate this advanced technique in the routine crop
statistics gathering, Department of Agriculture and Cooperation (DAC),
Ministry of Agriculture, Govt. of India initiated steps to set up a centre for this
purpose.
 Accordingly, the centre “Mahalanobis National Crop Forecast Centre”
(MNCFC) has been set up at Pusa, Rajendra Nagar, New Delhi. The centre has
been carrying out operational forecasts of the crops from the 2012-13 season
onwards.
 SAC is now concentrating on developmental activities related to crop
forecasting for other crops viz., sugarcane, cotton, soybean, pulses etc. Use of
new sensors provides procedure development, parameter retrieval, biomass
estimation, pests and diseases assessment etc.
 One major work component identified under FASAL for 2012 is the use of the
new Indian C-band Synthetic Aperture Radar (SAR) data from the Radar Imaging
Satellite (RISAT-1) for various aspects of crop production forecasting and
condition assessment.
 The current FASAL R&D activities include Techniques for Early Area
Estimation of new crops –Sugarcane (using AWiFS), Cotton, Soybean (using
SAR), INSAT CCD–NDVI for product generation and experimentation on
International Crop Assessment using EO (Earth Observation) Data.
 Beyond FASAL, SAC is involved in National Food Security Mission (NFSM
of DAC) initiatives such as Bringing Green Revolution to Eastern India
(monitoring the impact on rice production), forecasting of disease and damage
(yellow rust in wheat), pulse crop intensification assessment, and development of
a prototype technique for crop insurance.

Factors related to Agriculture Production


The Government of India does indeed have a successful intra-governmental SAAS (satellite-
as-a-service) ecosystem for its own use. It uses Earth observation and meteorological data
from indigenously built remote sensing satellites for agriculture applications in the country.
Fundamental to this governmental ecosystem is the Indian Space Research Organization
(ISRO), which builds satellites and payloads and processes the satellite data. ISRO distributes
volumes of state-specific satellite data to independent nodal bodies within all state
governments known as the State Remote Sensing Application Centres (SRSACs).
The SRSACs receive vast volumes of ISRO satellite data ranging from multispectral,
cartographic and radar imaging and meteorological observations and forecasting applications
(Table 1). The SRSACs process and analyse the satellite data and share it with the irrigation,
forestry, land-use and agriculture ministries and departments of state governments among
others.
Table 1: Remote sensing satellites of Indian Space Research Organization involved in
agriculture applications.

Satellite Type | Satellite | Objectives
Multispectral imaging satellite | Resourcesat-2 & Resourcesat-2A | Multispectral imaging for crop production forecast; land, water and natural resource inventory and management; disaster management support.
Cartography satellite | Cartosat-1 | High resolution cartographic mapping; digital elevation mapping for drainage and irrigation networks; topographic mapping and contouring.
Radar imaging | RISAT-1 | All-weather imaging capability targeted at the kharif crop (June to November) during the south-west and north-east monsoon seasons; flood and natural disaster management.
Meteorological forecasting | Kalpana-1 | Comprehensive weather status reporting and forecasting.
Meteorological observation | INSAT-3D & INSAT-3DR | Improved meteorological observations, including vertical temperature and humidity profiles; atmospheric weather forecasting and disaster warning.

Remote sensing Satellites and its application for agricultural development


A. Crop Production Forecasting: Crop production estimation is based on two independent
parameters: crop area and crop yield. Four main approaches are available for crop area
estimation: stratification, area estimation through area frame sampling, area estimation
through remote sensing, and area estimation at the administrative level. Earth observation
data can be used to monitor the cropped area.
Name of Satellite | Application
RISAT-1 SAR | Satellite data used for rice production forecasting.
Resourcesat-2 LISS-III | Satellite data used for sugarcane production forecasting.
Resourcesat-2 AWiFS | Satellite data used for wheat production forecasting.
For crop production forecasting, the Ministry of Agriculture and Cooperation, Government
of India, runs a project named Crop Acreage and Production Estimation (CAPE). Based on
weather and remote sensing data, the CAPE project provides crop production forecasts.
B. Horticulture and cropping system analysis: To increase horticultural production,
remote sensing technology is used to advise farmers on combining practices such as
multi-layer cropping, intercropping, off-season cultivation and relay cropping. These
techniques help make better use of time and space.
C. Crop condition assessment and stress detection: Remote sensing data help to monitor
crop health. Crop spectral reflectance varies with crop phenology, stage of growth and crop
health, so crop condition can be monitored and measured using multispectral satellite data.
The advantage of optical VIR sensors is that they provide highly sensitive wavelengths,
from which crop vigour, crop damage and crop stress can be readily identified.
The crop growing season and crop conditions are clearly identified from remote sensing
satellite data. Indices and derived products such as LST, LSWI, TVDI and NDVI are used
for crop condition assessment. Assessment of crop growth and stress requires remote
sensing data of high spatial and temporal resolution.
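Of the indices listed, NDVI is the most widely used; it is computed per pixel as (NIR - Red) / (NIR + Red), exploiting the fact that healthy vegetation reflects strongly in the near-infrared and absorbs red light. A minimal NumPy sketch; the toy reflectance values are illustrative, not real satellite data:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense healthy vegetation; bare soil is
    typically around 0.1-0.3, and water is near or below zero.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    # Clip the denominator to avoid division by zero on dark pixels.
    return (nir - red) / np.clip(nir + red, 1e-9, None)

# Toy 2x2 reflectance arrays standing in for red and NIR satellite bands:
nir_band = np.array([[0.50, 0.60], [0.30, 0.40]])
red_band = np.array([[0.10, 0.10], [0.20, 0.30]])
print(ndvi(nir_band, red_band))
```

Repeating this computation over a multi-date image stack is what lets analysts track crop condition through the growing season, as described above.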
D. Irrigation monitoring and management: Remote sensing satellite data are essential for
monitoring problems in irrigated areas. Combining multi-temporal imagery with ancillary
data, such as climate and soil information, improves the identification of irrigated areas
from remotely sensed data. The distribution of irrigation depends strongly on climate,
which determines crop water demand and irrigation schedules. Where surface irrigation is
insufficient, groundwater is considered a better source.

E. Soil Mapping: Mapping of minerals in soil is done by hyperspectral remote sensing.
Microwave remote sensing is an efficient method for mapping soil moisture; the common
microwave configuration is SAR (Synthetic Aperture Radar). Soil mapping depends upon
the spectral reflectance of soil characteristics such as colour, texture, structure, mineralogy,
organic matter, free carbonates, salinity and moisture. Satellite sensors used for soil
mapping include IRS WiFS, IRS LISS-II, IRS LISS-III and IRS PAN.
F. Droughts Assessment and Monitoring: Agricultural drought causes crop stress and
wilting; crops cannot grow and mature when rainfall and soil moisture are inadequate. A
mechanism for rainfall prediction and drought early warning provides a solution to the
drought management problem. Satellites and sensors including Resourcesat-1,
Resourcesat-2, AVHRR and MODIS are used for drought assessment and monitoring.
Drought can be divided into three levels: 10 to 20 percent is considered a low level of
drought, 26 to 50 percent a medium level, and above 50 percent a high level.
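The three drought levels map directly to a simple classification rule. In this sketch the function name and the treatment of values outside the stated bands (below 10 percent, and the unstated 21-25 percent gap) are assumptions for illustration:

```python
def drought_level(deficit_percent):
    """Classify drought severity by percentage, following the three
    bands in the text; values outside the stated bands return None."""
    if 10 <= deficit_percent <= 20:
        return "low"
    if 26 <= deficit_percent <= 50:
        return "medium"
    if deficit_percent > 50:
        return "high"
    return None  # below 10%, or in the unstated 21-25% gap

for d in (15, 40, 75):
    print(d, drought_level(d))
```

A rule like this could be applied pixel by pixel to a satellite-derived drought index map to produce the low/medium/high zoning used in drought monitoring products.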

G. Land Cover and Land Degradation Mapping: Land degradation is caused by
overgrazing, deforestation and inappropriate irrigation practices. The most active land
degradation processes are salinization, compaction and waterlogging. The absence of
conservation measures, the use of heavy machinery at improper times, human intervention
in natural drainage, and excessive irrigation are the main factors causing human-induced
land degradation.
For mapping land degradation, sensors such as TM, ETM+, MSS, ERS, JERS-1 and
Radarsat are used. For mapping small surface areas, sensors with high spatial resolution,
such as IKONOS, SPOT-5, Hyperion and AVIRIS, are used.
H. Crop identification: Crop identification is an essential process for monitoring crop
growth in agricultural fields. A two-step approach is used for crop identification: training
sample generation followed by a classification technique.
Satellites and sensors including LISS, AWiFS, SPOT-5, MODIS and Landsat are good
sources of multispectral data for the crop identification process, while sensors such as
AVIRIS and HyMap are good sources of hyperspectral data for crop identification.
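The two-step approach can be sketched with a toy nearest-centroid classifier in NumPy. The (red, NIR) reflectance samples below are synthetic stand-ins for real training pixels, and the class names are illustrative:

```python
import numpy as np

# Step 1: training sample generation - labelled (red, NIR) reflectance
# pixels for each class. These spectra are synthetic, for illustration only.
training = {
    "wheat":     np.array([[0.10, 0.50], [0.12, 0.48], [0.09, 0.52]]),
    "bare soil": np.array([[0.30, 0.35], [0.28, 0.33], [0.31, 0.36]]),
}
centroids = {crop: samples.mean(axis=0) for crop, samples in training.items()}

# Step 2: classification - assign each unseen pixel to the nearest centroid.
def classify(pixel):
    return min(centroids, key=lambda c: np.linalg.norm(pixel - centroids[c]))

print(classify(np.array([0.11, 0.49])))  # spectrally close to the wheat samples
print(classify(np.array([0.29, 0.34])))  # spectrally close to the bare-soil samples
```

Operational systems use the same two steps but with many more bands, many more training samples, and stronger classifiers (e.g. maximum likelihood or machine-learning methods) applied to full satellite scenes.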
I. Crop nutrient deficiency detection: Identifying nutrient deficiency in a crop requires
data such as soil type, cropping history and regional geography. Once a nutrient deficiency
has occurred and been identified, farmers should take steps to resolve it. Symptoms of
nutrient deficiency can be confused with other problems, including improper pH, flooding
or poor drainage, drought stress, disease damage, nematode damage and herbicide drift.
CropSAT, Sentinel and Fertisat are among the satellite-based services that offer farmers the
possibility of improving the efficiency of nutrient fertilization using variable rate
application (VRA) technologies.
J. Identification of Pests and Disease Infestation: Pest damage is predicted using spectral
indices based on leaf pigments, optical and video imaging in the microwave and
near-infrared regions, and multispectral remote sensing, with affected areas located using
portable GPS equipment.
Remote sensing satellite data are used to monitor disease and pests with the help of the
sensors tabulated in Table 6.
SATELLITES DESIGNED FOR AGRICULTURE – TECHNICAL ASPECT: Remote
sensing satellite data can play a vital role in better managing agricultural fields.
Agricultural factors including plant health, plant cover and soil moisture are monitored
using satellite data in a GIS environment. The technical details of agriculture-supporting
satellites are depicted clearly; some of them are displayed in the table below.

Remote Sensing and Soil Classification


Applications of remote sensing in agriculture cover a number of aspects, such as plant
biology, economic features and land use management. These applications have been
playing a vital role, and they suggest that remote sensing will be an increasingly powerful
tool for monitoring agricultural activities through Earth observation. The conventional
ground methods of land use mapping are very time consuming, labour intensive and
therefore carried out relatively infrequently, so the prepared maps become out of date,
particularly in a dynamic or rapidly changing environment. With the advent of remote
sensing technology, mainly satellite remote sensing, techniques have been developed that
have proved to be of enormous value in preparing land use and land cover maps and in
monitoring the changes at periodic time intervals.
The Earth’s surface has witnessed major changes in land use. These changes are likely to
continue, driven by demographic pressure or by climate change. In this context, monitoring
tools are needed for maintaining a sustainable ecological status, improving soil
conservation and water resource management. Floods, excess runoff, soil erosion, and the
related contamination and disequilibrium of the water and carbon cycles are, among others,
key issues that are controlled and influenced by soil surface characteristics. The
implementation of sustainable agricultural, hydrological and environmental management
requires an improved understanding of the soil at increasingly finer scales. Conventional
soil sampling and laboratory analyses cannot efficiently provide this information, because
they are slow, expensive, and cannot capture all the temporal and spatial variability.
