
Remote Sensing for Agricultural Applications

Zemede Mulushewa (PhD)


December 2023



What is agriculture?
Agriculture is the practice of crop production and animal rearing, including poultry production and
beekeeping, through the sustainable use of natural resources.

Crop production is the science and art of growing and harvesting crops (including
preserving seeds, clearing land from other vegetation, land preparation, sowing,
plant protection from natural enemies, harvesting, processing and storing of
products).



Cropping pattern: the yearly sequence and spatial arrangement of crops on a given area.
Different cropping patterns are used in different places depending on the potential of the area.
In planning a cropping sequence, a farmer must take several considerations into account, including:
The soil type
The local weather (rainfall, temperature, etc.)
Weed, disease, and pest problems
Market outlets or market demand including transportation access



Things we can do with RS

1. Crop area estimation


Satellite images (possibly classified) can be used as basic information for area
estimation.
Area is estimated by counting pixels in a classified image.
Sources of area estimation error:
Mixed pixels
Misclassification of pure pixels (e.g. due to radiometric error)
Resolution
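The pixel-counting step itself is simple arithmetic; most of the difficulty lies in the classification. A minimal sketch, with an assumed class-coded array and an assumed 30 m pixel size (both illustrative, not real data):

```python
import numpy as np

# Hypothetical class-coded image (0 = other, 1 = crop A, 2 = crop B);
# in practice this array comes from classifying a satellite image.
classified = np.array([
    [1, 1, 2, 0],
    [1, 1, 2, 0],
    [2, 2, 2, 0],
])

pixel_size_m = 30.0                          # e.g. Landsat: 30 m x 30 m pixels
pixel_area_ha = pixel_size_m ** 2 / 10_000   # 0.09 ha per pixel

# Pixel counting: class area = number of class pixels x pixel area
area_a_ha = np.count_nonzero(classified == 1) * pixel_area_ha  # 4 pixels
area_b_ha = np.count_nonzero(classified == 2) * pixel_area_ha  # 5 pixels
```

Every mixed or misclassified pixel shifts the count by a whole pixel area, which is why the error sources listed above matter.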



Do not use pixel counting if you want an accuracy of ±5%, unless you are confident that your
classification accuracy is >90%.

Question: how can we reduce these errors?

Answer:
Use higher resolution imagery taken during the cropping season.
Use intensive ground surveying (proper sampling) to get accurate area measurement.



2. Crop stress monitoring

There is a wide range of stresses that affect plant functioning, and many ways in which

the effects of these stresses can be detected by RS.

In principle we do not usually image the stress itself – what we most frequently

study is a built-in plant response to the stress.



a. Thermal sensing: sensors track temperature changes.
Thermal sensing is primarily used to study plant water relations, and specifically stomatal
conductance, because a major determinant of leaf temperature is the rate of evaporation or
transpiration from the leaf.

The cooling effect of transpiration arises because a substantial amount of energy (the latent
heat of vaporization, λ) is required to convert liquid water to water vapour, and this energy is
then taken away from the leaf in the evaporating water and therefore cools it.
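The size of this cooling term can be checked with a back-of-the-envelope calculation; the 1 mm per hour transpiration rate below is an assumed, illustrative value:

```python
# Back-of-the-envelope sketch of transpirational cooling (illustrative
# numbers, not measurements).
LAMBDA = 2.45e6           # latent heat of vaporization of water, J/kg (~20 °C)

# Assume a transpiration rate of 1 mm of water per hour: over 1 m^2,
# a 1 mm layer of water is 1 litre, i.e. about 1 kg.
evap_kg_per_m2_s = 1.0 / 3600.0

# Latent heat flux = lambda * E: energy leaving the leaf as water vapour
cooling_w_per_m2 = LAMBDA * evap_kg_per_m2_s   # roughly 680 W per m^2
```

A flux of roughly 680 W/m² is comparable to peak solar irradiance, which is why a drop in transpiration (stomatal closure) warms leaves enough for thermal sensors to detect.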



In rare cases leaf temperature may be affected by other physiological processes: for example, the heat generated as water in a leaf freezes can be readily imaged, while in extreme cases of particularly high respiratory rates, raised temperatures can be used as a measure of these increased respiration rates.

In most cases, however, the heat generated by respiration is too small in quantity to have a detectable effect on leaf temperature.



b. Multi-sensor imaging for diagnosis

 An individual imaging sensor provides only limited information, indicating changes in only one or two intermediary responses,

 but because any one response can be caused by a wide range of primary stresses, the ability to diagnose the particular primary stress is greatly enhanced by the combination of two or more imaging technologies.



• For example: thermal imaging responds primarily to changes in evaporation rate,
which are generally caused by changes in stomatal aperture, but stomatal closure
can be a result of stresses as different as drought, flooding, salinity stress, fungal
infection or pollutants. In order to distinguish between these possible causes one
needs further information.
• Therefore, there has been recent emphasis on the development of multi-sensor
imaging to aid in stress diagnosis and monitoring.



• The approaches can range from simple combinations of, say, thermal and reflectance sensors, or visible reflectance and fluorescence sensors, through to combined fluorescence, reflectance and thermal imaging sensors.

It is clear from the above that there is enormous potential for combining different imaging technologies for the diagnosis and quantification of both abiotic and biotic stresses in plants.

By combining information from a wide range of sensors, each of which detects a different basic physiological response, it should be possible to greatly enhance our sensitivity in diagnosing and quantifying different stresses.



Why use RS & GIS for agriculture applications?
• Timely, objective, local to global coverage
• Useful for observing areas that are inaccessible
• Monitor plant growth and estimate crop productivity
• Assess soil moisture and irrigation requirements
• Identify soil and crop characteristics and conditions
• Better forecast precipitation and crop disease
• Maximize crop yields while reducing energy consumption
• Avoid waste of farm inputs (water, fertilizer, and pesticide)

Evolution of agricultural operations in the Wadi As-Sirhan Basin, Saudi Arabia. Captured by Landsat satellites 4, 5, and 7 in 1987, 1991, 2000, and 2012, using the Thematic Mapper and Enhanced Thematic Mapper Plus instruments. Image Credit: NASA



Applications for Agriculture & WSM
 Crop Monitoring
– Phenology, crop area, crop type, crop condition, yield, irrigated landscape,
flood, drought, frost, accurate and timely reporting of agricultural statistics

 Crop Forecasting
– Accurate forecasting of yield or shortfalls in crop production and food supply
per region and country

 Market Stability related to agricultural production


– Lowers uncertainty and increases the transparency of global food supply
– Reduces price volatility by anticipating market trends with reduced uncertainty

 Humanitarian Aid when agricultural production fails


– Monitor food security in high-risk regions worldwide
– Early warning of famine, enabling the timely mobilization of an international
response in food aid
1.2. Geospatial data for agricultural resources & watershed
management
1.3. Geospatial Data sources and Data Collection Methods



Bands

Bands: parts of the EM spectrum where sensors record data; specifications are sensor specific.

RS sensors can collect data in all portions of the EM spectrum.
Multispectral sensors have spectral sensitivity limitations (spectral resolution).
The wavelength ranges recorded by sensors are called bands or “channels” and vary with the sensor.



Bands… Landsat

Landsat 8 collects data in 11 “channels” throughout the EM spectrum.



Landsat 8 Spectral Bands
Landsat 8 Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS)

Band                                  Wavelength (µm)  Useful for Mapping
Band 1 - coastal aerosol              0.43-0.45        Coastal and aerosol studies
Band 2 - blue                         0.45-0.51        Bathymetric mapping; distinguishing soil from vegetation and deciduous from coniferous vegetation
Band 3 - green                        0.53-0.59        Emphasizes peak vegetation, which is useful for assessing plant vigor
Band 4 - red                          0.64-0.67        Discriminates vegetation slopes
Band 5 - Near Infrared (NIR)          0.85-0.88        Emphasizes biomass content and shorelines
Band 6 - Short-wave Infrared (SWIR) 1 1.57-1.65        Discriminates moisture content of soil and vegetation; penetrates thin clouds
Band 7 - Short-wave Infrared (SWIR) 2 2.11-2.29        Improved moisture content of soil and vegetation; penetrates thin clouds
Band 8 - Panchromatic                 0.50-0.68        15 m resolution, sharper image definition
Band 9 - Cirrus                       1.36-1.38        Improved detection of cirrus cloud contamination
Band 10 - TIRS 1                      10.60-11.19      100 m resolution, thermal mapping and estimated soil moisture
Band 11 - TIRS 2                      11.50-12.51      100 m resolution, improved thermal mapping and estimated soil moisture



Landsat Missions

Landsat 7
Band  Wavelength (µm)
1     0.45 to 0.52    Blue
2     0.52 to 0.60    Green
3     0.63 to 0.69    Red
4     0.76 to 0.90    Near IR
5     1.55 to 1.75    Short Wave IR
6     10.40 to 12.50  Thermal IR
7     2.08 to 2.35    Short Wave IR



Sentinel 2 vs Landsat

Landsat-8 bands: 1 coastal aerosol; 2 blue; 3 green; 4 red; 5 NIR; 6 SWIR 1; 7 SWIR 2; 8 panchromatic; 9 cirrus; 10 TIRS 1; 11 TIRS 2.

Sentinel-2 bands: 1 coastal aerosol; 2 blue; 3 green; 4 red; 5, 6, 7, and 9 vegetation red edge; 8 NIR; 10 cirrus; 11 and 12 SWIR.

• Sentinel 2 has three different spatial resolutions:
  • Visible and NIR – 10 meter
  • Red edge and SWIR – 20 meter
  • Aerosol, water vapor, and cirrus bands – 60 meter
• Landsat 7 is 30 m; Landsat 8 is 30 m except for the 100 m TIRS bands and the 15 m panchromatic band
• Data Acquisition (requires free registration)
  – https://earthexplorer.usgs.gov/
Sentinel 2, Landsat, ASTER & WorldView

WorldView-3 bands: 1 coastal; 2 blue; 3 green; 4 yellow; 5 red; 6 red edge; 7 NIR 1; 8 NIR 2; 9–16 SWIR.

ASTER bands: 1 green; 2 red; 3N/3B NIR nadir-looking/NIR backward-looking; 4–9 SWIR; 10–14 TIR.
Copernicus and Sentinel

The European Space Agency’s (ESA) program for Earth observation.

Copernicus is the overarching program that manages the Sentinel satellites.

Sentinel Satellites: 7 individual satellite programs (or constellations), each with two satellites and a specific purpose related to Earth monitoring.



Sentinel Satellites

Sentinel-1 (April 2014, 2016)
Land and ocean monitoring
2 polar-orbiting satellites
Radar imaging: operates day and night, in all weather

Sentinel-2 (June 2015, March 2017)
Land monitoring (vegetation, soil, coastal areas)
2 polar-orbiting satellites
Optical imaging: dependent on solar radiation, impacted by clouds

Sentinel-3 (February 2016, April 2018)
Marine observation (sea surface topography, sea and land temperature, ocean and land color)
2 polar-orbiting satellites
2 optical imagers and 1 radar altimeter

Sentinel-4 & 5
Air quality, ozone, solar radiation, and climate monitoring in Europe (Sentinel-4)
UVN spectrometer instrument aboard

Sentinel-5P (October 2017)
Atmospheric monitoring
Covers data gaps, data continuity
Sentinel 2

Sentinel 2 is a Landsat clone/upgrade

Uses: Forest monitoring, change detection, land use/land cover

Coverage: currently every 5 days; collection began June 23, 2015 (every 10 days prior to 2017)

Spatial resolution: 10, 20 and 60 meters (wavelength dependent)

More spectral bands than Landsat 7 and 8


Sentinel 2 has 13 bands,
Landsat 8 has 11 bands,
Landsat 7 has 7 bands



Sentinel 2 vs Landsat

Sentinel 2 Landsat 8

See detailed applications of Sentinel:

https://www.sentinel-hub.com/explore/industries-and-showcases/agriculture/

https://www.sentinel-hub.com/explore/education/custom-scripts-tutorial/



Advanced Very High Resolution Radiometer (AVHRR)
 AVHRR is used to generate NDVI images of large portions of Earth on a regular basis, providing
global images that portray seasonal and annual changes in vegetative cover.
 The primary difference between AVHRR and Landsat NDVI is resolution:

AVHRR resolution is 1 km; its global NDVI composites are 8 km

Landsat NDVI resolution is 30 m

 AVHRR data: frequent global NDVI products

 Landsat 7 ETM+ data: greater detail covering less area


Satellites & Sensors for Agricultural Applications
Scientific Products

Satellite / Sensor: products provided
Terra / MODIS: land surface reflectance, land surface temperature, evapotranspiration, vegetation greenness
Aqua / MODIS: land surface reflectance, land surface temperature, evapotranspiration, vegetation greenness
NOAA-20 / VIIRS: land surface reflectance, land surface temperature, vegetation greenness
Landsat 8 / OLI: land surface reflectance, vegetation greenness
Sentinel 2 / MSI: land surface reflectance, vegetation greenness
Landsat 8 & Sentinel 2 / HLS: land surface reflectance, vegetation greenness
ECOSTRESS: evapotranspiration
Land Data Assimilation System / modeled output: evapotranspiration, soil moisture
Global Precipitation Measurement (GPM) / GMI, DPR: precipitation
CHIRPS / multiple sensors: precipitation
Soil Moisture Active Passive (SMAP) / active L-band radar, passive L-band radiometer: soil moisture

Acronyms: Multi-Spectral Instrument (MSI); Visible Infrared Imaging Radiometer Suite (VIIRS); Operational Land Imager (OLI); Moderate Resolution Imaging Spectroradiometer (MODIS); GPM Microwave Imager (GMI); Harmonized Landsat and Sentinel-2 (HLS); Dual-frequency Precipitation Radar (DPR)
Characteristics of sensors with potential for RS

For details https://www.mdpi.com/2076-3417/10/5/1785

The Sentinel-1A spacecraft was launched on April 3, 2014, and Sentinel-1B, a twin sister of Sentinel-1A, on April 25, 2016.
See details @ https://www.eoportal.org/satellite-missions/copernicus-sentinel-1#mission-status
Common Remote Sensing Indices
This course focuses on the following Common Remote Sensing Indices:
1) Vegetation Indices
2) Water Indices
3) Landscape Indices

Common Remote Sensing Indices: Vegetation

NDVI (Normalized Difference Vegetation Index)
  Equation: (NIR - Red) / (NIR + Red)
  Purpose: standardized index to generate an image displaying greenness (relative biomass)

SAVI (Soil-Adjusted Vegetation Index)
  Equation: ((NIR - Red) / (NIR + Red + L)) x (1 + L), where L is a soil-adjustment factor chosen according to the amount of green vegetation cover
  Purpose: vegetation index that attempts to minimize soil brightness influences

VARI (Visible Atmospherically Resistant Index)
  Equation: (Green - Red) / (Green + Red - Blue)
  Purpose: emphasizes vegetation in the visible portion of the spectrum, while mitigating illumination differences and atmospheric effects
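The SAVI row of the table can be sketched directly in code. The reflectances below are assumed illustrative values, and L = 0.5 is a commonly used intermediate soil-adjustment value (L = 0 reduces SAVI to NDVI):

```python
import numpy as np

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index: ((NIR - Red)/(NIR + Red + L)) * (1 + L)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + L) * (1.0 + L)

# Assumed reflectances: vegetation is bright in NIR and dark in red,
# while bare soil has similar reflectance in both bands.
veg  = savi(nir=0.45, red=0.05)
soil = savi(nir=0.25, red=0.20)
```

The other indices in the table follow the same pattern: band arithmetic applied element-wise to reflectance arrays.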
Common Remote Sensing Indices: Water

NDSI (Normalized Difference Snow Index)
  Equation: (Green - SWIR) / (Green + SWIR)
  Purpose: identifies snow cover while ignoring cloud cover. Designed for MODIS and Landsat data.

MNDWI (Modified Normalized Difference Water Index)
  Equation: (Green - SWIR) / (Green + SWIR)
  Purpose: enhances open water features.

NDMI (Normalized Difference Moisture Index)
  Equation: (NIR - SWIR1) / (NIR + SWIR1)
  Purpose: sensitive to moisture levels in vegetation. Used to monitor droughts and fuel levels in fire-prone areas.
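As a minimal sketch of applying MNDWI from the table, the assumed green and SWIR reflectances below flag open water; the zero threshold is only a common starting point and is scene dependent:

```python
import numpy as np

# Assumed reflectances for three pixels: clear water, dry soil, turbid water
green = np.array([0.08, 0.12, 0.30])
swir  = np.array([0.02, 0.25, 0.05])

# MNDWI = (Green - SWIR) / (Green + SWIR); water absorbs strongly in SWIR
mndwi = (green - swir) / (green + swir)

# Positive MNDWI is a common rule of thumb for open water
water_mask = mndwi > 0
```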
Common Remote Sensing Indices: Landscape

BAI (Burn Area Index)
  Equation: 1 / ((0.1 - Red)^2 + (0.06 - NIR)^2)
  Purpose: identifies areas of terrain affected by fire

NBR (Normalized Burn Ratio Index)
  Equation: (NIR - SWIR) / (NIR + SWIR)
  Purpose: emphasizes burned areas, while mitigating illumination and atmospheric effects

NDBI (Normalized Difference Built-up Index)
  Equation: (SWIR - NIR) / (SWIR + NIR)
  Purpose: emphasizes man-made built-up areas. Mitigates effects of terrain illumination and atmospheric effects



Land Surface Reflectance
• Provides an estimate of surface spectral reflectance as measured at ground level by accounting for atmospheric effects like aerosol scattering and thin clouds
• Useful for measuring the greenness of vegetation, which can then be used to determine phenological transition dates including:
 start of season,
 peak period, and
 end of season
• Input for generation of several land products:
• Vegetation Indices (VIs)
• Bidirectional Reflectance Distribution Function (BRDF), thermal anomaly, snow/ice
• Fraction of Absorbed Photosynthetically Active Radiation (FAPAR)
• Leaf Area Index (LAI)

Three moments in a tumultuous year for farming north of St. Louis, MO, as seen in NASA-USGS Landsat 8 data. On the left is May 7, 2019, as heavy rains delayed planting for many farms. Sept 12, 2019, in the middle, shows bright green signifying growing vegetation, although with a fair amount of brown, bare fields. On the right, Oct. 14, 2019, the light brown indicates harvested fields while darker brown are fields that have not been seeded or were fallow all summer.
Image Credit: NASA-USGS Landsat
Moderate Resolution Imaging Spectroradiometer (MODIS)
– MODIS instrument on two NASA platforms:
  • Terra (1999 – present)
  • Aqua (2002 – present)
– Spatial Resolution: 250 m, 500 m, 1 km
– Spectral Resolution: 36 bands, ranging in wavelength from 0.4 µm to 14.4 µm
– Temporal Resolution: daily, 8-day, 16-day, monthly, yearly (2000 – present)
– Global coverage with orbital gaps in the tropics

MODIS Orbital Coverage. Image Credit: NASA


MODIS….
• MODIS
  – MOD09: Level 2 & 3 surface reflectance product name
  – Bands 1 and 2 (250 m)
  – Bands 1-7 (500 m)
  – Bands 1-16 (1 km)
• MODIS Surface Reflectance User’s Guide
  – http://modis-sr.ltdri.org/guide/MOD09_UserGuide_v1.4.pdf
• Data Acquisition (requires free registration)
  – https://search.earthdata.nasa.gov/search?q=MOD09

MODIS composite of the eastern U.S. captured on February 23, 2020 by NASA’s Aqua satellite.
Image credit: NASA Earth Observatory


Landsat 8 – OLI
• Landsat 8 launched on February 11, 2013
  – Instruments (2):
    • Operational Land Imager (OLI)
    • Thermal Infrared Sensor (TIRS)
  – Landsat data have been produced, archived, and distributed by the U.S. Geological Survey (USGS) since 1972
  – Builds on ~50 years of the Landsat program (i.e., the longest continuous global record of the Earth’s surface)
  – https://landsat.gsfc.nasa.gov/landsat-data-continuity-mission/

Landsat 8 OLI image captured on September 9, 2013 showing the border between Kazakhstan and China.
Image Credit: NASA Earth Observatory
People often say that borders are not visible from space, but the line between Kazakhstan and China couldn't be clearer if it were drawn on the sand.
RGB = Red, Green, Blue
True color composite: an image displayed with the red, green, and blue bands in the RGB channels.
False color composite: an image displayed using any bands other than RGB (true color).
Allows users to visualize wavelengths that the human eye cannot see.
Many different combinations, depending on the application.
Band Combinations for Landsat 8

Composite Name Bands

Natural Color 432

False Color (urban) 764

Color Infrared (vegetation) 543

Agriculture 652

Healthy Vegetation 562

Land/Water 564

Natural With Atmospheric Removal 753

Shortwave Infrared 754

Vegetation Analysis 654
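Displaying any of these combinations amounts to stacking the three chosen bands into the red, green, and blue channels and stretching the result. A numpy sketch, assuming the bands are already loaded as 2-D reflectance arrays (file reading, e.g. with rasterio, is omitted):

```python
import numpy as np

# Assume each band is already loaded as a 2-D reflectance array
# (reading the GeoTIFFs, e.g. with rasterio, is omitted here).
h, w = 2, 2
band5_nir   = np.full((h, w), 0.40)
band4_red   = np.full((h, w), 0.06)
band3_green = np.full((h, w), 0.09)

def composite(r, g, b):
    """Stack three bands into an (h, w, 3) image, min-max stretched to 0-1."""
    rgb = np.dstack([r, g, b]).astype(float)
    lo, hi = rgb.min(), rgb.max()
    return (rgb - lo) / (hi - lo)

# Color Infrared (vegetation), bands 5-4-3: NIR drives the red channel,
# so healthy vegetation appears bright red in the display.
cir = composite(band5_nir, band4_red, band3_green)
```

The same function reproduces any row of the table by swapping in the listed bands.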


Multi-Spectral Instrument – MSI
• Sentinel 2A launched on June 23, 2015
• Sentinel 2B launched on March 7, 2017
  – Constellation of two polar-orbiting satellites in the same sun-synchronous orbit, phased at 180° to each other
  – Collaboration between the European Commission, European Space Agency (ESA), and European Union (EU)
  – Sensor: Multi-Spectral Instrument (MSI)
  – https://sentinels.copernicus.eu/web/sentinel/missions/sentinel-2

Sentinel 2 MSI image captured on June 30, 2018 showing the Noordoostpolder municipality in the central Netherlands. Image Credit: Copernicus, ESA
MSI…….
• Multi-Spectral Instrument (MSI)
  – Spatial Resolution: 10 m (VNIR), 20 m (red edge, SWIR), & 60 m (atmospheric bands)
  – Spectral Resolution: 13 bands from VNIR to SWIR, in wavelengths from 0.44 µm to 2.2 µm
  – Temporal Resolution: 5 days
  – 2015 – Present
  – Level-2A product provides orthorectified Bottom-Of-Atmosphere (BOA) reflectance
  – Global Coverage
• Data Acquisition (requires free registration)
  – https://scihub.copernicus.eu

Aswan High Dam, Lake Nasser (Egypt): the lake and reservoir with pivot irrigation fields, visible as circular shapes in the image, the largest having a diameter of around 750 m.
Image credit: Copernicus, ESA
Harmonized Landsat Sentinel-2 (HLS) – HLS
• Harmonized Landsat and Sentinel-2 (HLS)
  – Combines the land surface observations of OLI & MSI into a single data set
  – Observations at 30 m resolution every 2-3 days
  – Atmospherically corrected and cloud masked
  – Normalized to a common nadir view geometry via BRDF estimation
  – Allows for monitoring phenology, crop condition, and management
  – https://hls.gsfc.nasa.gov/

Two rivers are prominent in this year of NDVI data showing vegetation health through the seasons: the Choptank (left) and the Nanticoke River (right), which is marked as an ecotourism water trail. Farms and small towns are dotted throughout the watershed of each river (red is bare soil and green indicates healthy vegetation; January 1 to December 30, 2016).
Image Credit: NASA's Goddard Space Flight Center
Evapotranspiration
 Sum of evaporation from the land surface plus transpiration in vegetation
 Highly variable in space and time
 Critical component of the water and energy balance of climate-soil- vegetation
interactions
 Extremely useful in monitoring and assessing water availability, drought conditions,
and crop production
 Remote sensing has long been recognized as the most feasible means to provide spatially
distributed regional ET information over land surfaces
 Cannot be measured directly with satellite instruments. Modeled outputs are dependent on
many variables:
 – Land surface temperature, air temperature, solar radiation, humidity, albedo, soil
conditions, and vegetation cover
 https://arset.gsfc.nasa.gov/sites/default/files/water/ET-SMAP/week1-v2.pdf



Evapotranspiration

Image Credit: NASA



Evapotranspiration from MODIS
• Moderate Resolution Imaging Spectroradiometer (MODIS)
  – MOD16: Level 4 Evapotranspiration/Latent Heat Flux
  – 8-day and yearly intervals (2001 – Present)
  – Spatial Resolution: 500 m
  – Global Coverage
  – MOD16A2/A3 User’s Guide
  – Data acquisition: https://search.earthdata.nasa.gov/search?q=MOD16



ET from Land Data Assimilation System (LDAS)
 LDAS:
 Provides model-based ET data of which there is a global collection (GLDAS)
 Integrates satellite and ground observations within sophisticated numerical
models based on water and energy balance methods
 Spatial Resolution: 0.25-degree and 1-degree
 Temporal Resolution: 3-hourly, daily, monthly
 1948 – Present
 Data Acquisition:
https://disc.gsfc.nasa.gov/datasets?keywords=GLDAS&page=1



Land Surface Temperature
• Land Surface Temperature (LST) products show the temperature of the land
surface in Kelvin (K)

• Differs from air temperature measurements as it provides the temperature of whatever is on the surface of the Earth
  (e.g. bare sand, ice- and snow-covered areas, leaf-covered tree canopy, roads, etc.)

• Useful for monitoring changes in weather and climate patterns

• Used in agriculture and water resource management to allow farmers and decision makers to evaluate water requirements



Precipitation/Rainfall
• Precipitation is a key component of the water cycle, and difficult to measure
since rain and snow vary greatly in both space and time

• Satellites provide frequent and accurate observations and measurements of rain and
snow around the planet, especially where ground-based data are sparse

• Radar/radiometer estimates measure the intensity and variability of latent heating structures of precipitation systems

• Agricultural community needs to know the timing and amount of rain to forecast
crop yields as well as freshwater shortages affecting irrigation and production



Precipitation/Rainfall….. GPM
• Global Precipitation Measurement (GPM) mission
• Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) product
• 0.1º x 0.1º grid (60°S to 60°N)
• 2000 – Present
• Level 3 products for half-hourly, daily, and monthly data
• Data Access:
  – https://search.earthdata.nasa.gov/search?q=IMERG
Precipitation/Rainfall….. CHIRPS
• Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS)
– Rainfall estimates from rain gauge and satellite observations
– Developed by Climate Hazards Center (USA, California)
– 35+ year quasi-global rainfall data set.
– Incorporates in-house climatology, 0.05° resolution satellite imagery, and
in-situ station data to create gridded rainfall time series
– 0.05° x 0.05° grid (50°S-50°N)
– 1981 – Near-Present
• Data Access:
– https://www.chc.ucsb.edu/data/chirps



Soil Moisture

Figures: soil moisture sensors at four soil depths (5, 10, 20, and 50 cm) and a general depiction of soil water movement through the soil profile following a rain event (Credit: Kansas Mesonet); soil profile diagram (Credit: FAO)


Soil-Moisture Active Passive (SMAP)
• Soil Moisture Active Passive (SMAP)
• L-band passive radiometer receives energy in a narrow microwave band (1.41 GHz)
• Level 4 products assimilate SMAP L-band brightness temperature data into a land surface model, generating root zone and surface soil moisture
• Spatial Resolution: 9 km
• Temporal Resolution: 3-hourly, Daily

Image Credit: NASA



Land Data Assimilation Systems - LDAS
• Land Data Assimilation Systems (LDAS)
• Uses numerical models to integrate
satellite information with ground-
based data
• Output estimates of soil moisture,
temperature, and evapotranspiration
• Available: https://ldas.gsfc.nasa.gov/

Image Credit: NASA

Vegetation Greenness



RS for Vegetation Greenness
• Healthy vegetation absorbs blue and red light to fuel photosynthesis and create chlorophyll.
• A plant with more chlorophyll will reflect more near-infrared energy than an unhealthy plant.
• Thus, analyzing a plant’s spectrum of both absorption and reflection in visible and in near-infrared wavelengths can provide information about the plant’s health and productivity.
• Reflected near-infrared radiation can be measured by satellite sensors.
Credit: Jeff Carns, NASA



RS for Vegetation Greenness - NDVI
Normalized Difference Vegetation Index (NDVI)
• Based on the relationship between red and near-infrared wavelengths
• Advantages:
  – Efficient and simple index to identify vegetated areas and their condition
  – Reduces sun-angle, shadow, and topographic variation effects
  – Enables large-scale vegetation monitoring, allowing comparison of different regions through time
Credit: Eric Brown de Colstoun, NASA
RS for Vegetation Greenness – NDVI…..

NDVI = (NIR − RED) / (NIR + RED)

• Theoretically, values range from -1.0 to 1.0
  – The typical range of NDVI measured from Earth’s surface is between about −0.1 for non-vegetated surfaces and as high as 0.9 for dense green vegetation
  – NDVI increases with increasing green biomass, changes seasonally, and responds to climatic conditions
Example of NDVI being calculated for healthy
green vegetation and senescing vegetation
Credit: Robert Simmon, NASA
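The calculation is a one-liner on band arrays. The reflectance values below are assumed for illustration, chosen to mirror the healthy-versus-senescing contrast in the figure:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Assumed reflectances (illustrative, not measured):
dense_veg = ndvi(nir=0.50, red=0.08)   # high NDVI: dense green vegetation
senescing = ndvi(nir=0.40, red=0.30)   # much lower NDVI as vegetation senesces
bare_soil = ndvi(nir=0.25, red=0.20)   # low NDVI: non-vegetated surface
```

Because the difference is normalized by the sum, the index is bounded and partially compensates for overall illumination differences between scenes.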



Band Ratios

The spectral signature of healthy vegetation has clear peaks and troughs captured in sensor bands



Band Ratios:
Normalized Difference Vegetation Index

These indices allow us to distinguish between healthy and unhealthy vegetation



Band Ratios:
Normalized Difference Vegetation Index

The NDVI for healthy vegetation is different than for unhealthy vegetation



Limitations of RS (Satellite) Data
• It is difficult to obtain high spectral, spatial, and temporal resolution with the same
instrument.
• Optical sensors cannot penetrate clouds or vegetative cover, which can lead to
data gaps or a decrease in data utility.
• Spatial Resolution: While coarse-resolution data (MODIS & VIIRS) provide a synoptic view, the spatial resolution is too coarse for field-level assessments.
• Temporal Resolution: Many satellites only pass over the same spot on Earth every 3-5 days and sometimes as seldom as every 16+ days.
• Spectral Resolution: Multispectral instruments observe reflected and emitted light in broad wavelength ranges for a particular band, with a limited number of bands.
• Large amounts of data exist in various file formats, file sizes, and from multiple sources.
• Knowledge of data and tools is required to work with satellite data.



END
Unit 2: Application of GIS and RS for Agricultural Resource
and Watershed Management
2.1. Obtaining and processing GIS and RS data for agricultural & watershed management
2.2. Agricultural Resource Mapping and Monitoring
2.2.1. Agricultural land use classification and validation
2.2.2. Agro-Ecological Mapping
2.2.3. Crop Type Mapping and Monitoring
2.2.4. Rangeland identification and Mapping
2.2.5. Characterization and mapping of plant health and crop yield
2.3. An overview of Watershed and Management approaches
2.3.1. Definition and basic concepts of watershed
2.3.2. Principles and Approaches of Watershed Management
2.3.3. Planning watershed management
2.3.4. Characterizing the geomorphology of Watersheds
a) Watershed delineation
b) Watershed shapes and shape attributes
c) Streams & other watershed characteristics

Exercise 2 (a). Watershed delineation using terrain data of the DEM in the Arc-Hydro extension tools of the
ArcGIS Desktop software version 10.8.
Exercise 2 (b). Mapping the agro-ecological zones in their locality using the ArcGIS Desktop software
version 10.8.
Agricultural land use classification and validation
 Agricultural land use is very dynamic; agricultural censuses are often poorly
georeferenced, and crop types are difficult to interpret directly from satellite imagery
 A 1 km2 sample of land showing the landscape
variety across the sampled sites due to the farming
system in place:
(a) rainfed cereals in Burkina Faso;
(b) rice systems in Madagascar;
(c) agropastoral systems in Brazil;
(d) mixed agriculture in Brazil;
(e) rainfed groundnut and millet agropastoral systems
in Niakhar and
(f) in Nioro, Senegal;
(g) irrigated rice systems and orchards in Cambodia;
(h) agroforestry in Kenya;
(i) mixed agriculture in South Africa

See details @ https://essd.copernicus.org/articles/13/5951/2021/essd-13-5951-2021.html
Images © Google Earth 2020
Recap to image classification methods

What is Image classification?

Image classification is the process of categorizing and labeling groups of pixels or vectors within an image based on specific rules.

The categorization law can be devised using one or more spectral or textural characteristics.

Two general methods of classification are ‘supervised’ and ‘unsupervised’.
Image Classification
• Requires delineating boundaries of classes in n-dimensional space using class statistics
• Each group of pixels (e.g. vegetation, soil) is characterized by:
  – min.
  – max.
  – mean
  – standard deviation
• All the pixels in the image that fall within those statistics are given those labels
Image Classification …Approaches

Pixel-Based
• Each pixel is grouped into a class
• Useful for multiple changes in land use within a short period of time
• Best for complete data coverage and a need for methods to ensure time-series consistency at the pixel level

Object-Based
 Pixels with common spectral characteristics are first grouped together (segmentation)
 Useful for:
   reducing speckle noise in radar images
   high resolution imagery

Whiteside, T., & Ahmad, W. (2005, September). A comparison of object-oriented and pixel-based classification methods for mapping land cover in northern Australia. Proceedings of SSC2005 Spatial intelligence, innovation and praxis: The national biennial conference of the Spatial Sciences Institute.
Image Classification…Methods

Supervised
• Uses expert-defined areas of known vegetation types (training areas) to tune parameters of classification algorithms
• The algorithm then automatically identifies and labels areas similar to the training data

Unsupervised
• Uses classification algorithms to assign pixels into one of a number of user-specified class groupings
• Interpreters assign each of the groupings of pixels a value corresponding to a land cover class

Credit: David DiBiase, Penn State Department of Geography
Supervised vs. Unsupervised

Supervised workflow:
1. Select training fields
2. Edit/evaluate signatures
3. Classify image
4. Evaluate classification

Unsupervised workflow:
1. Run clustering algorithm
2. Identify classes
3. Edit/evaluate signatures
4. Evaluate classification
67
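The unsupervised branch of the workflow above can be sketched in a few lines: cluster pixel values, then an interpreter labels the clusters. Below is a minimal pure-Python k-means on single-band values; the pixel values, number of clusters, and seeding rule are illustrative assumptions, not from the slides — real work would use ISODATA or a library implementation.

```python
def kmeans_1d(pixels, k, iterations=20):
    """Cluster scalar pixel values into k spectral groups (assumes k >= 2)."""
    s = sorted(pixels)
    centers = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]  # spread seeds
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in pixels:  # assign each pixel to its nearest cluster center
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(c) / len(c) if c else centers[i]  # recompute cluster means
                   for i, c in enumerate(clusters)]
    return centers

# Hypothetical 1-band reflectances: water-like, soil-like, vegetation-like values
pixels = [0.05, 0.07, 0.06, 0.42, 0.45, 0.44, 0.80, 0.82]
print(sorted(round(c, 2) for c in kmeans_1d(pixels, 3)))  # → [0.06, 0.44, 0.81]
```

The "Identify Classes" step then falls to the analyst: each cluster mean is inspected and assigned a land cover label.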
Image Classification…Methods
Supervised Method
Supervised classification requires the analyst to select training areas where they know what is on the ground, and then digitize a polygon within that area.

(figure: a digital image with known conifer, water, and deciduous areas, each yielding a mean spectral signature)

Sutton, L. Image Classification. Materials from Satellite Image Classification & Change Detection at Portland State University.

68
Image Classification…Methods
Supervised Method
The spectral signature of each pixel gets matched with the training signatures and the image is classified accordingly.

(figure: the spectral signature of the next unknown pixel in the multispectral image is compared against the mean spectral signatures for conifer, deciduous, and water to produce the classified image)

69
Image Classification…Training Sites (or Regions of Interest)
Key Characteristics

• General rule: If using n bands of data, then >10n pixels of training data should be collected for
each class
• Size: Must be large enough to provide accurate estimates of the properties of each class
• Location: Each class should be represented by several training areas positioned throughout
the image
• Number: 5 to 10 GCPs per class minimum. You want to make sure spectral properties of each
class are represented
• Uniformity: Each training area should exhibit unimodal frequency distribution for each spectral
band.

70
Image Classification…Selecting training sites
Minimizing confusion
• Confusion of land cover classes is common in land cover classification because:
 – Land cover types are spectrally similar (i.e. different vegetation or crop types)
 – Shadows or clouds (the white puffs with black shadows are clouds)
 – Training sites are delineated too broadly OR they are not capturing enough variability.

(figure: a training site that includes too many land cover types, and therefore too much spectral variability, compared with training sites that better represent the spectral variability in agricultural fields)

Image Credit: USGS/NASA
71

Image Classification…Selecting Training Sites
Creating a polygon vs. Region growing

Creating a Polygon
• For large areas that are fairly homogeneous (such as large water bodies), you can draw your own training site.

Region Growing
• However, in areas that have a lot of spectral variation it is better to use a region-growing tool.

72
Image Classification…Selecting Training Sites
What is region growing?
• Region Growing Tool: creates a training site based on spectral characteristics.
• Training Site: created by including adjacent pixels within a spectral threshold that you choose.
• Threshold: based on the pixel values in the image.
• Example: for an image that has been converted to reflectance values (which will range from 0.01 to 1), your spectral threshold might be 0.08.

(figure: a training site created in a vegetated area with a lot of spectral variation; zooming in shows that it only includes pixels with values similar to each other)

73
Image Classification…Selecting Training Sites
Region Growing Example

• The central pixel in image a) is used as the seed, or starting point, to grow the region.
• The pixel values for a single band are shown in image b).
• The training site parameters are defined in c) and d):
 – Spectral distance = 0.1
 – Minimum size = 1 pixel
 – Maximum ROI width = 5 pixels
• The training area is created based on these training site parameters.

74
Credit: Semi-Automatic Classification Plugin Documentation
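The growth rule above can be sketched as a flood fill that only accepts 4-connected neighbours within the spectral-distance threshold. The 3×3 single-band image and seed position below are invented for illustration; the threshold mirrors the example's 0.1, and the minimum-size and maximum-ROI-width parameters are omitted for brevity.

```python
def grow_region(image, seed_row, seed_col, spectral_distance=0.1):
    """Grow a training site from a seed pixel: accept 4-connected neighbours
    whose value is within spectral_distance of the seed value."""
    seed_value = image[seed_row][seed_col]
    region, frontier = set(), [(seed_row, seed_col)]
    while frontier:
        r, c = frontier.pop()
        if (r, c) in region:
            continue
        region.add((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < len(image) and 0 <= nc < len(image[0])
                    and abs(image[nr][nc] - seed_value) <= spectral_distance):
                frontier.append((nr, nc))
    return region

# Toy 3x3 single-band reflectance image (invented values)
image = [[0.30, 0.32, 0.90],
         [0.31, 0.35, 0.88],
         [0.85, 0.33, 0.87]]
print(sorted(grow_region(image, 0, 0)))  # → [(0, 0), (0, 1), (1, 0), (1, 1), (2, 1)]
```

Note that the bright pixels (0.85–0.90) are excluded even when adjacent, which is exactly how the tool keeps a training site spectrally homogeneous.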
Image Classification…Classification Algorithms

Classification Algorithms
• Used to classify the whole image by comparing the spectral characteristics of each pixel to the spectral characteristics of the training data for each land cover class

• Different available methods:
 – Minimum Distance
 – Maximum Likelihood
 – Spectral Angle Mapping

• These methods define the classes in different ways based on their statistics

75
Image Classification…Classification Algorithms
Minimum Distance vs. Maximum Likelihood

Minimum Distance:
• The mean value for each class and each band is calculated
• Each pixel is assigned to the class whose mean has the shortest Euclidean distance

Maximum Likelihood:
• The mean and standard deviations for each class and each band are calculated
• Calculates the probability that a pixel falls within a particular class

76
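A minimum-distance classifier in miniature, following the two steps above: compute per-class, per-band means from training pixels, then label each pixel by the nearest mean in Euclidean distance. The two classes and their two-band training values are made-up numbers for illustration, not from the slides.

```python
import math

# Hypothetical two-band training pixels for two classes (invented values)
training = {
    "water":  [(0.05, 0.02), (0.06, 0.03)],
    "forest": [(0.04, 0.45), (0.05, 0.50)],
}

# Step 1: the mean value for each class and each band is calculated
means = {cls: tuple(sum(band) / len(pix) for band in zip(*pix))
         for cls, pix in training.items()}

# Step 2: assign a pixel to the class with the shortest Euclidean distance
def classify(pixel):
    return min(means, key=lambda cls: math.dist(pixel, means[cls]))

print(classify((0.05, 0.40)))  # → forest
```

A maximum-likelihood classifier would replace step 2 with a probability computed from the class means and standard deviations.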
Segmentation techniques: description, advantages, and disadvantages

Thresholding Method
• Description: focuses on finding peak values based on the histogram of the image to find similar pixels
• Advantages: doesn’t require complicated pre-processing; simple
• Disadvantages: many details can get omitted; threshold errors are common

Edge-Based Method
• Description: based on discontinuity detection, unlike similarity detection
• Advantages: good for images having better contrast between objects
• Disadvantages: not suitable for noisy images

Region-Based Method
• Description: based on partitioning an image into homogeneous regions
• Advantages: works really well for images with a considerable amount of noise; can take user markers for faster evaluation
• Disadvantages: time and memory consuming
Other advanced classifiers: machine learning options
Highly sophisticated options, generally used by more advanced image analysts, include:
1. Support vector machines (SVM):
 • SVMs represent a group of theoretically superior machine learning algorithms.
 • SVMs are particularly appealing in the remote sensing field due to their ability to generalize well even with limited training samples, a common limitation for remote sensing applications.
 • See details at https://www.sciencedirect.com/science/article/abs/pii/S0924271610001140

2. Neural networks:
 • A series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates.
 • See details at https://www.sciencedirect.com/science/article/abs/pii/0098300494000826

78
Summary of remote sensing image classification techniques

Pixel-based techniques
 Characteristics: each pixel is assumed pure and typically labeled as a single land cover type
 Examples of classifiers:
  – Unsupervised (e.g. k-means, ISODATA, SOM, hierarchical clustering)
  – Supervised (e.g. Maximum likelihood, Minimum distance-to-means, Mahalanobis distance, Parallelepiped, k-nearest Neighbors)
  – Machine learning (e.g. artificial neural network, classification tree, random forests, support vector machine, genetic algorithms)

Sub-pixel-based techniques
 Characteristics: each pixel is considered mixed, and the areal proportion of each class is estimated
 Examples of classifiers: fuzzy classification, neural networks, regression modeling, regression tree analysis, spectral mixture analysis, fuzzy-spectral mixture analysis

Object-based techniques
 Characteristics: geographical objects, instead of individual pixels, are considered the basic unit
 Examples of classifiers: image segmentation and object-based image analysis techniques (e.g. eCognition, ArcGIS Feature Analyst)

See details: https://www.tandfonline.com/doi/pdf/10.5721/EuJRS20144723


Summary of spatio-contextual remote sensing image classification techniques

Texture extraction
 Role of spatio-contextual information: incorporation of texture metrics can improve the classification accuracy by mitigating spectral confusion among spectrally similar classes
 Classifier types:
  – Structural texture extraction (e.g. mathematical morphology techniques)
  – Statistical texture extraction (e.g. first-order statistics, second-order statistics, texture spectrum, semivariance)
  – Model-based texture extraction (e.g. fractal models, autoregressive models, MRF models)
  – Transform texture extraction (e.g. Fourier transform, Gabor transform, wavelet transforms)

Markov random fields (MRFs)
 Role of spatio-contextual information: MRFs incorporate spatio-contextual information into a classifier by modifying the discriminant function with an additional spatial correlation term
 Classifier types: integrated algorithm of MRFs and SVM; adaptive MRFs

Image segmentation and object-based image analysis
 Role of spatio-contextual information: spatio-contextual information is incorporated in the image segmentation process; each segment contains spatially contiguous and homogeneous pixels, which avoids salt-and-pepper noise
 Classifier types:
  – Image segmentation (e.g. region-growing, Markovian methods, watershed methods, hierarchical algorithms)
  – Object-based image analysis techniques (e.g. SVM, nearest neighbor classifier)
Accuracy Assessment

 Because it is not practical to test every pixel in the classification image, a representative sample of reference points in the image with known class values is used.

Ground Reference Test Pixels
• Locate ground reference test pixels (or polygons, if the classification is based on human visual interpretation) in the study area.
• These sites are not used to train the classification algorithm and therefore represent unbiased reference information.
• It is possible to collect some ground reference test information prior to the classification, perhaps at the same time as the training data.
• Most often it is collected after the classification, using a random sample to gather the appropriate number of unbiased observations per class.
Sample Size

The sample size N to be used to assess the accuracy of a land-use classification map, from binomial probability theory, is:

 N = Z² (p)(q) / E²

where:
 p = expected percent accuracy,
 q = 100 − p,
 E = allowable error,
 Z = 2 (from the standard normal deviate of 1.96 for the 95% two-sided confidence level).
Sample Size

Example 1: With an expected accuracy of 85% at an allowable error of 5% (i.e., 95% confidence), the number of points necessary for reliable results is:

 N = 2² (85)(15) / 5² = 5,100 / 25 = a minimum of 204 points

Example 2: With an expected map accuracy of 85% and an acceptable error of 10%, the sample size for a map would be 51:

 N = 2² (85)(15) / 10² = 5,100 / 100 = 51 points
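The formula and both worked examples can be checked with a one-line helper (Z = 2 as in the slides; the result is rounded up, since a partial sample point is impossible):

```python
import math

def sample_size(p, e, z=2):
    """Points needed for expected accuracy p (%) and allowable error e (%)."""
    return math.ceil(z**2 * p * (100 - p) / e**2)

print(sample_size(85, 5))    # example 1 → 204
print(sample_size(85, 10))   # example 2 → 51
```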
Accuracy assessment “best practices”

• 30–50 reference points per class is ideal
• Reference points should be derived from imagery or data acquired at or near the same time as the classified image
• If no other option is available, use the original image to visually evaluate the reference points (effective for generalized classification schemes)
Sample Design
 There are basically five common sampling
designs used to collect ground reference test data
for assessing the accuracy of a remote sensing–
derived thematic map:

a) random sampling,
b) systematic sampling,
c) stratified random sampling,
d) stratified systematic unaligned sampling ,
e) cluster sampling.
Commonly Used Methods of Generating Reference Points

• Random: no rules are used; created using a completely random process
• Stratified random: points are generated proportionate to the distribution of classes in the image
• Equalized random: each class has an equal number of random points
• The latter two are the most widely used, to make sure each class has some points
• With a “stratified random” sample, a minimum number of reference points in each class is usually specified (e.g., 30)
• For example, a 3-class image (80% forest, 10% urban, 10% water) & 30 reference points:
  completely random: might give 29 forest, 0 urban, 1 water
  stratified random: 24 forest, 3 urban, 3 water
  equalized random: 10 forest, 10 urban, 10 water
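The stratified and equalized allocations above follow directly from the class proportions; a completely random draw would need an actual random sample, so only the two deterministic schemes are computed here:

```python
# Class proportions and point budget from the example above
proportions = {"forest": 0.80, "urban": 0.10, "water": 0.10}
total_points = 30

# Stratified random: counts proportional to class area
stratified = {c: round(p * total_points) for c, p in proportions.items()}
# Equalized random: the point budget split evenly across classes
equalized = {c: total_points // len(proportions) for c in proportions}

print(stratified)  # → {'forest': 24, 'urban': 3, 'water': 3}
print(equalized)   # → {'forest': 10, 'urban': 10, 'water': 10}
```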
Error Matrix
• A tabular result of the comparison of the pixels in a classified image to known reference information
• Rows and columns of the matrix contain pixel counts
• Permits the calculation of the overall accuracy of the classification, as well as the accuracy of each class

                                classified image
                     forest  shrubland  grassland  urban  totals
 reference   forest     150          5         15     10     180
 data     shrubland      15         55          5      5      80
          grassland      10         20        105      5     140
              urban      25         20          5     50     100
             totals     200        100        130     70     500
Evaluation of Error Matrices

• Ground reference test information is compared pixel by pixel (or polygon by polygon when the remote sensor data are visually interpreted) with the information in the remote sensing–derived classification map.
• Agreement and disagreement are summarized in the cells of the error matrix.
• Information in the error matrix may be evaluated using simple descriptive statistics or multivariate analytical statistical techniques.

Types of Accuracy
1) Overall accuracy
2) Producer’s accuracy
3) User’s accuracy
4) Kappa coefficient
Descriptive Statistics

• The overall accuracy of the classification map is determined by dividing the total correct pixels (sum of the major diagonal) by the total number of pixels in the error matrix (N).
• The total number of correct pixels in a category is divided by the total number of pixels of that category as derived from the reference data (in the matrix shown here, the row total).
• This statistic indicates the probability of a reference pixel being correctly classified and is a measure of omission error.
• This statistic is the producer’s accuracy, because the producer (the analyst) of the classification is interested in how well a certain area can be classified.
Producer’s Accuracy

(same error matrix as above)

• The probability that a reference pixel is properly classified
• Measures the “omission error” (reference pixels improperly classified are omitted from the proper class)
• Forest: 150/180 = 83.33%
Overall Accuracy

• The total number of correctly classified samples divided by the total number of samples
• Measures the accuracy of the entire image without reference to the individual categories
• Sensitive to differences in sample size
• Biased towards classes with larger samples

(same error matrix as above)

• Based on how much of each image class was correctly classified
• Total number of correctly classified pixels divided by the total number of pixels in the matrix
• (150 + 55 + 105 + 50)/500 = 360/500 = 72%
Descriptive Statistics

• If the total number of correct pixels in a category is divided by the total number of pixels that were actually classified in that category, the result is a measure of commission error.
• This measure, called the user’s accuracy or reliability, is the probability that a pixel classified on the map actually represents that category on the ground.
User’s Accuracy

(same error matrix as above)

• The probability that a pixel on the map represents the correct land cover category
• Measures the “commission error” (image pixels improperly classified are committed to another reference class)
• Forest: 150/200 = 75%
Kappa Analysis

Khat Coefficient of Agreement:
• Kappa analysis yields a statistic, K̂, which is an estimate of Kappa.
• It is a measure of agreement or accuracy between the remote sensing–derived classification map and the reference data, as indicated by:
 a) the major diagonal, and
 b) the chance agreement, which is indicated by the row and column totals (referred to as marginals).

Kappa Coefficient
• Expresses the proportionate reduction in error generated by the classification in comparison with a completely random process.
• A value of 0.82 implies that 82% of the errors of a random classification are being avoided.
• The Kappa coefficient is not as sensitive to differences in sample sizes between classes, and is therefore considered a more reliable measure of accuracy; Kappa should always be reported.
• A Kappa of 0.8 or above is considered a good classification; 0.4 or below is considered poor.
Kappa Coefficient

 K̂ = ( M · Σᵢ nᵢᵢ − Σᵢ (nᵢ₊ · n₊ᵢ) ) / ( M² − Σᵢ (nᵢ₊ · n₊ᵢ) ),  with the sums over i = 1 … r

Where:
 r = number of rows (and columns) in the error matrix
 nᵢᵢ = number of observations in row i, column i (the major diagonal)
 nᵢ₊ = total number of observations in row i
 n₊ᵢ = total number of observations in column i
 M = total number of observations in the matrix
Kappa Coefficient

(same error matrix as above)

 K̂ = (M Σᵢ nᵢᵢ − Σᵢ nᵢ₊ n₊ᵢ) / (M² − Σᵢ nᵢ₊ n₊ᵢ)
    = (500 × 360 − [(180 × 200) + (80 × 100) + (140 × 130) + (100 × 70)])
      / (500² − [(180 × 200) + (80 × 100) + (140 × 130) + (100 × 70)])
    = (180,000 − 69,200) / (250,000 − 69,200)
    = 110,800 / 180,800
    = 0.613

