
PROJECT STAGE-I REPORT

TITLE: Satellite Image Processing for detection of


Environmental Change Pattern

Group No. : 039


Srayee Banik (1952071)
Hriddha Bhowmik (1952078)
Rashi Sharraf (1952095)
Olivea Roy (1952098)

Under Supervision of
Prof.(Dr.) Anindya Sen

Department of Electronics and Communication Engineering

Heritage Institute of Technology

CONTENTS

SL. No Topics Page No.


Satellite Image Processing 3
Environmental Change Pattern 4
Ch 1. Global Warming 4-5
Ch 2. Change in Temperature 6-7
Ch 3. Ozone Layer Depletion 7-8
Ch 4. Melting Of Icecaps 8-10
Ch 5. Rise in Sea Level 11-18
Ch 5.1. How do we measure global sea-level? 11-13
Ch 5.2. Sea-Level rise due to ice melting 14-17
Ch 5.3. Sea-level rise due to thermal expansion 17-18
Ch 6. Uneven distribution of rainfall 19-26
Ch 6.1 Drought 22-24
Ch 6.2 Floods 24-26
Ch 7. Tsunami 26-36
Ch 8. Change in Forest Cover 37-39
Acknowledgement 40
References 41

Satellite Image Processing

Satellite Image Processing is an important field in research and development and


concerns images of the Earth acquired by means of artificial satellites.
First, the photographs are captured in digital form; they are then processed by
computers to extract information. Statistical methods are applied to the digital
images, and after processing, the various discrete surfaces are identified by analyzing the
pixel values. Satellite imagery is widely used to plan infrastructure, to
monitor environmental conditions, and to detect the early signs of approaching
disasters. In broader terms, Satellite Image Processing is a kind of
remote sensing which works on pixel resolutions to collect coherent information about
the Earth's surface.
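To make the idea of identifying discrete surfaces from pixel values concrete, here is a minimal sketch; the band values and thresholds are invented for illustration and are not from any particular sensor (real analyses would use calibrated, multi-band data read with a library such as rasterio or GDAL):

```python
# Hypothetical illustration: assigning surface classes from single-band pixel
# values (0-255). Thresholds are invented for the sketch.
def classify_pixel(value):
    """Assign a surface class from a single-band pixel value."""
    if value < 60:
        return "water"        # dark surfaces reflect little energy
    elif value < 140:
        return "vegetation"   # mid-range reflectance
    else:
        return "bare soil"    # bright surfaces

row = [30, 75, 180, 52, 130]
print([classify_pixel(v) for v in row])
# ['water', 'vegetation', 'bare soil', 'water', 'vegetation']
```

In practice the thresholds themselves come from the statistical methods mentioned above (histogram analysis, clustering, or supervised classification) rather than being fixed by hand.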
Majorly, there are four kinds of resolution associated with satellite imagery. These are:
1. Spatial resolution –
It is determined by the sensor's Instantaneous Field of View (IFoV) and is defined as
the ground area represented by a single pixel of the image. Because it describes the
sensor's resolving power, i.e. its ability to separate objects on the ground, it is termed
spatial resolution.
2. Spectral resolution –
This resolution measures the wavelength interval size and determines the number of
wavelength intervals that the sensor measures.
3. Temporal resolution –
The word temporal is associated with time, and temporal resolution is defined as the
time that elapses between successive images of the same area.
4. Radiometric resolution –
This resolution provides the actual characteristics of the image and is generally
expressed in bits. It gives the effective bit depth and determines the number of
brightness levels that the imaging system can record.
Thus, Satellite Image Processing has a huge number of applications in research and
development fields, in remote sensing, in astronomy, and now even in cloud computing
on a large scale.
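The relationship between radiometric resolution and recordable brightness levels is simply two raised to the bit depth; a tiny sketch (the example bit depths are typical of real sensors, but no specific satellite is implied):

```python
def brightness_levels(bit_depth):
    """Number of distinguishable brightness levels for a given bit depth."""
    return 2 ** bit_depth

# An 8-bit sensor records 256 levels; an 11-bit sensor records 2048.
for bits in (8, 11, 12):
    print(bits, "bits ->", brightness_levels(bits), "levels")
```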

Environmental Change Pattern

Rising global average temperature is associated with widespread changes in weather


patterns. Scientific studies indicate that extreme weather events such as heat waves and
large storms are likely to become more frequent or more intense with human-induced
climate change.
Long-term changes in climate can directly or indirectly affect many aspects of society
in potentially disruptive ways. For example, warmer average temperatures could
increase air conditioning costs and affect the spread of diseases like Lyme disease, but
could also improve conditions for growing some crops. More extreme variations in
weather are also a threat to society. More frequent and intense extreme heat events can
increase illnesses and deaths, especially among vulnerable populations, and damage
some crops. While increased precipitation can replenish water supplies and support
agriculture, intense storms can damage property, cause loss of life and population
displacement, and temporarily disrupt essential services such as transportation,
telecommunications, energy, and water supplies.

The change pattern may include changes ranging from very small to very large, such as:
a. Global warming and change in temperature
b. Ozone layer depletion
c. Melting of icecaps
d. Rise in sea level
e. Natural disasters
f. Change in forest cover

Chapter 1: Global Warming

Global warming is the long-term heating of Earth’s surface observed since the pre-
industrial period (between 1850 and 1900) due to human activities, primarily fossil fuel
burning, which increases heat-trapping greenhouse gas levels in Earth’s atmosphere.
This term is not interchangeable with the term "climate change."

Since the pre-industrial period, human activities are estimated to have increased Earth’s
global average temperature by about 1 degree Celsius (1.8 degrees Fahrenheit), a
number that is currently increasing by more than 0.2 degrees Celsius (0.36 degrees
Fahrenheit) per decade. The current warming trend is unequivocally the result of human
activity since the 1950s and is proceeding at a rate unprecedented over millennia.

Global warming occurs when carbon dioxide (CO2) and other air pollutants collect in
the atmosphere and absorb sunlight and solar radiation that have bounced off the earth’s
surface. Normally this radiation would escape into space, but these pollutants, which
can last for years to centuries in the atmosphere, trap the heat and cause the planet to
get hotter. These heat-trapping pollutants—specifically carbon dioxide, methane,
nitrous oxide, water vapor, and synthetic fluorinated gases—are known as greenhouse
gases, and their impact is called the greenhouse effect.

Though natural cycles and fluctuations have caused the earth’s climate to change
several times over the last 800,000 years, our current era of global warming is directly
attributable to human activity—specifically to our burning of fossil fuels such as coal,
oil, gasoline, and natural gas, which results in the greenhouse effect. In the United
States, the largest source of greenhouse gases is transportation (29 percent), followed
closely by electricity production (28 percent) and industrial activity (22 percent).
Curbing dangerous climate
change requires very deep cuts in emissions, as well as the use of alternatives to fossil
fuels worldwide. The good news is that countries around the globe have formally
committed—as part of the 2015 Paris Climate Agreement—to lower their emissions by
setting new standards and crafting new policies to meet or even exceed those standards.
The not-so-good news is that we’re not working fast enough. To avoid the worst
impacts of climate change, scientists tell us that we need to reduce global carbon
emissions by as much as 40 percent by 2030. For that to happen, the global community
must take immediate, concrete steps: to decarbonize electricity generation by equitably
transitioning from fossil fuel–based production to renewable energy sources like wind
and solar; to electrify our cars and trucks; and to maximize energy efficiency in our
buildings, appliances, and industries.

Chapter 2: Change in Temperature

Satellite temperature measurements are inferences of the temperature of


the atmosphere at various altitudes as well as sea and land surface temperatures
obtained from radiometric measurements by satellites. These measurements can be
used to locate weather fronts, monitor the El Niño-Southern Oscillation, determine the
strength of tropical cyclones, study urban heat islands and monitor the global
climate. Wildfires, volcanoes, and industrial hot spots can also be found via thermal
imaging from weather satellites. They can also be used as part of instrumental
temperature records of Earth's climate system.
Weather satellites do not measure temperature directly. They measure radiances in
various wavelength bands. Since 1978 microwave sounding units (MSUs) on National
Oceanic and Atmospheric Administration polar orbiting satellites have measured the
intensity of upwelling microwave radiation from atmospheric oxygen, which is
related to the temperature of broad vertical layers of the atmosphere. Measurements
of infrared radiation pertaining to sea surface temperature have been collected since
1967. Satellite sensors that measure infrared radiation infer the amount of heat emitted
from an object at the Earth's surface.
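The step from measured radiance to an inferred temperature can be illustrated with Planck's law and its inversion to a "brightness temperature". The 10.9 µm wavelength below is just a typical thermal-infrared band chosen for the example; real retrievals also correct for surface emissivity and the atmosphere:

```python
import math

H = 6.62607015e-34  # Planck constant (J s)
C = 2.99792458e8    # speed of light (m/s)
K = 1.380649e-23    # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance (W m^-2 sr^-1 m^-1) of a blackbody at temp_k."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0)

def brightness_temperature(wavelength_m, radiance):
    """Invert Planck's law: the temperature a blackbody would need to emit
    the measured radiance at this wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return H * C / (wavelength_m * K * math.log(1.0 + a / radiance))

wl = 10.9e-6  # an assumed thermal-infrared band, ~10.9 micrometres
L = planck_radiance(wl, 300.0)
print(round(brightness_temperature(wl, L), 3))  # 300.0 (round trip)
```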
The four main types of temperature sensors are:

a. Thermocouples

Thermocouples are the most commonly used type of temperature sensor. They are
used in industrial, automotive, and consumer applications. Thermocouples are self-
powered, require no excitation, can operate over a wide temperature range, and have
quick response times.

Thermocouples are made by joining two dissimilar metal wires together. This causes
a Seebeck Effect. The Seebeck Effect is a phenomenon in which a temperature
difference of two dissimilar conductors produces a voltage difference between the two
substances. It is this voltage difference that can be measured and used to calculate
the temperature. Some disadvantages of thermocouples include the fact that measuring
temperature can be challenging because of their small output voltage, which requires
precise amplification; their susceptibility to external noise over long wires; and the cold
junction. The cold junction is where the thermocouple wires meet the copper traces of
the signal circuitry. This creates another Seebeck Effect, which needs to be
compensated for; this is called cold-junction compensation.
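A minimal sketch of this conversion, assuming a constant Seebeck coefficient (roughly the type-K value near room temperature) and an already-known cold-junction temperature; practical instruments use the standard NIST polynomial tables instead of this linear approximation:

```python
# Linear thermocouple sketch: temperature from measured voltage plus
# cold-junction compensation. The constant sensitivity is an approximation
# (~41 uV/degC is typical for type K near room temperature).
SEEBECK_UV_PER_C = 41.0  # assumed approximate type-K sensitivity

def thermocouple_temp_c(measured_uv, cold_junction_c):
    """Hot-junction temperature (degC) from the measured microvolts,
    compensated for the cold-junction temperature."""
    return measured_uv / SEEBECK_UV_PER_C + cold_junction_c

# 4100 uV measured while the cold junction sits at 25 degC:
print(thermocouple_temp_c(4100.0, 25.0))  # 125.0
```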

b. RTD (Resistance Temperature Detector)


As temperature changes, the resistance of any metal changes as well. This difference
in resistance is what RTD temperature sensors are based on. An RTD is a resistor with
well-defined resistance vs. temperature characteristics. Platinum is the most common
and accurate material used to make RTDs.
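The well-defined resistance-vs-temperature characteristic of a platinum RTD is commonly written with the Callendar-Van Dusen equation; below is a sketch using the standard IEC 60751 Pt100 coefficients (the form shown is valid for 0 °C and above):

```python
# Resistance of a Pt100 RTD via the Callendar-Van Dusen equation,
# R(T) = R0 * (1 + A*T + B*T^2), with IEC 60751 coefficients.
R0 = 100.0        # ohms at 0 degC for a Pt100
A = 3.9083e-3     # 1/degC
B = -5.775e-7     # 1/degC^2

def pt100_resistance(temp_c):
    """Resistance in ohms at the given temperature (valid for T >= 0 degC)."""
    return R0 * (1.0 + A * temp_c + B * temp_c**2)

print(round(pt100_resistance(0.0), 2))    # 100.0
print(round(pt100_resistance(100.0), 2))  # 138.51
```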

c. Thermistors
Thermistors are similar to RTDs in that temperature changes cause measurable
resistance changes. Thermistors are usually made from a polymer or ceramic material.
In most cases, thermistors are cheaper but are also less accurate than RTDs. Most
thermistors are available in two wire configurations.
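A common way to convert a thermistor's measured resistance to temperature is the simplified beta model, a lighter-weight cousin of the fuller Steinhart-Hart equation; the R0 and beta values below are typical 10 kΩ NTC datasheet numbers, assumed here for the sketch:

```python
import math

# Beta-model conversion for an NTC thermistor: R(T) = R0 * exp(B*(1/T - 1/T0)).
# Solving for T gives the function below. Values are assumed datasheet numbers.
R0 = 10_000.0   # ohms at the reference temperature
T0 = 298.15     # reference temperature: 25 degC in kelvin
BETA = 3950.0   # beta constant in kelvin

def thermistor_temp_c(resistance_ohms):
    """Convert measured NTC resistance to temperature in degrees Celsius."""
    inv_t = 1.0 / T0 + math.log(resistance_ohms / R0) / BETA
    return 1.0 / inv_t - 273.15

print(round(thermistor_temp_c(10_000.0), 2))  # 25.0 (at the reference point)
```

Note the inverse relationship: for an NTC device, lower resistance means higher temperature.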
d. Semiconductor-based ICs
Semiconductor-based temperature sensor ICs come in two different types: local
temperature sensors and remote digital temperature sensors. Local temperature sensors
are ICs that measure their own die temperature by using the physical properties of a
transistor. Remote digital temperature sensors measure the temperature of an external
transistor.

Chapter 3: Ozone Layer Depletion

The ozone layer is mainly found in the lower portion of the stratosphere. It has
the potential to absorb around 97-99% of the harmful ultraviolet radiation coming
from the sun that can damage life on earth. If the ozone layer were absent, millions of
people would develop skin diseases and might have weakened immune systems.
However, scientists have discovered a hole in the ozone layer over Antarctica. This
has focussed their concern on various environmental issues and steps to control them.
The main causes of the ozone hole are chlorofluorocarbons, carbon tetrachloride,
methyl bromide, and hydrochlorofluorocarbons.
Ozone layer depletion is the thinning of the ozone layer present in the upper
atmosphere. This happens when chlorine and bromine atoms in the atmosphere
come in contact with ozone and destroy the ozone molecules. One chlorine atom can
destroy 100,000 molecules of ozone, so ozone is destroyed more quickly than it is created.
Some compounds release chlorine and bromine on exposure to intense ultraviolet light,
which then contributes to ozone layer depletion. Such compounds are known as
Ozone Depleting Substances (ODS). The ozone-depleting substances that contain
chlorine include chlorofluorocarbons, carbon tetrachloride, hydrochlorofluorocarbons,
and methyl chloroform, whereas the ozone-depleting substances that contain
bromine are halons, methyl bromide, and hydrobromofluorocarbons.
Chlorofluorocarbons are the most abundant ozone-depleting substances. A chlorine
atom stops destroying ozone only when it reacts with some other molecule.

Monitoring of the ozone layer has increased significantly since the 1980s when
the Antarctic ozone hole was first discovered by the British Antarctic Survey. The
ozone layer is monitored both by satellites and ground-based resources that are

dedicated to observing the destruction of stratospheric ozone. The main satellite that
monitors the ozone layer is the TOMS (Total Ozone Mapping Spectrometer) satellite.
The TOMS satellite measures the ozone levels from the back-scattered sunlight in
the ultraviolet (UV) range. Another satellite is NASA's UARS (Upper Atmosphere
Research Satellite) which was launched in September 1991. This satellite is unique
because it was configured to not only measure ozone levels, but also levels of ozone-
depleting chemicals. GOME, launched in April 1995 on the ERS-2 satellite, marked
the beginning of a long-term European ozone monitoring effort. Scientists receive
high quality data on the global distribution of ozone and several other climate-
influencing trace gases in the Earth's atmosphere.

In 1987, Canada became the first country in the world to focus on the Arctic ozone
layer, following the discovery of the ozone hole over the Antarctic. A cross-country
network of monitoring stations has kept continuous watch on Canada’s ozone
layer for more than three decades. The existence of these early records, before any
major human influence on the upper atmosphere, is vital to understanding the changes
that have occurred in the ozone layer. In the UK, stratospheric ozone levels are
monitored every winter and spring at Camborne in Cornwall and Lerwick in the
Shetland Isles.

When NASA launched the Nimbus-4 satellite 50 years ago, nobody knew the ozone
layer over Antarctica was thinning. And nobody knew that chlorofluorocarbons
(CFCs)—long-lived chemicals that had been used in refrigerators and aerosol sprays
since the 1930s—were responsible. But the mission included a sensor called
the Backscatter Ultraviolet (BUV) experiment, capable of measuring ozone
nonetheless. Developed by a team led by NASA's Donald Heath, BUV was the first
space-based instrument to measure the total amount of ozone in the Earth's
atmosphere. Fortunately, the launch went smoothly. The BUV performed well and
demonstrated a new way to measure total column ozone. This led to the Total Ozone
Mapping Spectrometer (TOMS) on Nimbus-7. The ozone data it collected gave
researchers baseline measurements that, in the mid-1980s, helped them recognize that
a troubling hole in the ozone layer had opened up.
The global recognition of the destructive potential of CFCs soon led to the 1987
Montreal Protocol, a treaty phasing out the production of ozone-depleting chemicals. It
proposed to stop the use, production, and import of ozone-depleting substances
and to minimise their concentration in the atmosphere in order to protect the ozone
layer of the earth.


Chapter 4: Melting of Icecaps

A case study from the Himalayas (Chhota Shigri):


The Himalayas are the youngest and highest mountains of the world and have the
largest concentration of glaciers outside the polar caps, with glacier coverage of
33,000 square kilometers. The region is aptly called the "Water Tower of Asia" as it
provides around 8.6 × 10^6 m^3 of water annually. Glaciers in the Himalayas feed many
important rivers of Asia, including the Ganga, Amu Darya, Indus, Brahmaputra,
Irrawaddy, Salween, Mekong, Yangtze, Yellow, and Tarim. Apart from feeding the
rivers, the Himalayas also play a significant role in the meteorological conditions of
India. As it is well established that the climate is changing, this change is reflected in
glacier behavior in terms of size, health, and runoff.
To study the effect of climate change on glacier length, the Chhota Shigri glacier was
chosen, as it is located in a climatically important region of the Himalayas. The glacier
lies in the Chandra basin, a sub-basin of the Chenab basin, in
Himachal Pradesh, India. Chhota Shigri was selected as the benchmark glacier in the
HKH region by the International Commission of Snow and Ice (now the
Commission for Cryospheric Sciences) in 2002. The glacier has been monitored and
studied by many glaciologists. Continuous field mass balance measurement was
carried out on the glacier by a joint team of Indian and French researchers from 2002 to
2007 [14]. The glacier has shown a negative mass balance for the last 20 years [15]. The
cumulative specific mass balance of the Chhota Shigri glacier from 1986-1989 was -0.21
m w.e.
The change in the length and in area of the Chhota Shigri glacier was studied from
1962 to 2008 using various data sources, including Toposheet maps and satellite
images from different sensors. This study was done using satellite images obtained
from 1999-2008 from Indian remote sensing satellite (IRS) images, images from
ASTER, and a 1963 Survey of India topographic map (1:50,000 scale) of Chhota
Shigri glacier.
LISS-III obtains high-resolution (23.5 meter) images of the Earth in four different
wavelengths of the electromagnetic spectrum, ranging from visible to short wave
infrared (0.52 µm – 1.75 µm). The AWiFS sensor is an improved version of the
WiFS sensor flown on IRS-1C/1D. AWiFS operates in four spectral bands identical to
LISS-III, with a spatial resolution of 56 meters. ASTER (Advanced Spaceborne
Thermal Emission and Reflection Radiometer) is an imaging instrument flying on
Terra, a satellite launched in December 1999 as part of NASA's Earth Observing
System (EOS). It obtains high-resolution (15 to 90 meter) images of the Earth
in 14 different wavelengths of the electromagnetic spectrum, ranging from visible to
thermal infrared.
According to the satellite image study, the glacier retreated at a rate of 15 meters
per year from 1999-2008 (Figure 3a). The average annual loss in glacier area from 1999-
2008 was 0.0215 square kilometers, with a standard deviation of 0.0179. Figure 3b
shows the trend in the change in the glacier area from 1962-2008. There is a
decreasing trend in the glacier area with r^2 = 0.95. The glacier has been in a
continuous state of retreat, as shown by the remote sensing data.
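A decreasing trend with a reported r^2 comes from an ordinary least-squares fit of glacier area against year. The sketch below shows such a fit in pure Python; the (year, area) pairs are invented for illustration and are not the study's actual measurements:

```python
# Ordinary least-squares trend and coefficient of determination (r^2).
def linear_trend(xs, ys):
    """Return (slope, r_squared) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical glacier areas (km^2) at a few survey years:
years = [1962, 1980, 1999, 2004, 2008]
areas = [16.5, 16.1, 15.8, 15.7, 15.6]
slope, r2 = linear_trend(years, areas)
print(round(slope, 4), round(r2, 3))  # negative slope (area loss per year), r^2 near 1
```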

Conclusion: A glacier will advance in a favorable climate and retreat in
response to a warmer climate. We have seen from the change in the length of the
Chhota Shigri glacier over 46 years, from 1962-2008, that the glacier has retreated
significantly. This change in length is due to changes in temperature and the
snowfall pattern in the Himalayan region. The glacier length change study is also
important for melt and runoff modeling, which in turn supports
hydrological studies.

Chapter 5: Rise in sea level

➢ Sea level rise is caused primarily by two factors related to global warming: the
added water from melting ice sheets and glaciers, and the expansion of
seawater as it warms.

➢ Ice melting + Thermal expansion = Total sea level rise

5.1: How do we measure global sea level rise?


Tide gauges measure relative sea level change at points along the coast, while satellite
instruments measure absolute sea level change over nearly the entire ocean surface.
Many tide gauges have collected data for more than 100 years, while satellites have
collected data since the early 1990s.
Altimetry:
Radar satellite altimetry provides global, frequent, and precise measurements of
uniform accuracy of the sea level height, relative to a desired geodetic reference frame,
at different time epochs and from various altimeter sensors.
Conceived in 1969 at the Williamstown Conference on Solid Earth and Ocean Physics,
the technology was developed through the experimental missions Skylab,
Geodynamics Experimental Ocean Satellite 3 (GEOS-3), and the SEAfaring SATellite
(SEASAT). Since the early 1990s, the various altimeter satellite missions have provided
reliable and solid information on sea level, thus enabling applications in
geodesy, oceanography, glaciology, climate research, atmosphere, wind, waves,
biology, and navigation.
The satellite flies in an orbit at a certain altitude S above the theoretical reference
ellipsoid. The altimeter on board the satellite emits a radar wave and analyses the
return signal that bounces off the surface. The time it takes for the signal to make the
trip from the satellite to the surface and back again, defines the satellite-to-surface
range R. In other words, the range is the actual distance between the satellite and the
moving sea surface. The sea surface height (SSH) at any location or point in time is a
deviation from the stable reference ellipsoid. The sea surface height is, thus, defined
as the difference between the satellite's position with respect to the reference ellipsoid,
and the satellite-to-surface range. That is SSH = S – R.
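The SSH = S – R relationship, together with the range obtained from the radar round-trip time (R = c·t/2), can be sketched as follows; the orbit altitude is roughly Jason-class and the round-trip time is chosen for the example, both assumed values:

```python
# Sea surface height from altimetry: SSH = S - R.
# S: satellite altitude above the reference ellipsoid (from precise orbit data)
# R: satellite-to-surface range, from the radar round-trip time t via R = c*t/2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_travel_time(round_trip_s):
    """Satellite-to-surface range from the radar round-trip time."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def sea_surface_height(altitude_m, range_m):
    """SSH relative to the reference ellipsoid."""
    return altitude_m - range_m

S = 1_336_000.0  # assumed ~1336 km orbit altitude
# A round-trip time consistent with a surface sitting 25 m above the ellipsoid:
t = 2.0 * (S - 25.0) / SPEED_OF_LIGHT
R = range_from_travel_time(t)
print(round(sea_surface_height(S, R), 3))  # 25.0
```

Real processing also applies instrumental, atmospheric (ionosphere, wet/dry troposphere), and geophysical corrections to R before forming SSH.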

Overall, the development of the satellite altimetry can be divided into three phases –
(1) experimental, (2) modern, and (3) future phase.

Altimeter satellite missions' timeline overview, divided into an experimental era
(yellow), modern era (green), and future altimetry era (blue), along with the missions'
orbit repetitivity and information about their countries of origin.

At present, several satellites are providing measured altimeter data:

▪ Cryogenic Satellite (CryoSat)-2, designed and built by ESA and launched in 2010,
▪ Haiyang (HY)-2a, approved and led by the China National Space Administration
(CNSA), launched in 2011,
▪ SARAL, launched in 2013 as a cooperative mission between the Indian Space
Research Organization (ISRO) and CNES,
▪ Sentinel-3, launched in 2016 by ESA and operated by EUMETSAT,
▪ Jason-3, developed jointly by NASA, NOAA, CNES, and EUMETSAT as the
successor of TOPEX/Poseidon and Jason-1/2,
▪ Haiyang (HY)-2b, launched as the second in the series of Chinese Haiyang
satellites in 2018,
▪ and Sentinel-6 Michael Freilich (previously referred to as Jason-CS), launched in
late 2020, which continues the EU Copernicus and NASA program and the previous
TOPEX/Poseidon and Jason 1/2/3 satellite missions.

The Sentinel-6 satellite mission is currently in its commissioning phase, i.e., the
calibration/validation phase. The given figure presents the Sentinel-6 sea-level anomaly
derived from 'Short Time Critical Level 2 Low Resolution' data, overlaid on a map
showing similar products from the other Copernicus altimetry missions: Jason-3,
Sentinel-3A, and Sentinel-3B.

Early Sentinel-6 measurement validation compared to Jason-3, Sentinel-3A, and
Sentinel-3B

The characteristics of previous and current satellite missions are given in the
following table:

5.2: Sea Level Rise due to Ice Melting:

"Earth is losing a huge amount of ice to the ocean annually, and these new results will
help us answer important questions in terms of both sea rise and how the planet's cold
regions are responding to global change," said onetime University of Colorado
Boulder physics professor John Wahr, who helped lead a study of the ice between
2003 and 2010.

GRACE (Gravity Recovery and Climate Experiment) and GRACE-FO (GRACE


Follow-On) satellites are used by NASA to observe how much water is being lost from
ice sheets and glaciers and the movement of water on earth.

GRACE-FO is a successor to the original GRACE mission, which operated from


April 2002 through July 2017. Since June 2018, GRACE-FO carries on the extremely
successful work of its predecessor while testing a new technology designed to
dramatically improve the already remarkable precision of its measurement system.

The GRACE missions measure small month-to-month variations in gravity over


Earth's surface arising from the constant redistribution of mass. Changes in how mass
is distributed within and between Earth's atmosphere, oceans, groundwater and ice
sheets are fundamental indicators of the large-scale dynamics of the planet.
Monitoring changes in ice sheets and glaciers, near-surface and underground water
storage, the amount of water in large lakes and rivers, as well as changes in sea level
and ocean currents provides an integrated global view of how Earth’s water cycle and
energy balance are evolving — measurements that have far-reaching impact on our
understanding of the Earth system and important applications for everyday life. In
addition to tracking mass change, each of the two satellites uses GPS antennas to
supply at least 200 profiles of atmospheric temperature distribution and water vapor
content daily to aid in the monitoring and forecasting of weather.

Artist's view of the GRACE Follow-on mission

GRACE-FO carries six instruments: K/Ka-Band Ranging (KBR), Laser Ranging
Interferometer (LRI), Accelerometer (ACC), Star Camera Assembly (SCA), Tri-
GNSS (GPS+Galileo+GLONASS), and a Radio Occultation receiver (TriG-RO).
KBR, also referred to as the Microwave Instrument (MWI), is a heritage instrument
from the GRACE mission and provides precise Satellite-to-Satellite Tracking (SST) in
low-low orbit. The LRI measures the same range fluctuations as KBR but with reduced
noise and also provides accurate measurements of the relative pointing of the two
satellites. The ACC provides better thermal stability and characterization. TriG
provides Precise Orbit Determination (POD), Global Navigation Satellite System
(GNSS) radio occultation (neutral atmosphere, ionosphere, and scintillation), and
GNSS reflections.

➢ Greenland Ice Mass Loss:

➢ Antarctic Ice Mass Loss:

5.3: Sea Level Rise due to Thermal Expansion:

The warming of Earth is primarily due to accumulation of heat-trapping greenhouse


gases, and more than 90 percent of this trapped heat is absorbed by the oceans. As this
heat is absorbed, ocean temperatures rise and water expands. This thermal
expansion contributes to an increase in global sea level.

Using measurements from Argo profiling floats, we know this warming has
continued, causing roughly one-third of the global sea-level rise observed by
satellite altimeters since 2004.
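As a back-of-envelope illustration, the sea-level contribution of uniformly warming an ocean layer can be approximated as Δh = α·ΔT·h. The expansion coefficient used below is a rough representative value for near-surface seawater (it actually varies with temperature, salinity, and pressure), assumed here for the example:

```python
# Simple thermal-expansion estimate: dh = alpha * dT * h.
ALPHA = 2.1e-4  # 1/degC, an assumed representative value for seawater

def thermal_expansion_m(layer_depth_m, warming_deg_c):
    """Sea level rise (m) from uniformly warming a layer of the given depth."""
    return ALPHA * warming_deg_c * layer_depth_m

# Warming the top 700 m of ocean by 0.1 degC:
print(round(thermal_expansion_m(700.0, 0.1) * 1000.0, 1), "mm")  # 14.7 mm
```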

Argo is an international program that measures water properties across the world’s
ocean using a fleet of robotic instruments that drift with the ocean currents and move
up and down between the surface and a mid-water level. Each instrument (float)
spends almost all its life below the surface. The name Argo was chosen because the
array of floats works in partnership with the Jason earth observing satellites that
measure the shape of the ocean surface. The data that Argo collects describes the
temperature and salinity of the water and some of the floats measure other properties

that describe the biology/chemistry of the ocean. At present (2020) Argo is collecting
12,000 data profiles each month (400 a day). This greatly exceeds the amount of data
that can be collected from below the ocean surface by any other method. Argo plans
to continue its data collection for as long as those data remain a vital tool for a wide
range of ocean applications of which understanding and predicting climate change is
but one.

Each Argo float (costing between $20,000 and $150,000 depending on the individual
float’s technical specification) is launched from a ship. The float’s weight is carefully
adjusted so that, as it sinks, it eventually stabilizes at a pre-set level, usually 1 km.
Ten days later, an internal battery-driven pump transfers oil between a reservoir inside
the float and an external bladder. This makes the float first descend to 2km and then
return to the surface measuring ocean properties as it rises. The data and the float
position are relayed to satellites and then on to receiving stations on shore. The float
then sinks again to repeat the 10 day cycle until its batteries are exhausted.
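The 10-day cycle described above can be sketched as a simple schedule; the cycle length and depths follow the description, while the battery lifetime is an assumption made for illustration:

```python
# Sketch of the Argo float duty cycle: drift at the parking depth, descend to
# the profiling depth, then rise to the surface while sampling, every 10 days.
CYCLE_DAYS = 10
PARK_DEPTH_M = 1000     # typical parking level from the description
PROFILE_DEPTH_M = 2000  # descend here before the ascending profile

def profiles_collected(battery_life_days):
    """How many ascending profiles a float completes before its battery dies."""
    return battery_life_days // CYCLE_DAYS

# A float lasting an assumed ~4 years collects roughly 146 profiles:
print(profiles_collected(4 * 365))  # 146
```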

Chapter 6: Uneven Distribution of Rainfall

Rainfall is a form of precipitation. It is a natural phenomenon by which water


that has evaporated to form clouds returns to the earth as droplets. Rainfall is the
major source of freshwater that maintains the water cycle.
When this rainfall occurs adequately, it seems a blessing to living creatures on the
earth. But if it occurs unevenly, much destruction may follow. For example,
very little or no rainfall for a prolonged period leads to drought, while too much
rainfall in a particular area over a long period leads to flood. Both droughts and
floods are natural calamities that cause huge losses of life and property and
damage the country's economy.

Causes Of Uneven Rainfall:

1. The Direction of Winds – The places located on the windward side of the
mountains receive more rainfall because when the moisture-laden winds
strike the windward side of the mountains, they cause rainfall. On the other
hand, the places located on the leeward side of the mountains get scarce
rainfall.
2. A monsoon trough is a trough of low pressure that leads to
unpredictability of the monsoon in different regions, resulting in uneven
rainfall. The monsoon trough is also known as the Intertropical
Convergence Zone. The trough and its axis keep on moving, northward or
southward, which determines the rainfall distribution over a certain region.
The plains receive rainfall when the trough lies over this region. When the
axis shifts towards the Himalayas, the plains become dry and widespread
rainfall occurs in the mountainous region.
3. Distance from the Sea: Places near the coastal region get more rainfall
than the places situated far away from the sea. Cyclonic depressions,
which originate over the sea and generally cross the coastal areas, cause
heavy and widespread rain in those areas.
4. Geographical reasons: Due to latitudinal extent, the coastal areas,
tropical regions, and plains receive more rainfall than the plateau and
deserts.
5. The rate of evaporation also affects the rainfall to a large extent. The more
water vapour evaporates, the more precipitation there will be.
6. Movement of Clouds – Clouds affect both weather and climate as they are
the key elements of the water cycle. The precipitation conditions over a
particular area depend upon the type and amount of clouds because clouds
carry moisture from one place to another.
7. Global Warming – Due to global warming, as temperatures rise and the
air becomes warmer, more water vapour evaporates from land and water
into the atmosphere. More moisture in the air generally means more
precipitation as rain and snow, and heavier downpours. The melting
of polar ice caps is also a significant result of global warming that
increases the sea level. This plays a vital role in bringing floods.

Uneven Rainfall Effects: Due to the several causes of uneven rainfall discussed
above, we notice many undesirable effects. Though rainfall is a boon to us, if it
occurs unevenly it may lead to adverse effects, as discussed below:

✓ Drought
✓ Flood
✓ Impact on agriculture
✓ Landslide

Satellites Measuring Rainfall: Earth-observing satellites can provide frequent


estimates of precipitation at a global scale. To do this, satellites carry instruments
designed to observe specific atmospheric characteristics such as cloud temperatures
and precipitation particles, or hydrometeors. These data are extremely useful for
filling in data gaps that exist between rain gauge and ground-based radar sites and
offer insights into when, where, and how much precipitation is falling worldwide.
Satellite data also provide a unique vantage point. While ground-based instruments
can directly measure or estimate how much precipitation falls to the ground, satellite
instruments estimate the amount of electromagnetic radiation (or energy) that is
emitted or reflected either from the tops of the clouds or from the rain droplets
themselves, providing a top-down view. Spaceborne radar instruments can even
observe the three-dimensional structure of precipitation. Such satellite observations
are detailed enough to allow scientists to distinguish between rain, snow, and other
precipitation types, as well as observe the structure, intensity, and dynamics of storms.

TRMM: The Tropical Rainfall Measuring Mission (TRMM), a joint mission
between NASA and the Japan Aerospace Exploration Agency (JAXA), was launched
in 1997. TRMM measured heavy to moderate rainfall over tropical and subtropical
regions for over 17 years, until the mission ended in April 2015. Measurements from
TRMM advanced our understanding of tropical rainfall, particularly over the ocean,
and provided three-dimensional images of storm intensity and structure from space
using the first spaceborne precipitation radar.

Image showing TRMM's Precipitation Radar (PR) and TRMM Microwave Imager
(TMI) instruments resolving intensifying thunderstorms near Tropical Cyclone
Magda's eyewall off the northwest coast of Australia on January 21, 2010.

The Tropical Rainfall Measuring Mission (TRMM), launched by NASA and JAXA in
1997, used both active and passive microwave instruments to measure rainfall in the
tropics. It also provided a foundation for merging rainfall information from other
satellites. TRMM showed the importance of taking observations from a
non-Sun-synchronous orbit at different times of day, between the fixed local
overpass times of polar-orbiting sensors, to improve near-real-time monitoring of
hurricanes and accurate estimates of rainfall accumulation over time. The GPM Core
Observatory continues this sampling from a non-Sun-synchronous orbit and extends
coverage to higher latitudes to provide a near-global view of precipitation.

GPM:
TRMM’s successor is another joint NASA-JAXA mission called the Global
Precipitation Measurement (GPM) Core Observatory, launched on February 28, 2014
from the Tanegashima Space Center, in Japan. The Core Observatory carries two
instruments—the Dual-frequency Precipitation Radar (DPR) and GPM Microwave
Imager (GMI)—collecting observations that allow scientists to dissect storms. Like a
diagnostic CAT scan, the DPR provides a three-dimensional profile that shows the
intensities of liquid and solid precipitation. The GMI provides a two-dimensional
view to look in depth at light rain to heavy rain and falling snow—like an X-ray. The
Core Observatory is part of an international constellation of satellites that together
provide global observations of precipitation from space, known collectively as the
GPM mission. Together, the constellation observes rain, snow, and other
precipitation worldwide every three hours.

This diagram illustrates the dimensions covered by the GPM Core Observatory's
Dual-frequency Precipitation Radar (DPR) and GPM Microwave Imager (GMI)
instruments.

The GPM Core Observatory design is an extension of TRMM’s highly successful
rain-sensing package, which focused primarily on heavy to moderate rain over
tropical and subtropical oceans. Since light rain and falling snow account for
significant fractions of precipitation occurrences in middle and high latitudes, a key
advancement of GPM over TRMM is the extended capability to measure light rain
(<0.5 mm/hr), solid precipitation, and the microphysical properties of precipitating
particles. This capability drives the designs of both the active and passive microwave
instruments on GPM. The Core Observatory acts as a reference standard for the
precipitation estimates acquired by the GPM constellation of sensors.

GPM Core Observatory Satellite


The GPM Core Observatory carries the first space-borne Ku/Ka-band Dual-frequency
Precipitation Radar (DPR) and a multi-channel GPM Microwave Imager (GMI). The
DPR instrument consists of a Ka-band precipitation radar (KaPR) operating at 35.5
GHz and a Ku-band precipitation radar (KuPR) operating at 13.6 GHz. DPR provides
three-dimensional measurements of precipitation structure and characteristics. The
DPR originally collected data over swaths of 78 and 152 miles (125 and 245 km) for
the Ka- and Ku-band radars respectively, but since May 2018 the swath
covers 152 miles (245 km) for both radars. Relative to the TRMM precipitation radar,
the DPR is more sensitive to light rain rates and snowfall. In addition, simultaneous
measurements by the overlapping of Ka/Ku-bands of the DPR can provide new
information on particle drop size distributions over moderate precipitation intensities.

6.1: DROUGHT: Drought is a slow-onset natural hazard, mostly related to a decline
in the amount of rainfall in an area over a period of time (a couple of months, a
season, or even years).
Somalia, a country in East Africa in the region referred to as the Horn of Africa, has
experienced civil war, conflict, and unrest since 1991. The prolonged conflict
magnifies the impact of droughts in a country that is already chronically prone to
drought. This study focused on assessing and monitoring drought in Somalia during
the period 2000–2011, especially the drought of 2010–2011, which became a
disaster. MODIS NDVI satellite data were the main data used, in combination with
rainfall data from the Climatic Research Unit (CRU) 3.1 and Rainfall Estimation
(RE) 2.0 datasets. The analysis was done at pixel scale at seven locations as well as
at regional scale. The most important result is that, among all drought years in the
study period, the drought of 2010–11 was the most severe in duration (more than a
year), intensity (30%–50% negative change), and extent, covering most of southwest
Somalia and the north and northeast of Kenya.
Remotely sensed drought indices include vegetation indices based on vegetation
reflectance, such as the Normalized Difference Vegetation Index (NDVI). NDVI has
been studied intensively and is considered the most popular vegetation index. In the
current study, NDVI was used because of its availability and relative ease of use; it
is available through websites from different sensors, e.g., NOAA AVHRR since 1981
and MODIS since 2000. It is the normalized difference of the reflectance between the
near-infrared (NIR) and visible red channels. Healthy vegetation absorbs strongly in
the visible red band and reflects strongly in the NIR band, in the so-called
Photosynthetically Active Radiation (PAR) region of the electromagnetic spectrum.
NDVI measures changes in chlorophyll content. It has the following formula:
NDVI = (NIR - RED) / (NIR + RED)

where NIR and RED are the spectral reflectances in the near-infrared channel and the
visible red channel, respectively, of the satellite sensor. NDVI values range from -1
to 1 and represent the greenness of the vegetation canopy; vegetation generally takes
values from 0.1 to 1, while water, cloud, snow, and other land covers tend to take
lower or negative values. Higher NDVI values close to 1 reflect healthy or denser
vegetation, while lower values close to 0.1 indicate stressed or sparse vegetation
(less greenness).
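The NDVI formula above can be computed per pixel with NumPy. The reflectance values below are hypothetical, chosen only to illustrate how vegetation, stressed vegetation, and water separate along the index; they are not from a real scene.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel.

    nir, red: arrays of reflectance in the near-infrared and visible
    red bands; eps guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Illustrative reflectances (hypothetical): healthy vegetation
# reflects strongly in NIR and absorbs red; water does the opposite.
nir = np.array([0.50, 0.30, 0.10])   # vegetation, stressed veg, water
red = np.array([0.08, 0.15, 0.15])
print(np.round(ndvi(nir, red), 2))   # vegetation near 0.7, water negative
```

Applied over a whole image, the same one-line expression yields a greenness map whose pixel values can be compared across dates to detect drought stress.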
MODIS NDVI:
The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor flies onboard
NASA's Terra and Aqua satellites, launched on December 18, 1999 and May 4, 2002,
respectively (NASA, 2011). Terra and Aqua are sun-synchronous satellites in polar
orbits that pass over the Earth from north to south at a 98° inclination and cross the
equator at the same local time each day, in the morning for Terra and in the afternoon
for Aqua. MODIS is considered a valuable resource for monitoring ecosystems and
the environment (LP DAAC, 2010b). It provides images of the whole Earth in 36
spectral bands between 0.405 and 14.385 µm every one to two days, with a viewing
swath width of 2,330 km (NASA, 2011). MODIS on both Terra and Aqua provides a
series of global products (land, atmosphere, and ocean) at spatial resolutions of 250,
500, 1000, and 5600 m in HDF-EOS (Hierarchical Data Format-Earth Observing
System) format.
The MCD43C4 product was used to study drought in East Africa. The MODIS
reflectance product MCD43C4 provides reflectance data adjusted using a
bidirectional reflectance distribution function (BRDF) to model the values as if they
were taken from a nadir view. The MCD43C4 product contains 16 days of data
provided as a level-3 data set projected to a 0.05° (5600 m) latitude/longitude
Climate Modeling Grid (CMG).

Vegetation NDVI pattern in the Horn of Africa for 25 Nov 2001 (left) and for the
corresponding date in 2010 (right). Images produced from the NASA MODIS
MCD43C4 product (LP DAAC, 2010b).
The above figure shows the MODIS NDVI (MCD43C4) product for two different
dates over the Horn of Africa and Somalia. Image A is from November 25, 2002,
during the peak of the second rainy season, and B is from the corresponding date in
2010. In the 2010 image, greenness is lower in southwest Somalia and eastern Kenya.

Spatial variations of VHI (Vegetation Health Index) in Karawang and Subang,
Indonesia
Drought in Karawang and Subang, situated in the northern part of West Java, was
identified from a long-term sequence of the 2000, 2005, 2010, and 2015 dry seasons.
Landsat data were used in the present study to monitor drought extent in the
agricultural fields of Subang and Karawang. Stacked NDVI layers were produced to
identify differences in vegetation health across the dry seasons of 2000, 2005, 2010,
and 2015.
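The study above does not spell out its VHI formulation, but a common definition (due to Kogan) blends a Vegetation Condition Index (VCI) derived from NDVI with a Temperature Condition Index (TCI) derived from brightness temperature. The sketch below assumes that formulation with the usual equal weighting; all pixel values are illustrative.

```python
import numpy as np

def vci(ndvi, ndvi_min, ndvi_max):
    # Vegetation Condition Index: NDVI scaled by its multi-year
    # min/max for the same pixel and period (0 = worst, 100 = best).
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt, bt_min, bt_max):
    # Temperature Condition Index: inverted, since a hotter-than-usual
    # land surface indicates vegetation stress.
    return 100.0 * (bt_max - bt) / (bt_max - bt_min)

def vhi(vci_val, tci_val, alpha=0.5):
    # Vegetation Health Index: weighted blend; alpha = 0.5 is the
    # common default when the relative contributions are unknown.
    return alpha * vci_val + (1.0 - alpha) * tci_val

# Hypothetical pixel: NDVI near its historical minimum in a hot year.
v = vci(0.25, ndvi_min=0.20, ndvi_max=0.70)   # 10.0
t = tci(310.0, bt_min=295.0, bt_max=315.0)    # 25.0
print(vhi(v, t))  # low VHI -> drought-stressed pixel
```

Low VHI values (commonly below about 40) are read as drought stress, which is how the dry-season maps for Karawang and Subang are interpreted.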

Drought in Lake Oroville in 2019 and 2021, captured by the Copernicus satellite

Lake Oroville, California’s second largest reservoir, is only 35 percent full. A new
satellite image from the Copernicus satellite, obtained by Maxar Technologies,
shows just how low the lake’s water levels are. Lake Oroville is located just over
60 miles north of Sacramento, the state capital.
A significant amount of lake bed has become exposed due to the dramatic drop
in water level. Boats still dot the lake in the Bidwell Canyon area, but many of the
existing boat ramps are dry. Water levels at the lake are so low that the hydroelectric
plant at the Oroville Dam, located at the southwest corner of the lake, is expected to
shut down.

6.2:FLOODS:
A Case Study of Kerala Floods 2018: Satellite remote sensing has made a substantial
contribution to flood monitoring and damage assessment. The unique ability of
satellites to provide comprehensive, synoptic, and multi-temporal coverage of
very large areas at regular intervals and with a quick turnaround time is very
valuable for monitoring and managing flood dynamics. The microwave region of
the EM spectrum has greater potential for identifying flooded areas because of its
almost all-weather capability. In this study, multi-temporal microwave SAR data
from the RADARSAT and SENTINEL-1 satellites were used to map the flood-affected
areas in Kerala and to capture the dynamics of the spatial flood extent.

The analysis showed maximum flood inundation during the second flood event in
August, while a steady recession was observed in the July flood event. Peak flooding
was observed on 18 August 2018. There was a rapid progression of the flood from
11 to 18 August 2018, after which a slight recession was observed from 18 to 21
August 2018. The flood progressed slightly in the next interval, 21 to 24 August
2018, after which an overall recession in the spatial flood extent could be observed
by 27 August 2018. Kottayam, Pathanamthitta, and Alappuzha were among the
worst-affected districts. Alappuzha was the worst-affected district, with a maximum
flood-inundated area of 27,691 hectares on 18 August 2018. In Alappuzha, a steady
progression of the flood was observed after 21 August, whereas in the other two
districts, Kottayam and Pathanamthitta, the flood was found to be receding after
24 August 2018. This study demonstrated the potential of multi-temporal satellite
imagery for capturing flood dynamics in the spatial domain. The progression and
recession maps can be used for relief management by disaster managers on the
ground.

An optical mosaic image prepared by the National Remote Sensing Centre,
Hyderabad, was used as the master image for Kerala for 2018. Microwave satellite
datasets were used for the complete mapping and analysis of the flood in the study
area. Six multi-temporal synthetic aperture radar (SAR) datasets were used in the
study; two of the images are from Sentinel-1 SAR and the remaining four are from
Radarsat-2 SAR.
RADARSAT-2 carries an advanced C-band (5.405 GHz), HH-, HV-, and VV-polarized
SAR with a steerable radar beam. Imaging swaths vary from 20 km (Ultra-Fine) to
500 km (ScanSAR Wide), and resolutions vary from 3 to 100 meters. Sentinel-1
carries a C-SAR sensor, which offers medium- and high-resolution imaging with a
resolution class of 4 m to 80 m and a swath width of 80 km to 400 km, depending on
the operating mode, in all weather conditions. The C-SAR is capable of acquiring
night-time imagery and detecting small movements on the ground, which makes it
useful for land and sea monitoring.
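The SAR flood maps described above exploit the fact that smooth open water reflects the radar pulse away from the sensor and therefore appears dark. A minimal change-detection sketch is shown below; the backscatter values and the threshold are hypothetical, and the study's actual processing chain is not reproduced here.

```python
import numpy as np

# Hypothetical sigma-0 backscatter (dB) for the same pixels in a
# pre-flood and a during-flood SAR acquisition. Open water scatters
# the radar pulse away from the sensor, so flooded pixels go dark.
pre  = np.array([-8.0, -7.5, -9.0, -16.5])
post = np.array([-8.2, -17.0, -18.5, -16.0])

WATER_DB = -15.0  # assumed threshold; tuned per scene in practice

water_pre  = pre  < WATER_DB        # permanent water bodies
water_post = post < WATER_DB        # all water during the flood
flooded = water_post & ~water_pre   # newly inundated pixels only

print(flooded)  # [False  True  True False]
```

Repeating this comparison for each acquisition date yields the progression and recession maps described in the study.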

Pre-flood image, 8 February 2018, from the Landsat 8 OLI sensor

During-flood image, 22 August 2018, from Sentinel-2

Chapter 7 : Tsunami

The phenomenon we call a tsunami is a series of large waves of extremely long
wavelength and period, usually generated by a violent, impulsive undersea
disturbance or activity near the coast or in the ocean. When a sudden displacement
of a large volume of water occurs, or if the seafloor is suddenly raised or dropped by
an earthquake, large tsunami waves can form. The waves travel out of the area of
origin and can be extremely dangerous and damaging when they reach the shore.

The word tsunami (pronounced tsoo-nah'-mee) is composed of the Japanese words
"tsu" (meaning harbor) and "nami" (meaning wave). The terms "seismic sea wave"
or "tidal wave" are often used to describe the same phenomenon; however, these
terms are misleading, because tsunami waves can be generated by other, non-seismic
disturbances such as volcanic eruptions or underwater landslides, and they have
physical characteristics different from tidal waves. Tsunami waves are completely
unrelated to the astronomical tides, which are caused by the gravitational influences
of the moon, sun, and planets. Thus, the Japanese word "tsunami", meaning "harbor
wave", is the correct, official, and all-inclusive term. It has been internationally
adopted because it covers all forms of impulsive wave generation.

Tsunami waves often look like walls of water and can attack the shoreline and be
dangerous for hours, with waves coming every 5 to 60 minutes. The first wave may not
be the largest, and often it is the 2nd, 3rd, 4th or even later waves that are the biggest.
After one wave floods inland, it recedes seaward, often as far as a person can see,
exposing the seafloor. The next wave then rushes ashore within minutes, carrying
with it floating debris from objects destroyed by previous waves. When waves enter
harbors, very strong and dangerous currents are generated that can easily break ship
moorings, and bores that travel far inland can form when tsunamis enter rivers or
other waterway channels.

Causes of Tsunami :

There are different reasons for the occurrence of tsunamis. The four major causes
are listed below:

1. Earthquake :

Tsunamis can be generated by movements along fault zones associated with plate
boundaries. The region where two plates come into contact is a plate boundary, and
the way in which one plate moves relative to another determines the type of boundary:

1. spreading, where two plates move away from each other;
2. subduction, where two plates move towards each other and one slides beneath
the other;
3. transform, where two plates slide horizontally past each other.
Most strong earthquakes occur in subduction zones where an oceanic plate slides
under a continental plate or another younger ocean plate.

Not all earthquakes cause tsunamis. Four conditions are necessary for an earthquake
to cause a tsunami:

1. The earthquake must occur beneath the ocean or cause the material to slide
into the ocean.
2. The earthquake must be strong, at least magnitude 6.5 on the Richter scale.
3. The earthquake must rupture the Earth’s surface and it must occur at shallow
depth – less than 70 km below the surface of the Earth.
4. The earthquake must cause vertical movement of the seafloor (up to several
meters).

2. Landslides :

A landslide that occurs along the coast can force large amounts of water into the sea,
disturbing the water and generating a tsunami. Underwater landslides can also
produce tsunamis when the material loosened by the landslide moves violently,
pushing the water in front of it.

3. Volcanic Eruptions :

Although relatively infrequent, violent volcanic eruptions also represent impulsive
disturbances that can displace a great volume of water and generate extremely
destructive tsunami waves in the immediate source area. In this mechanism, waves
may be generated by the sudden displacement of water caused by a volcanic
explosion, by the failure of a volcano's slope, or, more likely, by a phreatomagmatic
explosion and the collapse or engulfment of the volcano's magmatic chambers.

One of the largest and most destructive tsunamis ever recorded was generated on
August 26, 1883, after the explosion and collapse of the volcano Krakatoa
(Krakatau) in Indonesia. The explosion generated waves that reached 135 feet,
destroyed coastal towns and villages along the Sunda Strait on both the islands of
Java and Sumatra, and killed 36,417 people.

4. Extraterrestrial Collision :

Tsunamis caused by extraterrestrial collisions (i.e., asteroids or meteors) are an
extremely rare occurrence. Although no meteor- or asteroid-induced tsunami has
been recorded in recent history, scientists realize that if such a celestial body were to
strike the ocean, a large volume of water would undoubtedly be displaced, causing a
tsunami. Scientists have calculated that if a moderately large asteroid, 5-6 km in
diameter, were to strike the middle of a large ocean basin such as the Atlantic Ocean,
it would produce a tsunami that would travel all the way to the Appalachian
Mountains in the upper two-thirds of the United States.

On both sides of the Atlantic, coastal cities would be washed out by such a tsunami.
An asteroid 5-6 kilometers in diameter impacting between the Hawaiian Islands and
the West Coast of North America, would produce a tsunami that would wash out the
coastal cities on the West coasts of Canada, the U.S., and Mexico and would cover
most of the inhabited coastal areas of the Hawaiian islands.

Effects of Tsunami :

The effects of a tsunami on a coastline can range from unnoticeable to devastating.


The effects of a tsunami depend on the characteristics of the seismic event that
generated it, the distance from its point of origin, its size (magnitude) and, finally,
the configuration of the bathymetry (that is, the depth of water in the ocean) along
the coast that the tsunami is approaching.
Small tsunamis, non-destructive and undetectable without specialized equipment,
happen almost every day as a result of minor earthquakes and other events. They are
very often too far away from land or they are too small to have any effect when they
hit the shore. When a small tsunami comes to the shoreline it is often seen as a strong
and fast-moving tide.
However, when tsunami waves become extremely large in height, they savagely
attack coastlines, causing devastating property damage and loss of life. A small wave
only 30 centimeters high in the deep ocean may grow into a monster wave 30m high
as it sweeps over the shore. The effects can be further amplified where a bay or
lagoon funnels the waves as they move inland. Large tsunamis have been known to
rise to over 100 feet!

1. Destruction :
The amount of energy and water contained in a huge tsunami can cause extreme
destruction when it strikes land.
The initial wave of a huge tsunami is extremely tall; however, most damage is not
sustained by this wave. Most of the damage is caused by the huge mass of water
behind the initial wave front, as the height of the sea keeps rising fast and floods
powerfully into the coastal area. It is the power behind the waves, the endless rushing
water that causes devastation and loss of life. When the giant breaking waves of a
tsunami batter the shoreline, they can destroy everything in their path.

2. Death :
One of the biggest and worst effects of a tsunami is the cost in human life, because
unfortunately escaping a tsunami is nearly impossible. Hundreds of thousands of
people have been killed by tsunamis; since 1850 alone, tsunamis have been
responsible for the loss of more than 430,000 lives. There is very little warning
before a tsunami hits land, and as the water rushes toward land it leaves very little
time to plan an escape. People living in coastal regions, towns, and villages have no
time to flee. The violent force of the tsunami results in instant death, most commonly
by drowning. Collapsing buildings, electrocution, and explosions from gas, damaged
tanks, and floating debris are other causes of death. The tsunami of December 2004
that struck South East Asia and East Africa killed over 31,000 people in Sri Lanka
alone, leaving 23,000 injured.

3. Disease :
Tsunami waves and the receding water are very destructive to structures in the run-up
zone. The areas close to the coast are flooded with sea water, damaging the
infrastructure such as sewage and fresh water supplies for drinking.
Flooding and contamination of drinking water can cause disease to spread in the
tsunami hit areas. Illnesses such as malaria arise when water is stagnant and
contaminated. Under these conditions it is difficult for people to stay healthy and for
diseases to be treated, so infections and illnesses can spread very quickly, causing
more death.

4. Environmental Impact :
Tsunamis not only destroy human life, but have a devastating effect on insects,
animals, plants, and natural resources. A tsunami changes the landscape. It uproots
trees and plants and destroys animal habitats such as nesting sites for birds. Land
animals are killed by drowning and sea animals are killed by pollution if dangerous
chemicals are washed away into the sea, thus poisoning the marine life.
The impact of a tsunami on the environment relates not only to the landscape and
animal life, but also to the man-made aspects of the environment. Solid waste and
disaster debris are the most critical environmental problems faced by a tsunami-hit
country.
Recycling and disposal of this waste in an environmentally sensitive manner where
possible (crushing concrete, bricks, etc. to produce aggregate for rebuilding and road
reconstruction) are critical.

5. Cost :
Massive costs hit communities and nations when a tsunami happens. Victims and
survivors of the tsunami need immediate help from rescue teams.

Governments around the world may help with the cost of bringing aid to devastated
areas. National institutions, the United Nations, other international organizations,
community groups and NGOs , and a variety of other entities come together to
provide different kinds of aid and services. There might also be appeals and donations
from people who have seen pictures of the area in the media.
Reconstruction and clean up after a tsunami is a huge cost problem. Infrastructure
must be replaced, unsafe buildings demolished and rubbish cleared. Loss of income in
the local economy and future losses from the destruction of infrastructure will be a
problem for some time to come.
The total financial cost of the tsunami could be millions or even billions of dollars of
damage to coastal structures and habitats. It is difficult to put an exact figure on the
monetary cost but the cost may represent an important share of a nation's GDP.

6. Psychological Effect :
Victims of tsunami events often suffer psychological problems which can last for
days, years or an entire lifetime. Survivors of the Sri Lankan tsunami of December
2004 were found to have PTSD (post traumatic stress disorder) when examined by the
World Health Organization (WHO): 14% to 39% of these were children, 40% of
adolescents and 20% of mothers of these adolescents were found to have PTSD 4
months after the tsunami.
These people were suffering from grief and depression as their homes, businesses and
loved ones were taken from them. Many still had PTSD. Perilya Village counts 2,000
dead and 400 families became homeless. These people were found to still have
psychological problems 2 years after the tsunami

Assessment and Prediction of Natural Hazards from Satellite Imagery

Since 2000, there have been a number of spaceborne satellites that have changed the
way we assess and predict natural hazards. These satellites are able to quantify
physical geographic phenomena associated with the movements of the earth’s surface
(earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires).
Most of these satellites contain active or passive sensors that can be utilized by the
scientific community for the remote sensing of natural hazards over a number of
spatial and temporal scales. The most useful satellite imagery for the assessment of
earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive
sensors and moderate resolution active sensors that can quantify the vertical and
horizontal movement of the earth’s surface. High-resolution passive sensors have
been used to successfully assess flood damage while predictive maps of flood
vulnerability areas are possible based on physical variables collected from passive and
active sensors. Recent moderate resolution sensors are able to provide near real time
data on fires and provide quantitative data used in fire behavior models. Limitations
currently exist due to atmospheric interference, pixel resolution, and revisit times.
However, a number of new microsatellites and constellations of satellites will be
launched in the next five years that contain increased resolution (0.5 m to 1 m pixel
resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from
passive sensors) that will significantly improve our ability to assess and predict
natural hazards from space.

Recent Satellite and Sensors :

There have been over 73 successful launches of earth observing satellites between
2000 and 2006. Most of these satellites contain passive or active sensors that can be
utilized by the scientific community for the remote sensing of natural hazards. Passive
sensors record reflected (visible and infrared wavelengths) and emitted energy
(thermal wavelengths). Since the year 2000, a number of new high-resolution passive
sensors that provide panchromatic, visible, near infrared, and thermal imagery have
become operational. Panchromatic imagery contains the entire wavelength of visible
light and provides a black and white image similar to a black and white photograph.
Currently, spaceborne satellites such as IKONOS 2, EROS, QuickBird 2, SPOT 5,
and OrbView 3 can provide panchromatic imagery with 0.6 m to 2.5 m pixel size and
have repeat times of 3 to 14 days. These satellites can also provide multi-spectral
imagery using individual bands in the visible (blue, green, red) and near infrared
wavelengths to provide true color and infrared imagery. The resolution of this multi-
spectral imagery ranges from 2.5 m to 10 m pixel size and has repeat times of 3 to 16
days. Shortwave infrared and thermal imagery provide information on temperature
and sensors such as ASTER and MODIS have a number of bands that collect this
information. Most shortwave infrared and thermal imagery have relatively large pixel
sizes of 30 m to 1 km but higher temporal resolution with repeat times of one day.
Finally, the Defense Meteorological Satellite Program-Operational Linescan System

(DMSP-OLS) and MODIS can provide daily panchromatic nighttime light imagery
(500 m) and thermal imagery (1000 m) at a global spatial scale.

Recent satellites with passive sensors that provide panchromatic (Pan.), visible and
near infrared (VNIR), short wave infrared (SWIR), and thermal data that can be used
in remote sensing of natural hazards.

Active sensors send and receive an energy pulse to create an image. Radar (RAdio
Detection And Ranging) is the most commonly used active remote sensing technique
and can provide moderate resolution imagery and digital elevation models of the
earth’s surface. Unlike passive sensors, radar can penetrate cloud cover, providing
imagery over a target area both day and night regardless of weather conditions. There
are a number of satellites equipped with radar sensors that have been used in the
remote sensing of natural hazards. The pixel size from radar sensors (10 m to 30 m) is
significantly larger than data provided by passive sensors. Radar sensors measure
microwave backscatter within discrete wavelengths or bands such as the X, C, and L
bands. The shortest X (3 cm) and C (5.6 cm) bands reflect off the top of objects on the
earth while longer wavelengths, such as the L band (24 cm), are able to penetrate into
the vegetation and substrate. Recent satellite missions such as ERS-2, SRTM, and
Envisat have provided high-resolution topography imagery using interferometric
synthetic aperture radar (SAR) techniques. Interferometric synthetic aperture radar
sends out a microwave pulse and records the reflected pulse at two points separated
by a baseline distance. The resulting parallax can be used to measure elevation with a
high degree of accuracy. These active systems are able to survey large areas due to
their larger pixel and swath sizes, but have long repeat times of 24 to 46 days.
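Beyond topographic mapping by parallax, interferometric SAR is widely used to measure surface motion: a change in interferometric phase maps to line-of-sight displacement through the standard relation d = φ·λ/(4π), where one full fringe (2π of phase) corresponds to half a wavelength of motion. A minimal numeric sketch, assuming a C-band wavelength of about 5.6 cm:

```python
import math

def los_displacement(phase_rad, wavelength_m):
    # One interferometric fringe (2*pi of phase change) corresponds
    # to half a wavelength of motion along the radar line of sight.
    return phase_rad * wavelength_m / (4.0 * math.pi)

C_BAND = 0.056  # ~5.6 cm wavelength, e.g. ERS/Envisat-class sensors

# One full fringe of phase change:
d = los_displacement(2.0 * math.pi, C_BAND)
print(round(d * 100, 1), "cm")  # 2.8 cm, i.e. half the wavelength
```

This centimeter-scale sensitivity is what makes InSAR useful for quantifying the vertical and horizontal ground movements associated with earthquakes.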

Natural Disaster Identification and Mapping of Tsunami in Indonesia
using Satellite Imagery Analysis (Case Study: Aceh )

Remote sensing technology, especially using satellite images, has become an
essential support in many aspects of decision-making, particularly in disaster risk
management. It requires a shorter data-update period and less cost than conventional
field observations and surveys. Yet intensive processing and high-powered
computing resources are necessary to analyze satellite imagery data through a
Geographic Information System (GIS). In this paper, we introduce the identification
and mapping of natural disaster impact in Indonesia using the open-source
collaborative tool of Google Earth Engine (GEE) application which analyzes the
relative temporal difference of Earth surface from three major satellite images:
Sentinel-1, Sentinel-2, and Landsat 8. Taking advantage of the geographical,
geological, and demographic conditions of Indonesia's disaster-prone areas, we
analyze relative differences in the normalized difference vegetation index (NDVI)
between months before and after a natural disaster occurrence to measure the
disaster's impact in the focus study areas. Given the highly vegetated nature of the
three main disaster-impacted areas in Indonesia (Aceh, Palu, and Yogyakarta), we
are able to simplify the analysis by highlighting areas with vegetative loss or gain
after the event.
event. Using an open-source GEE application, namely HazMapper, we identify and
visualize the aftermath of the tsunami disaster in Aceh and Palu as well as the
earthquake in Yogyakarta. Our study is potentially beneficial for government and
decision-makers to utilize publicly available satellite images for disaster recovery and
mitigation policy.
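HazMapper visualizes a relative difference in NDVI (rdNDVI) between pre- and post-event composites; its exact formulation is not reproduced here. The sketch below uses a plain percent change relative to the pre-event NDVI, with a small floor on the denominator for sparse-vegetation pixels; both of those choices, and the pixel values, are assumptions for illustration.

```python
import numpy as np

def rd_ndvi(ndvi_pre, ndvi_post):
    # Percent change in NDVI relative to the pre-event value;
    # strongly negative values flag vegetation loss. The floor keeps
    # the denominator away from zero over bare or water pixels.
    pre = np.asarray(ndvi_pre, dtype=float)
    post = np.asarray(ndvi_post, dtype=float)
    return 100.0 * (post - pre) / np.maximum(np.abs(pre), 0.05)

# Hypothetical pixels: dense vegetation stripped by the wave,
# vegetation left intact, and bare ground turned to standing water.
pre  = np.array([0.70, 0.65, 0.15])
post = np.array([0.10, 0.63, -0.10])
print(np.round(rd_ndvi(pre, post), 1))
```

Mapping these values over a scene highlights the swath of vegetation loss along the coast, which is how the tsunami's inland reach is delineated in the Aceh case study.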

Tsunami Disaster in Aceh

The tsunami generated by a powerful earthquake in the Indian Ocean in 2004 caused
massive damage to the coastal regions, including severe damage to the ecosystem,
destruction of housing and other buildings, lakes and farmlands, infrastructure, and
economic activity, environmental damage to waterbodies, and large volumes of
tsunami debris (Rosyidie, 2006). According to the Meteorology, Climatology, and
Geophysical Agency of Indonesia (BMKG), the devastating earthquake, measuring
9.1 on the Richter scale with its epicenter 25 km northwest of Aceh, was triggered by
a tectonic cause, followed by the emergence of a series of massive aftershocks along
the Andaman-Nicobar megathrust line after the main earthquake. In addition to the
coastal areas, rdNDVI captured vegetation loss along the city of Banda Aceh. The
rdNDVI results represent severe damage caused by tsunami waves and floodwater
reaching miles inland from the shore.

Indian Ocean Tsunami 2004

About two hours after the initial magnitude 9 earthquake shook the seafloor southwest
of the island of Sumatra, the joint US/French Jason satellite was flying over the
Indian Ocean taking measurements of the height of the sea surface for ocean
circulation and climate studies.

Along the satellite's ground track, traversing the Bay of Bengal, Jason's radar
altimeter measured the height of the tsunami waves as they radiated from the
epicenter. A maximum sea surface elevation of 50 cm was measured about 1200 km
south of Sri Lanka at the leading crest of a tsunami wave, followed by a trough of
about 40 cm in the sea surface level. The wavelength was about 800 km.

This leading wave was followed by a second one with a crest height of 40 cm. Near
the northern end of the Bay, two waves with crest heights of 40 cm and 20 cm were
approaching the coasts of Myanmar. When the waves arrived at the beaches, the
wave speed dropped from that of a jet plane to around 15 km/h, with tremendous
amplification of the crest height to around 10 m, resulting in massive local
destruction.
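The slowdown from jet-plane speed in the open ocean to about 15 km/h at the beach follows from the shallow-water phase speed c = sqrt(g * h). A quick sketch of this back-of-envelope calculation (depth values are illustrative):

```python
import math

def tsunami_speed_kmh(depth_m, g=9.81):
    # shallow-water phase speed c = sqrt(g * h), converted m/s -> km/h
    return math.sqrt(g * depth_m) * 3.6

def depth_for_speed(speed_kmh, g=9.81):
    # invert c = sqrt(g * h) to get the water depth for a given speed
    c = speed_kmh / 3.6
    return c * c / g

open_ocean = tsunami_speed_kmh(4000)   # ~713 km/h over a 4 km deep basin
near_shore = depth_for_speed(15)       # ~1.8 m of water at 15 km/h
```

As the wave slows, energy flux is conserved, which is why the crest height grows so dramatically near the shore.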

TOPEX/POSEIDON, flying simultaneously along a track about 150 km to the west of
Jason, made similar observations of the tsunami waves. The agreement of the two
independent observations lends confidence to the measurements.

The observations made by Jason and TOPEX/POSEIDON provide the first large-scale
open ocean data of a major tsunami event. These data are of great value for testing
and improving computer models of tsunami, which are critical for early warning
systems. However, it should be pointed out that, while these observations happened to
be in the right place at the right time, this instrument is not suitable for use as an
operational warning system because of the infrequent coverage from a nadir-viewing
instrument on a polar-orbiting satellite.

It should be noted that the sea-level height anomalies caused by a tsunami cannot
be detected in NOAA AVHRR or Meteosat imagery. With a sub-satellite sampling
distance of 1 km, MSG (Meteosat Second Generation) is also not able to detect the
inundated areas along the coast of eastern Africa. The high-resolution visible images
from 25 and 27 December 2004 show no changes to the coastlines of Somalia and
Kenya. It might be possible to detect inundated areas in Meteosat-5 images of
northern Sumatra, where the inundation covered a larger part of the coastline, but
Meteosat-5 has a poorer horizontal resolution.

Chapter 8: Change in Forest Cover

Assessment of forest cover using satellite data on a two-year cycle has been one of the
most important activities of FSI since 1986. The present assessment is the 9th
assessment in this series. Forest cover is defined as an area more than 1 ha in extent
and having tree canopy density of 10 percent and above. This definition is based on
the resolution of digital satellite data (pixel size 23.5m x 23.5m), scale of
interpretation (1:50,000), and the technique employed for image processing. No
distinction with respect to the type of tree crops (natural or man-made) or tree species
has been attempted, since robust techniques are not available for making such a
distinction. Moreover, no cognizance of the type of land ownership, land use, or
legal status of land was taken, as georeferenced maps depicting such information were
neither available nor possible to collect at the country level. Thus, all species of trees
(including bamboos, fruits or palms, etc.) and all types of lands (forest, private,
community or institutional) satisfying the basic criteria of canopy density of more
than 10 percent have been delineated as forest cover while interpreting satellite data.
The minimum area of 1 ha for forest cover has been kept because this is the smallest
area that can be delineated on a map at 1:50,000 scale.
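The 1 ha minimum mappable area corresponds to roughly 18 LISS-III pixels, a quick check using the figures above:

```python
pixel_size_m = 23.5                  # LISS-III ground resolution
pixel_area_m2 = pixel_size_m ** 2    # 552.25 m^2 per pixel
one_hectare_m2 = 10_000

pixels_per_ha = one_hectare_m2 / pixel_area_m2
# ~18 pixels: the smallest patch delineated as forest cover
```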

Satellite Data and its Period:

The present assessment is based on digital interpretation of satellite data for the entire
country. The satellite data was procured from the National Remote Sensing Agency
(NRSA), Hyderabad in digital form. For the present assessment, LISS-III sensor data
of IRS-1D satellite with a resolution of 23.5 m has been used. Data for nearly all the
states pertained to the period from October to December 2002. These are the months
when cloud cover is low and the deciduous trees still have leaves to provide
satisfactory reflectance for the satellite sensors. It may be mentioned here that one
scene of LISS III covers an area of about 20,000 km2 (140 km x 140 km). Due to
considerable overlap (15 to 20 percent) among adjacent scenes, as many as 391 scenes
are required to envelope the entire country. Also, at the border of the country or for
islands, the whole scene has to be procured through area of interest may be very small
part of the scene. While procuring data, only those scenes were selected where cloud
cover was less than 10 percent.
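A sanity check on the scene count: one 140 km x 140 km scene is 19,600 km^2 (the "about 20,000 km^2" quoted above), so with no overlap India would need far fewer than 391 scenes; the excess reflects the 15-20 percent overlap plus border and island scenes. India's land area is taken here as approximately 3,287,263 km^2, an assumed figure not stated in this report.

```python
import math

scene_area_km2 = 140 * 140       # one LISS-III scene: 19,600 km^2
india_area_km2 = 3_287_263       # assumed value for India's land area

# minimum scenes needed if scenes tiled with zero overlap;
# the 391 actually procured reflects overlap and border/island scenes
min_scenes = math.ceil(india_area_km2 / scene_area_km2)
```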

Methodology:

Using Digital Image Processing (DIP) software, the digital satellite data supplied on
CDs was loaded onto the workstation. Radiometric and contrast corrections were
applied to remove radiometric defects and to improve the visual impact of the False
Colour Composites (FCC). Geometric rectification of the data was carried out with
the help of scanned SOI toposheets. Forest cover areas were delineated based on tone
and texture. Interpretation of forest cover for the whole country was done at
1:50,000 scale using a polyconic projection. A Normalized Difference Vegetation
Index (NDVI) transformation was also used for density classification of forest cover.
Areas of less than one hectare, whether classified as forest within non-forest areas or
as blanks within forested areas, were excluded by clustering pixels and merging them
with the surrounding class.
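An NDVI-based density classification of the kind described can be sketched as a simple thresholding step; the threshold values below are purely illustrative assumptions, not FSI's calibrated class boundaries.

```python
import numpy as np

def classify_density(ndvi, thresholds=(0.2, 0.4, 0.6)):
    """Bin an NDVI raster into density classes.

    Classes: 0 = non-forest, 1 = open, 2 = moderately dense,
    3 = very dense. Thresholds are illustrative only; in practice
    class boundaries are calibrated against ground truth.
    """
    return np.digitize(ndvi, thresholds)
```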

Study of Forest Cover Change Dynamics between 2000 and 2015 in the Ikongo
District of Madagascar Using Multi-Temporal Landsat Satellite Images

Over the last two decades, remote sensing has been widely used to detect changes in
phenomena such as urban areas, land cover and forest cover. On the one hand,
thanks to the availability of satellite imagery, researchers can perform not only multi-
temporal but also time-series change studies. On the other hand, computer-based
techniques are able to determine the effects of a phenomenon in an area at different
dates. Change detection algorithms are among the techniques that can carry out this
work; their main operation is the analysis of before-and-after situations on a surface.
The forestry domain in particular requires change detection algorithms to quantify
changes.

For Madagascar, studies based on satellite image processing started in the 1980s with
a vegetation study conducted as part of a thesis. A few years later, analyses applied
to forest cover were carried out by scientific researchers with the aim of verifying
hypotheses. In general, these works fall into two categories: national works
processing imagery of the whole of Madagascar, and specific studies targeting one or
a few study areas. Most of these studies focus on the evolution or assessment of
changes in forest cover, although the forest area of the country is decreasing due to
massive exploitation of the forest zone, charcoal production and especially tavy
(slash-and-burn) cultivation.

In this study, we focus particularly on the use of the Erdas Imagine "Discriminant
function change" algorithm, because this algorithm is not only efficient but also
often used to detect change over time. Our objectives are not only to evaluate the
effectiveness of this algorithm for studying the dynamics of change but also to
quantify the deforestation of the Ikongo District of Madagascar over 15 years. We
use Landsat TM satellite images for the years 2000, 2005 and 2010, and a Landsat
ETM+ satellite image for the year 2015 of this study area. The study is organized in
four parts. Firstly, we discuss the materials and methods. Secondly, we present the
results obtained from the classifications and the change detection for the periods
2000-2005, 2005-2010 and 2010-2015. Thirdly, we discuss the results obtained.
Finally, we conclude.

In this study we use images from the Landsat 5 TM sensor and the Landsat 7
ETM+ sensor. These two satellites have remarkable characteristics in terms of the
number of spectral bands and, in the case of ETM+, the presence of a panchromatic
band.
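The before/after analysis that change detection performs can be illustrated with a simple post-classification comparison. This is a generic sketch of the technique, not the proprietary Erdas Imagine discriminant-function algorithm the study actually uses.

```python
import numpy as np

def change_matrix(class_before, class_after, n_classes):
    """Cross-tabulate per-pixel class labels from two dates.

    Entry [i, j] counts pixels that moved from class i to class j;
    off-diagonal mass is change, e.g. forest -> non-forest is loss.
    """
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (class_before.ravel(), class_after.ravel()), 1)
    return m
```

Multiplying a matrix entry by the per-pixel ground area (e.g. 30 m x 30 m for Landsat TM/ETM+) converts pixel counts into hectares of change.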

Acknowledgement

We would like to thank Almighty God for giving us luck and faith in our abilities to
complete this project. We would like to express our deep sense of gratitude to our
teachers for extending us this opportunity and providing all the necessary resources
and expertise. We would like to convey our deepest gratitude to our mentor
Prof. (Dr.) Anindya Sen for sparing his valuable time to discuss and clarify issues
connected with this project report.

References

• https://www.intechopen.com/chapters/76245
• https://www.eoportal.org/satellite-missions/grace-fo#some-project-development-status-dates
• https://earthobservatory.nasa.gov/images/148447/california-reservoirs-reflect-deepening-drought
• https://earthzine.org/climate-change-effect-on-glacier-behavior-a-case-study-from-the-himalayas-2/
• https://argo.ucsd.edu/about/
• https://ieeexplore.ieee.org/document/1526109
• Mohammed Alwesabi (2012). MODIS NDVI satellite data for assessing drought in Somalia during the period 2000-2011.
• Faheed, J. K. P., Yusuf, A., Manjusree, P., & Ansari, Z. R. (2022). Monitoring of Flood Dynamics Using Multi-Temporal Satellite Data: A Case Study of Kerala Floods 2018. European Journal of Applied Sciences, 10(5), 443-480.
• https://www.scirp.org/journal/paperinformation.aspx?paperid=111493
• https://www.geeksforgeeks.org/satellite-image-processing/
• https://earthobservatory.nasa.gov/blogs/earthmatters/2020/04/09/nimbus-4-a-satellite-pioneer-in-ozone-research/#comment-4781
• https://www.eumetsat.int/satellite-observations-indian-ocean-tsunami
