
Acknowledgment

We would like to express our special thanks, warmth and appreciation to the persons below who
made this internship successful and assisted us at every point to achieve our goal. First, we would
like to take this opportunity to express our profound gratitude and deep regard to Mr
Lemi, training team leader, for his exemplary effort in coordinating this internship program in
the institute. We would like to express our warm and deep thanks to Mr. Mesfin, GPS/GNSS
trainer, who gave us the internship course, sacrificing his time and job, and who motivated us to
select a good, advanced and knowledge-rich internship course. Finally, we would like to pay
our sincere gratitude to Mr. Yohannes, higher Geo-Information researcher and trainer, for his
valuable feedback and constant encouragement throughout the duration of the internship. His
suggestions were invaluable and helped us throughout the internship. His perceptive
criticism kept us working on this internship program in a much better way. Working under his
supervision was an extremely enriching experience for us.
Executive summary
The traditional, and largest, application of photogrammetry is to extract topographic information
(e.g., terrain models) from aerial images. Planimetric elements in geography are those features
that are independent of elevation, such as roads, building footprints, and rivers and lakes. They
are represented on two-dimensional maps as they are seen from the air, or in aerial photography.
These features are often digitized from orthorectified aerial photography into data layers that can
be used in analysis and cartographic outputs.

A topographic map is a type of map characterized by large-scale detail and quantitative
representation of relief, using contour lines. Traditional definitions require a topographic map to
show both natural and man-made features. A topographic map is typically published as a map
series.

Photogrammetry is preferred for extracting geometrical and qualitative information and producing
maps, because it is cheaper than terrestrial methods and offers a high speed of map generation.

List of acronyms

EGII: Ethiopian Geospatial Information Institute

GII: Geospatial Information Institute


FDRE: Federal Democratic Republic of Ethiopia

GCP: Ground Control Point

GPS: Global Positioning System

GIS: Geographic Information System

ERDAS: Earth Resource Data Analysis System

R&D: Research and Development

2D: two dimensional

3D: three-dimensional.

i.e.: that is; meaning

A: Autograph of first order (Wild)

PG: photogrammetric instrument (Kern)

DEM: Digital Elevation Model

DTM: Digital Terrain Model

DSM: Digital Surface Model

Table of contents

LIST OF FIGURES
Figure 1:- trilateration......................................................................................................................7
Figure 2:- GPS/GNSS Segments.....................................................................................................7
Figure 3:- Levelling and old types of total station instruments.......................................................9
Figure 4:- old GPS antenna tripod...................................................................................................9
Figure 5:- GPS antenna and internal battery..................................................................................10
Figure 6:- receiver, controller and meter.......................................................................................10
Figure 7:- Sources of GPS Errors..................................................................................................13
Figure 8:- aerial photographs.........................................................................................................15
Figure 9:- Optical axis aerial photographs.....................................................................................15
Figure 10:- classifications of aerial photograph............................................................................17
Figure 11:- Overlap: end lap/forward lap......................................................................................17
Figure 12:- Overlap: end lap/forward lap orthophoto....................................................................18
Figure 13:- 2D photogrammetry group..........................................................................................18
Figure 14:- 3d photogrammetry group..........................................................................................19
Figure 15:- Aerial photo deformation............................................................................................19
Figure 16:- types of GIS data(1)....................................................................................................24
Figure 17:- types of GIS data(2)....................................................................................................24
Figure 18:- types of GIS data(3)....................................................................................................24
Figure 19:- process of remote sensing...........................................................................................25
Figure 20:- Electromagnetic Spectrum..........................................................................................27
Figure 21:- Aerial photographs......................................................................................................37
Figure 22:- The procedures we have been using in performing the work task..............................39
Figure 23:- creation of new project area........................................................................................39
Figure 24:- defined camera model.................................................................................................40
Figure 25:- The exterior orientation..............................................................................................42
Figure 26:- Output of the added imagery......................................................................................42
Figure 27:- DTM extraction and contour.......................................................................................44
Figure 28:- Orthorectification of the images.................................................................................45
Figure 29:- Sub setting the orthophotos.........................................................................................46
Figure 30:- The Mosaic process....................................................................................................46
Figure 31:- subsetting the mosaic..................................................................................................47
Figure 32:- The Orthophoto...........................................................................................................49

LIST OF TABLES
Table 1:- GPS Data Collection Methods.......................................................................................11
Table 2:-The three GPS points......................................................................................................38
Table 3:-The GCP points...............................................................................................................38

CHAPTER ONE
BACKGROUND OF ETHIOPIAN GEOSPATIAL INFORMATION
INSTITUTE

1.1 Introduction
The Ethiopian Geospatial Information Institute was originally established in 1954 as the
Mapping and Geography Institute under the then Ministry of Education and Fine Arts, to
prepare maps for teaching purposes in schools. Later, in 1962, its status and mandate were
enhanced to include the preparation of topographic maps, and its administration was transferred
to the Ministry of the Interior while it kept its name as the Mapping and Geography Institute of
Ethiopia.

Since then it has passed through various organizational stages until its establishment, on October 20,
1980, as the Ethiopian Mapping Agency under Proclamation No. 193/1980. Later, in view
of advances in information technology, the need to re-establish the institute with enhanced and
updated mandates was recognized and, accordingly, the institute was re-established as the
Ethiopian Geospatial Information Institute under Council of Ministers Regulation No. 440/2018
of December 25, 2018. The main responsibility of GII, as described in the regulation, is the generation,
compilation, analysis, production/publication, administration and distribution of the following
fundamental geospatial data sets in Ethiopia:

• GCP

• Remotely sensed imagery (aerial photographs, satellite imagery)

• Topographic /base maps

• Thematic maps (transportation, utilities and services, the natural environment, and
tourist maps etc.)

• Hypsography (contours, DEM, spot heights, etc.)

• Hydrography (lakes, rivers, streams etc.)

• Administrative boundaries (international, national, regional, zonal, woreda, etc.)

• Atlases (national, regional)

• Geographic names

Furthermore, GII is the authorized institute of the FDRE empowered to regulate and supervise
the quality and standards of geospatial information products and producers in Ethiopia. It is also
mandated to provide basic training on different geospatial information production and
dissemination methods and techniques.

Based on its mandates, the institute has been producing various fundamental and thematic
geospatial information products and services that have played a significant role in the development
endeavour of our country. Its products to date include topographic maps (1:2,000,000, 1:1,000,000,
1:250,000 and 1:50,000 scale) and large-scale maps of some areas of interest, national atlases, GCPs
and thematic maps used for the socio-economic development of the country. As a result of the
current, continued and fast economic and social development in Ethiopia, the need for reliable
and timely geospatial information has become very high. To satisfy these growing needs, the
institute is working hard to produce and avail demand-driven, reliable, quality and timely
authoritative geospatial information products in Ethiopia.
1.2 The Core Processes of The Institute
The organizational structure of EGII consists of 14 directorates and the office of the director general.
Among the 14 directorates, 8 are support processes and 6 are the core processes of the institute.
The six core processes of the institute are the surveying directorate, the mapping directorate, the GIS
and remote sensing directorate, the quality and standards directorate, the information communication
technology directorate, and the training, research and development directorate. These are the technical
directorates that are mainly responsible for the collection, analysis, production, dissemination and
control of the geospatial information products.

1.2.1 Directorate of Surveying


The directorate has three teams:
1. the geodetic survey team
2. large scale survey team
3. office engineering and survey computation team

1.2.2 Directorate of Mapping


The directorate has five teams, namely:
1. Digital photogrammetry team
2. Digital cartography team
3. Digital Ortho photo team
4. Reprography team
5. Geographical names collection and gazetteer team

1.2.3 Directorate of GIS And Remote Sensing


• The directorate has three teams, namely:
1. Remote sensing team
2. Map revision and analogue to digital conversion team
3. GIS and thematic team
1.2.4 Directorate of Quality And Standards
• The directorate has two teams, namely:
1. Quality control team
2. Quality assurance team

1.2.5 Information Communication Technology Directorate


• The directorate has two teams, namely:
1. Data base administration team and
2. System administration team

1.2.6 Training, R&D Directorate


• The directorate has two teams, namely
1. Training team and
2. R&D team

1.3 Main services delivered by GII


1. Producing different scale topographic maps
2. Producing different thematic maps based on demand
3. Providing geo-referencing, scanning and digitizing service by using GIS technology
4. Providing remotely sensed satellite data
5. Establishing and administering national GCP network
6. Establishing GCP based on request
7. Controlling the quality and standard of geo-spatial information products in Ethiopia
8. Certifying qualified experts and institution working in geospatial sectors
9. Providing basic, medium and short-term training on GIS, remote sensing, mapping, surveying
and other related geospatial information fields
10. Providing advisory service for public and private sectors on any geospatial information
production and utilization
1.4 Main customers of GII
• Sugar corporates
• Municipalities
• National defense force
• Governmental project implementations
• Private organizations establishing their institutions
• Road construction
• Tourists
• Hotels
• Every governmental or non-governmental institute
• Governmental and private schools etc.

CHAPTER TWO
THE OVERALL INTERNSHIP EXPERIENCE

2.1 How we got into the institute


Based on our interest and the information given by the department, we started searching for the
right institute. It is easy to understand that GII is one of the biggest institutes in Ethiopia, as
can be seen from its background history. Based on this information, we applied to the
institute to be accepted as trainees and, luckily, we were accepted in the department of
photogrammetry. As we can understand from its definition, photogrammetry is the “art, science
and technology of obtaining reliable information about physical objects and the environment
through the process of recording, measuring and interpreting photographic images and patterns
of electromagnetic radiant imagery and other phenomena”.
Photogrammetry is the basis for the production of orthophotos. Photogrammetry is used in
fields such as topographic mapping, architecture, engineering, manufacturing, quality control,
police investigation, cultural heritage, geology, etc.

2.2 Objective of the Internship

2.2.1 General objective


This internship program is intended to link students with industry so that they gain practical and
operational knowledge in various sections of the industry.

2.2.2 Specific objective


• To practice on photogrammetry

• To develop new knowledge regarding photogrammetry and map production

• To get updated knowledge of the modern photogrammetric stage

• To see the pre-modern stages of photogrammetry

• To enable students to produce orthophotos and create 2D maps

• To make us proficient in digital photogrammetry

2.3 The work flow of the department


The workflow of the department is connected with other departments in the institute, such as the
surveying, cartography and remote sensing departments. Once the aerial photographs are taken, the
surveyors establish GCPs in the area and deliver them to the photogrammetry department. The
professionals in this department then use them to produce orthophotos, and these orthophotos are
digitized in this department. Finally, the outputs are sent to the cartography department to produce
well-designed maps.

2.4 Review on GPS/GNSS


In this part of the training we gained essential knowledge about what GPS/GNSS means and
how and why it is used.
GNSS (Global Navigation Satellite System) is the generic term for satellite navigation systems
that provide autonomous geospatial positioning with global coverage. This term includes GPS,
GLONASS, Galileo, BeiDou, IRNSS and other regional networks. The advantage of having access
to multiple satellite systems is accuracy, redundancy, and availability at all times. Though satellite
systems do not often fail, if one fails, GNSS receivers can pick up signals from other systems. Also,
if the line of sight is obstructed, having access to multiple satellites is a benefit.

2.4.1 How does GPS/GNSS work?


• A satellite circles the Earth and transmits a signal

• The signal contains the time it was sent and the satellite's location

• The signal travels at the speed of light

2.4.2 Trilateration
Trilateration is the process of measuring the distance from at least three satellites. Three
satellites give a 2D position (latitude and longitude); four or more satellites give a 3D
position (latitude, longitude, and altitude). Each GPS satellite transmits data that indicates its
location and the current time. All GPS satellites synchronize operations so that these repeating
signals are transmitted at the same instant. The transmission time is subtracted from the time the
signal is received at the receiver, and the distance is calculated; the signals travel at the speed of
light. When the receiver estimates the distance to at least four GPS satellites, it can calculate its
position in three dimensions.
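As a rough illustration of the trilateration calculation described above, the short Python sketch below converts hypothetical signal travel times into ranges and solves for a receiver position by least squares. The satellite coordinates and travel times are invented illustrative values (not real ephemeris data), and the receiver clock error is ignored for simplicity, although a real GPS solution estimates it as a fourth unknown.

import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Hypothetical satellite positions (m) and measured signal travel times (s)
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,  6_100e3, 18_390e3],
])
travel_times = np.array([0.0695, 0.0703, 0.0720, 0.0711])
ranges = C * travel_times  # travel time * speed of light = distance

# Gauss-Newton iteration for the receiver position
x = np.zeros(3)  # initial guess at the Earth's centre
for _ in range(10):
    diffs = x - sats                       # vectors from each satellite to the receiver
    dists = np.linalg.norm(diffs, axis=1)  # modelled ranges for the current guess
    J = diffs / dists[:, None]             # Jacobian of the range w.r.t. position
    dx, *_ = np.linalg.lstsq(J, ranges - dists, rcond=None)
    x += dx

print("Estimated receiver position (m):", x)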
Figure 1:- trilateration
2.4.3 GPS/GNSS Segments
There are 3 types of segments

• Space segment.

• Control segment.

• User segment.

Figure 2:- GPS/GNSS Segments


Space Segment

GPS satellites fly in circular orbits at an altitude of 20,200 km with a period of 12 hours and are
powered by solar cells. The satellites continuously orient themselves to point their solar panels
toward the sun and their antennas toward the Earth. The orbital planes are centered on the Earth,
and the orbits are designed so that at least six satellites are always within line of sight from any
location on the planet.

Control Segment
The control segment consists of 3 entities:
Master Control Station:- The master control station, located at Falcon Air Force Base in
Colorado Springs, is responsible for overall management of the remote monitoring and
transmission sites. It can reposition satellites to maintain an optimal GPS constellation.
Monitor Stations:- The monitor stations check the exact altitude, position, speed, and overall
health of the orbiting satellites; through them the control segment ensures that the GPS satellite
orbits and clocks remain within acceptable limits. A station can track up to 11 satellites at a time,
and this "check-up" is performed twice a day by each station as the satellites complete their
journeys around the Earth. The monitor stations are located at Falcon Air Force Base in Colorado;
Cape Canaveral, Florida; Hawaii; Ascension Island in the Atlantic Ocean; and Diego Garcia Atoll
in the Indian Ocean.
Ground Antennas:- Ground antennas monitor and track the satellites from horizon to
horizon. They also transmit correction information to individual satellites and communicate with
the GPS satellites for command and control purposes.
User Segment
GPS receivers are generally composed of an antenna (tuned to the frequencies transmitted by the
satellites), receiver-processors, and a highly stable clock (commonly a crystal oscillator). They can
also include a display for showing location and speed information to the user. A receiver is often
described by its number of channels, which signifies how many satellites it can monitor
simultaneously. Modern receivers usually have between twelve and twenty channels.
2.4.4 Levelling and old types of total station instruments

Figure 3:- Levelling and old types of total station instruments


2.4.5 GPS instruments

Figure 4:- old GPS antenna tripod


Figure 5:- GPS antenna and internal battery

Figure 6:- receiver, controller and meter


2.4.6 GPS Data Collection Methods
Various methods are used to collect high precision differential GPS data. The particular method
used depends on several factors, including survey objectives, desired precision, available
equipment, and field logistics. Higher precision typically requires a more rigorous field
methodology and longer occupation times. The following table shows the features of the most
common GPS survey methods.
Table 1:- GPS Data Collection Methods

Cores:- Stations are continuously operating, long-term or permanent GNSS station installations
involving immobile monumentation and sustainable power, and often involving data telemetry.
They can be used as pre-existing base stations in campaign surveys (static, rapid static, and
kinematic).
Static:-Surveys are regional, sub-cm precision GNSS surveys with portable equipment and are
the standard campaign data collection method for crustal deformation surveys. They typically
involve occupying each point for several days to get the highest possible accuracy. Collect at
least 6 hours of simultaneous data per day for processing and repeat benchmark occupations if
possible.
Rapid static:-Surveys are static surveys with just enough survey time at each point to be able to
resolve the carrier phase integer ambiguity. A rule of thumb is to collect data for a minimum of
10 minutes per point, and add one minute of occupation time per kilometer of baseline length
over 10 kilometers. For example, on an eight-kilometer baseline collect at least 10 minutes of
data and on a 28-kilometer baseline collect at least 28 minutes of data.
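The rule of thumb above can be written as a small helper function; the Python sketch below simply encodes the 10-minute minimum plus one extra minute per kilometre of baseline beyond 10 km and reproduces the two examples given in the text.

def rapid_static_minutes(baseline_km: float) -> float:
    # At least 10 minutes per point, plus one minute per kilometre
    # of baseline length over 10 kilometres.
    return 10.0 + max(0.0, baseline_km - 10.0)

# Examples from the text: 8 km -> 10 minutes, 28 km -> 28 minutes
for km in (8, 28):
    print(km, "km baseline ->", rapid_static_minutes(km), "minutes")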
Kinematic:- surveys are local surveys (<10 km) using mobile GNSS equipment for the purpose
of mapping features or of measuring point locations where several cm of precision is sufficient.
At least two receiver set-ups are required: a base (stationary) unit and one or more rover (mobile)
units. Kinematic surveys rely on continuous tracking to resolve the integer ambiguity; while the
rover receiver/antenna may be moving during the surveys, continuous lock on the satellite
signals must be maintained. Since the data processing software is able to both resolve the
ambiguity and track the antenna motion, fixed-integer solutions are obtained nearly
instantaneously.
Post-processing kinematic (PPK):- refers to surveys without communication between the base
and rover receivers. Processing the data after data collection is required. There are no
navigational capabilities in PPK surveys.
Real-time kinematic (RTK):- refers to surveys in which the base and rover receivers
communicate corrections in real-time via a radio link. This requires additional hardware (base
and rover radios) and additional power, and generally limits the survey to an area of several km,
but eliminates the need for data processing and enables navigational capabilities.
Single Point positioning:-Uses only data from a single receiver to determine its coordinates. The
collected data are averaged, and longer occupations significantly increase the accuracy. This
method is very coarse, but sometimes it is the only way to determine base station coordinates
while in the field.
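As a simple illustration of this averaging, the Python sketch below averages a set of hypothetical epoch-by-epoch single-receiver fixes; the coordinate values are invented, and straightforward component-wise averaging is assumed to be acceptable because the fixes scatter over a very small area.

import numpy as np

# Hypothetical single-receiver fixes: latitude (deg), longitude (deg), height (m)
fixes = np.array([
    [9.00012, 38.76231, 2355.2],
    [9.00015, 38.76228, 2354.1],
    [9.00010, 38.76234, 2356.0],
    [9.00013, 38.76230, 2355.5],
])

mean_position = fixes.mean(axis=0)  # the averaged coordinate
spread = fixes.std(axis=0)          # rough indication of the precision
print("Mean position:", mean_position)
print("Standard deviation per component:", spread)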

2.4.7 Sources of GPS Errors


In order to effectively gather precise and accurate data, it is necessary to understand the potential
sources of error that can affect GPS data quality.

Multipath:- Errors caused by reflected GPS signals arriving at the GPS receiver, typically as a
result of nearby structures or other reflective surfaces (e.g. buildings, water). Signals traveling
longer paths produce higher (erroneous) pseudorange estimates and, consequently, positioning
errors. The user should be aware that multipath errors are not detectable or correctable with
recreational-grade GPS receivers. Some mapping-grade GPS receivers, as well as most or all
survey-grade GPS receivers, have antennas and software capable of minimizing multipath signals.

Atmospheric :-GPS signals can experience some delays while traveling through the atmosphere.
Common atmospheric conditions that can affect GPS signals include tropospheric delays and
ionospheric delays. Tropospheric delays have the capability of introducing a minimum of 1-
meter variance. The troposphere is the lower part (from ground level to 13 km) of the
atmosphere that experiences the changes in temperature, pressure, and humidity associated with
weather changes. Complex models of tropospheric delay require estimates or measurements of
these parameters.
Distance from Base Station:-While differential correction will increase the quality of the data,
accuracy is degraded slightly as the distance from the base station increases. Users should use the
nearest base station to where the data is being collected.

Noise:-Noise error is the distortion of the satellite signal prior to reaching the GPS receiver
and/or additional signal “piggybacking” onto the GPS satellite signal. All three grades of GPS
receivers are capable of suffering from noise error. The amount of error due to noise cannot be
determined.

Figure 7:- Sources of GPS Errors


2.5 Definition of photogrammetry
Photogrammetry is the art, science and technology of obtaining reliable information about
physical objects and the environment through processes of recording, measuring and interpreting
photographic images and patterns of electromagnetic radiant energy and other phenomena. It is
an art because obtaining reliable measurements requires certain skills, techniques and judgments to
be made by an individual. It is a science and a technology because it takes an image and
transforms it, via technology, into meaningful results.

The word "photogrammetry" is derived from three Greek words: PHOTOS, meaning "light";
GRAMMA, meaning "something drawn or written"; and METRON, meaning "to measure". The root
words therefore originally signified measuring graphically by means of light. On this basis,
photogrammetry has until recently been defined as the science or art of obtaining reliable
measurements by means of photographs.
2.5.1 Photo Interpretation Elements
There are seven photo interpretation elements. These are: tone, shape, size, pattern, texture,
shadow and site/association.

Tone/hue:- Tone refers to the relative brightness of a black-and-white image; hue refers to the color
on the image. For example, different types of rocks, soils or vegetation are likely to have different
tones, and variations in moisture content are reflected as tone differences.

Shape or form:- characterizes many objects visible in an image. The shape of objects often helps
to determine the character of the object (built-up areas, roads, agricultural fields, rivers, etc.).

Size:- the size of an object can be considered in a relative or absolute sense: the relative size of an
unknown object in relation to a known object (e.g. single-lane road vs. multi-lane road, railway
vs. road), or the absolute measurement of the object's surface area.

Pattern:- refers to the spatial arrangement of objects and implies the characteristic repetition of
certain forms or relationships (a river with its branches, car parking, patterns related to erosion,
irrigation type, planned versus unplanned settlement, etc.).

Texture:- relates to the frequency of tonal change. Texture may be described by terms such as
coarse or fine, smooth or rough, even or uneven. A pattern on a large-scale image may show as
texture on a small-scale image of the same scene.

Shadow

Site/Association

• Association: refers to the fact that a combination of objects makes it possible to infer
about its meaning or function (interpretation of hotel based on the combined recognition
of swimming pool, many rooms, garden, large building, parking area)

• Site: relates to the topographic or geographic location (a means of transportation on a lake
is likely to be a boat; we would not expect a car)
2.5.2 Aerial photographs
Aerial photography is defined as the art, science and technology of taking photographs from an
airborne platform.

Figure 8:- aerial photographs


Aerial photographs can be classified on the basis of optical axis, scale, film and angular coverage.

2.5.3 Optical axis aerial photographs


• Vertical photographs- the camera tilt is up to ±3°

• Low oblique photographs- the camera tilt is 3-15°

• High oblique photographs- the camera tilt is >15°

Figure 9:- Optical axis aerial photographs


2.5.4 Classification of aerial photographs
Classification of aerial photographs based on scales

• Large scale photographs 1:5,000-1:20,000

• Medium scale photographs 1:20,000-1:50,000

• Small scale photographs smaller than 1:50,000

Note: scale classification may differ from country to country

Classification of aerial photographs based on film

• Black and white aerial photographs (panchromatic)

• Color aerial photographs

• Infrared aerial photographs

Classification of aerial photographs based on angular coverage

• Narrow angle: angle of coverage less than 60°

• Normal angle: angle of coverage of 60-75°

• Wide angle: angle of coverage of 75-100°

• Super wide angle: angle of coverage greater than 100°


Figure 10:- classifications of aerial photograph
2.5.5 Overlap: end lap/forward lap
End lap or forward lap is the overlapping of successive photos along a flight line. For
stereoscopic viewing, (60 ± 5)% end lap is necessary. In hilly areas, 70-90% end lap is necessary
because of coverage gaps produced by tilt, crab, flying height variation, terrain variation, etc.
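To show how the end lap relates to the spacing between successive exposures, the Python sketch below computes the air base for an assumed 23 cm square frame, a 1:15,000 photo scale and 60% end lap; all three values are illustrative assumptions, not figures taken from this project's flight plan.

def air_base_m(scale_denominator: float, frame_side_m: float = 0.23,
               end_lap: float = 0.60) -> float:
    # Ground distance between successive exposure stations along the flight line.
    ground_coverage = scale_denominator * frame_side_m  # side of one photo on the ground
    return ground_coverage * (1.0 - end_lap)

# A 1:15,000 photo covers about 3450 m on a side; with 60% end lap the
# aircraft exposes a new photo roughly every 1380 m.
print(round(air_base_m(15_000), 1), "m between exposures")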

Figure 11:- Overlap: end lap/forward lap


Figure 12:- Overlap: end lap/forward lap orthophoto
2.5.6 The digital photogrammetry group
In the digital photogrammetry team we saw how 2D maps are produced using orthophotos.

Figure 13:- 2D photogrammetry group


2.5.7 The 3D photogrammetry group
In the 3D photogrammetry team we saw how the generated DTM is used and how river lines,
mountains, etc. are drawn to produce 3D maps.
Figure 14:- 3d photogrammetry group
2.5.8 Aerial photo deformation
An aerial photo can be deformed due to the motion of the aircraft at the time of photography: the
aircraft may roll (wing up and wing down) or pitch (nose up and down, or tail up and down).

Figure 15:- Aerial photo deformation


2.5.9 Distortion and Displacement
Distortion in aerial photography is defined as any shift in the position of an image on a
photograph that alters the perspective characteristics of the image. Displacement is any shift in
the position of an image on a photograph that does not alter the perspective characteristics of the
photograph.

Reasons for Distortions

• Movement of camera

• Instability of aircraft

• Variation in altitude , tilt and speed

• Curvature of earth

• Rotation of Earth

• Perspective view

2.6 Cartography
Cartography is not a standalone field of study; it interacts with different fields of study. Nowadays
it works closely with technology, especially software technologies. The aim of this manual is to
help and assist employees in executing cartographic tasks and to remind them of some forgotten
rules while doing their job.

2.6.1 Types of Map


Physical Maps: Physical maps show natural features such as relief, geology, soils,
drainage, elements of weather, climate and vegetation, etc.

Cultural Maps: Cultural maps show man-made features. These include a variety of maps
showing population distribution and growth, sex and age, social and religious composition,
literacy, levels of educational attainment, occupational structure, location of settlements,
facilities and services, transportation lines and production, distribution and flow of different
commodities.
2.6.2 Processes of Map
Map-making, does include a series of processes that are common to all the maps. These
processes referred to as elements of maps. These are Scale, Map Projection, Map Generalization,
Map Design, Map Construction and Production.

Scale: We know that all maps are reductions. The first decision that a map-maker has to take is
about the scale of the map. The scale of a map sets limits of information contents and the degree
of reality with which it can be delineated on the map. On the basis of scale, maps may be
classified into Large scale 500 - <50,000,

Medium scale 50,000 - 250,000 and

Small scale >=250,000
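The scale classes above can be encoded as a small helper for checking which class a given map belongs to; the Python sketch below uses the thresholds listed in this section.

def classify_map_scale(denominator: int) -> str:
    # Large scale up to 1:50,000, medium scale up to 1:250,000, small scale beyond.
    if denominator < 50_000:
        return "large scale"
    if denominator < 250_000:
        return "medium scale"
    return "small scale"

for d in (2_000, 50_000, 250_000, 1_000_000):
    print(f"1:{d:,} -> {classify_map_scale(d)}")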

Projection: We also know that maps are a simplified representation of the three-
dimensional surface of the earth on a plane sheet of paper. The transformation of all-side-curved-
geoidal surface into a plane surface is another important aspect of the cartographic process.

Generalization: It is the process of reducing the amount of details in a map. The process of
generalization is normally executed when the map scale has to be reduced. In the
process of cartographic generalization many details are omitted. Generalization in the process of
compilation of the chart content is called cartographic generalization.

Map Design: It involves the planning of graphic characteristics of maps including the selection
of appropriate symbols, their size and form, style of lettering, specifying the width of lines,
selection of colors and shades, arrangement of various elements of map design within a map and
design for map legend.

Map Construction and Production: In earlier times, much of the map construction and
reproduction work used to be carried out manually. Maps were drawn with pen and ink and
printed mechanically. However, the map construction and reproduction has been revolutionized
with the addition of computer assisted mapping and photo-printing techniques.
2.7 GIS (Geographic Information System)

2.7.1 Definition
Geographic Information Science presents a framework for using information theory, spatial
analysis and statistics, cognitive understanding, and cartography. It focuses on the processes and
methods that are used to sample, represent, manipulate and present information about the world.

2.7.2 Basic Concept of GIS


• Literal Definition- Geographic relates to the surface of the earth. Information is a
knowledge derived from study, experience, or instruction. System is a group of
interacting, interrelated, or interdependent elements forming a complex whole. Science is
the observation, identification, description, experimental investigation, and theoretical
explanation of phenomena.

• Functional Definition - GIS is a system for inputting, storing, manipulating, analyzing,


and reporting data.

• Component Definition - GIS is an organized collection of computer hardware, software,


geographic data, procedures, and personnel designed to handle all phases of geographic
data capture, storage, analysis, query, display, and output.

2.7.3 Functions of GIS


• Data collection- Capture data

• Data storing, processing & analysis- Store data Query data Analyze data

• Output production- Display data Produce output

2.7.4 Basic Elements of GIS


People- People are the most important part of a GIS. They define and develop the procedures used
by a GIS and can overcome shortcomings of the other four elements (data, software, hardware,
procedures), but not vice-versa.

Data- Data is the information used within a GIS. Since a GIS often incorporates data from
multiple sources, data accuracy defines the quality of the GIS, and GIS quality determines the types
of questions and problems that may be asked of the GIS.
GIS software- This encompasses not only the GIS package, but all the software used for
databases, drawings, statistics, and imaging. The functionality of the software used to manage
the GIS determines the type of problems that the GIS may be used to solve. The software used
must match the needs and skills of the end user.

Popular GIS software- Vector-based GIS: ArcGIS (ESRI), ArcView, MapInfo; raster-based
GIS: Erdas Imagine (Leica), ENVI (RSI), ILWIS (ITC), IDRISI.

Hardware- The type of hardware determines, to an extent, the speed at which a GIS will
operate. Additionally, it may influence the type of software used. To a small degree, it may
influence the types/ personalities of the people working with the GIS.

Procedures/ Methods- The procedures used to input, analyze, and query data determine the
quality and validity of the final product.

2.7.5 Types of GIS Data


Vector- In the vector data model, features on the earth are represented as points, lines and
polygons.

Raster- In the raster data model, a geographic feature like land cover is represented as single
square cells.

Attribute- Attribute values in a GIS are stored as relational database tables. Each feature (point,
line, polygon, or raster) within each GIS layer will be represented as a record in a table.
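The following short Python sketch illustrates these three kinds of GIS data with simple, invented structures: vector geometries stored as coordinate lists, a raster stored as a grid of cell values, and an attribute table stored as records keyed to the vector features.

# Vector features: coordinate geometry
point = {"geometry": (38.7578, 9.0301)}                               # x, y
line = {"geometry": [(38.75, 9.03), (38.76, 9.04), (38.77, 9.04)]}
polygon = {"geometry": [(38.75, 9.03), (38.76, 9.03),
                        (38.76, 9.04), (38.75, 9.04), (38.75, 9.03)]}

# Raster: each square cell holds a single value (e.g. a land-cover class code)
land_cover = [
    [1, 1, 2, 2],
    [1, 3, 2, 2],
    [3, 3, 3, 2],
]

# Attribute table: one record per vector feature
attributes = [
    {"feature_id": 1, "type": "point",   "name": "survey benchmark"},
    {"feature_id": 2, "type": "line",    "name": "road centreline"},
    {"feature_id": 3, "type": "polygon", "name": "building footprint"},
]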
Figure 16:- types of GIS data(1)

Figure 17:- types of GIS data(2)

Figure 18:- types of GIS data(3)


2.8 Fundamental of Remote Sensing

2.8.1 What is Remote Sensing?


Remote sensing is an art and science of obtaining information about an object or feature without
physically coming in contact with that object or feature. Humans apply remote sensing in their
day-to-day business, through vision, hearing and sense of smell. The data collected can be of
many forms: variations in acoustic wave distributions (e.g., sonar), variations in force
distributions (e.g., gravity meter), variations in electromagnetic energy distributions (e.g., eye)
etc. These remotely collected data through various sensors may be analyzed to obtain
information about the objects or features under investigation. In this course we will deal with
remote sensing through electromagnetic energy sensors only.
Thus, remote sensing is the process of inferring surface parameters from measurements of the
electromagnetic radiation (EMR) from the Earth’s surface. This EMR can either be reflected or
emitted from the Earth’s surface. In other words, remote sensing is detecting and measuring
electromagnetic (EM) energy emanating or reflected from distant objects made of various
materials, so that we can identify and categorize these objects by class or type, substance and
spatial distribution [American Society of Photogrammetry, 1975].

Remote sensing provides a means of observing large areas at finer spatial and temporal
frequencies. It finds extensive applications in civil engineering including watershed studies,
hydrological states and fluxes simulation, hydrological modeling, disaster management services
such as flood and drought warning and monitoring, damage assessment in case of natural
calamities, environmental monitoring, urban planning etc.

Figure 19:- process of remote sensing


A: Energy Source or Illumination - the first requirement for remote sensing is to have an energy
source which illuminates or provides electromagnetic energy to the target of interest.
B: Radiation and the Atmosphere - as the energy travels from its source to the target, it will come
in contact with and interact with the atmosphere it passes through. This interaction may take
place a second time as the energy travels from the target to the sensor.

C: Interaction with the Target - once the energy makes its way to the target through the
atmosphere, it interacts with the target depending on the properties of both the target and the
radiation.

D: Recording of Energy by the Sensor - after the energy has been scattered by, or emitted from
the target, we require a sensor (remote - not in contact with the target) to collect and record the
electromagnetic radiation.

E: Transmission, Reception, and Processing - the energy recorded by the sensor has to be
transmitted, often in electronic form, to a receiving and processing station where the data are
processed into an image (hardcopy and/or digital).

F: Interpretation and Analysis - the processed image is interpreted, visually and/or digitally or
electronically, to extract information about the target which was illuminated.

G: Application - the final element of the remote sensing process is achieved when we apply the
information we have been able to extract from the imagery about the target in order to better
understand it, reveal some new information, or assist in solving a particular problem.

2.8.2 The Electromagnetic Spectrum


The electromagnetic (EM) spectrum is the continuous range of electromagnetic radiation,
extending from gamma rays (highest frequency and shortest wavelength) to radio waves (lowest
frequency and longest wavelength), and including visible light. The EM spectrum can be divided
into seven regions: the shorter wavelengths (gamma rays, X-rays), ultraviolet, visible light,
infrared, and the longer wavelengths (microwaves and radio waves). Remote sensing involves the
measurement of energy in many parts of the EM spectrum. The major regions of interest in
satellite sensing are visible light, reflected and emitted infrared, and the microwave regions. The
measurement of this radiation takes place in what are known as spectral bands. A spectral band is
defined as a discrete interval of the EM spectrum. For example, the wavelength range of 0.4 μm to
0.5 μm (μm = micrometre, or 10⁻⁶ m) is one spectral band.
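Since wavelength and frequency are related by c = f × λ, the example band of 0.4-0.5 μm can be converted to frequency with the short Python sketch below; an approximate value of the speed of light is assumed.

C = 3.0e8  # approximate speed of light, m/s

def wavelength_to_frequency(wavelength_m: float) -> float:
    # c = f * lambda  ->  f = c / lambda
    return C / wavelength_m

f_at_0_4um = wavelength_to_frequency(0.4e-6)
f_at_0_5um = wavelength_to_frequency(0.5e-6)
print(f"0.4-0.5 um corresponds to roughly "
      f"{f_at_0_5um / 1e12:.0f}-{f_at_0_4um / 1e12:.0f} THz")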

Figure 20:- Electromagnetic Spectrum


The ultraviolet or UV portion of the spectrum has the shortest wavelengths which are practical
for remote sensing. This radiation is just beyond the violet portion of the visible wavelengths,
hence its name. Some Earth surface materials, primarily rocks and minerals, fluoresce or emit
visible light when illuminated by UV radiation.

The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. It is
important to recognize how small the visible portion is relative to the rest of the spectrum. There
is a lot of radiation around us which is "invisible" to our eyes, but can be detected by other
remote sensing instruments and used to our advantage. The visible wavelengths cover a range
from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is
violet. Common wavelengths of what we perceive as particular colors from the visible portion of
the spectrum are listed below. It is important to note that this is the only portion of the spectrum
we can associate with the concept of colors.

The electromagnetic spectrum can be divided into several wavelength (frequency) regions,
among which only a narrow band from about 400 to 700 nm is visible to the human eyes. Note
that there is no sharp boundary between these regions. The boundaries shown in the above
figures are approximate and there are overlaps between two adjacent regions.

Wavelength units: 1 mm = 1000 µm; 1 µm = 1000 nm.

• Radio Waves: 10 cm to 10 km wavelength.

• Microwaves: 1 mm to 1 m wavelength. The microwaves are further divided into different
frequency (wavelength) bands (1 GHz = 10⁹ Hz):

• P band: 0.3 - 1 GHz (30 - 100 cm)

• L band: 1 - 2 GHz (15 - 30 cm)

• S band: 2 - 4 GHz (7.5 - 15 cm)

• C band: 4 - 8 GHz (3.8 - 7.5 cm)

• X band: 8 - 12.5 GHz (2.4 - 3.8 cm)

• Ku band: 12.5 - 18 GHz (1.7 - 2.4 cm)

• K band: 18 - 26.5 GHz (1.1 - 1.7 cm)

• Ka band: 26.5 - 40 GHz (0.75 - 1.1 cm)

• Infrared: 0.7 to 300 µm wavelength. This region is further divided into the following
bands:

• Near Infrared (NIR): 0.7 to 1.5 µm.

• Short Wavelength Infrared (SWIR): 1.5 to 3 µm.

• Mid Wavelength Infrared (MWIR): 3 to 8 µm.

• Long Wavelength Infrared (LWIR): 8 to 15 µm.

• Far Infrared (FIR): longer than 15 µm.


The NIR and SWIR are also known as the Reflected Infrared, referring to the main
infrared component of the solar radiation reflected from the earth's surface. The MWIR
and LWIR are the Thermal Infrared.

• Visible Light: This narrow band of electromagnetic radiation extends from about 400 nm
(violet) to about 700 nm (red). The various color components of the visible spectrum fall
roughly within the following wavelength regions:

• Red: 610 - 700 nm

• Orange: 590 - 610 nm

• Yellow: 570 - 590 nm

• Green: 500 - 570 nm

• Blue: 450 - 500 nm

• Indigo: 430 - 450 nm

• Violet: 400 - 430 nm

• Ultraviolet: 3 to 400 nm

• X-Rays and Gamma Rays

Blue, green, and red are the primary colors or wavelengths of the visible spectrum. They are
defined as such because no single primary color can be created from the other two, but all other
colours can be formed by combining blue, green, and red in various proportions.

The visible portion of this radiation can be shown in its component colors when sunlight is passed
through a prism, which bends the light in differing amounts according to wavelength.
The infrared (IR) region covers the wavelength range from approximately 0.7 µm to 100 µm -
more than 100 times as wide as the visible portion! The infrared region can be divided into two
categories based on their radiation properties - the reflected IR, and the emitted or thermal IR.
Radiation in the reflected IR region is used for remote sensing purposes in ways very similar to
radiation in the visible portion. The reflected IR covers wavelengths from approximately 0.7 µm
to 3.0 µm. The thermal IR region is quite different than the visible and reflected IR portions, as
this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat.
The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm.

The portion of the spectrum of more recent interest to remote sensing is the microwave region
from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing. The
shorter wavelengths have properties similar to the thermal infrared region while the longer
wavelengths approach the wavelengths used for radio broadcasts.

2.8.3 Radiation- Target Interactions

Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the
Earth's surface. There are three (3) forms of interaction that can take place when energy strikes, or
is incident (I) upon, the surface. These are: absorption (A), transmission (T), and reflection (R).
The total incident energy will interact with the surface in one or more of these three ways. The
proportions of each will depend on the wavelength of the energy and the material and condition of
the feature.
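This energy balance can be written as I = A + T + R. The small Python sketch below uses invented fractions to show how the reflected portion, which is the part a remote sensor can measure, is obtained once the absorbed and transmitted portions are known.

# Illustrative fractions only, not measured values
incident = 1.0
absorbed = 0.55
transmitted = 0.10
reflected = incident - absorbed - transmitted  # the part available to the sensor

assert abs(absorbed + transmitted + reflected - incident) < 1e-9
print(f"Reflected fraction (reflectance): {reflected:.2f}")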

Absorption (A) occurs when radiation (energy) is absorbed into the target while transmission
(T) occurs when radiation passes through a target. Reflection (R) occurs when radiation
"bounces" off the target and is redirected. In remote sensing, we are most interested in measuring
the radiation reflected from targets. We refer to two types of reflection, which represent the two
extreme ends of the way in which energy is reflected from a target: specular reflection and
diffuse reflection.
When a surface is smooth we get specular or mirror-like reflection where all (or almost all) of
the energy is directed away from the surface in a single direction. Diffuse reflection occurs when
the surface is rough and the energy is reflected almost uniformly in all directions. Most earth
surface features lie somewhere between perfectly specular or perfectly diffuse reflectors.
Whether a particular target reflects specularly or diffusely, or somewhere in between, depends on
the surface roughness of the feature in comparison to the wavelength of the incoming radiation.
If the wavelengths are much smaller than the surface variations or the particle sizes that make up
the surface, diffuse reflection will dominate. For example, fine grained sand would appear fairly
smooth to long wavelength microwaves but would appear quite rough to the visible wavelengths.

Let's take a look at a couple of examples of targets at the Earth's surface and how energy at the
visible and infrared wavelengths interacts with them

Leaves: A chemical compound in leaves called chlorophyll strongly absorbs radiation in the red
and blue wavelengths but reflects green wavelengths. Leaves appear "greenest" to us in the summer, when
chlorophyll content is at its maximum. In autumn, there is less chlorophyll in the leaves, so there
is less absorption and proportionately more reflection of the red wavelengths, making the leaves
appear red or yellow (yellow is a combination of red and green wavelengths). The internal
structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths. If our
eyes were sensitive to near-infrared, trees would appear extremely bright to us at these
wavelengths. In fact, measuring and monitoring the near-IR reflectance is one way that scientists
can determine how healthy (or unhealthy) vegetation may be.

Water: Longer wavelength visible and near infrared radiation is absorbed more by water than
shorter visible wavelengths. Thus water typically looks blue or blue-green due to stronger
reflectance at these shorter wavelengths, and darker if viewed at red or near infrared wavelengths.
If there is suspended sediment present in the upper layers of the water body, then this will allow
better reflectivity and a brighter appearance of the water.

By measuring the energy that is reflected (or emitted) by targets on the Earth's surface over a
variety of different wavelengths, we can build up a spectral response
for that object. By comparing the response patterns of different features we may be able to
distinguish between them, where we might not be able to, if we only compared them at one
wavelength. For example, water and vegetation may reflect somewhat similarly in the visible
wavelengths but are almost always separable in the infrared. Spectral response can be quite
variable, even for the same target type, and can also vary with time (e.g. "greenness" of leaves)
and location. Knowing where to "look" spectrally and understanding the factors which influence
the spectral response of the features of interest are critical to correctly interpreting the interaction
of electromagnetic radiation with the surface.
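To make the idea of a spectral response concrete, the Python sketch below compares illustrative (not measured) reflectance values for water and vegetation at one visible and one near-infrared wavelength; it shows why the two targets can look similar in the visible but are easy to separate in the infrared, as described above.

# Illustrative reflectance fractions at two wavelengths
targets = {
    "water":      {"green (0.55 um)": 0.06, "near-IR (0.85 um)": 0.02},
    "vegetation": {"green (0.55 um)": 0.08, "near-IR (0.85 um)": 0.45},
}

for name, response in targets.items():
    print(name, response)

# The visible values differ by only a couple of percent, while the
# near-infrared values differ by more than an order of magnitude.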
CHAPTER THREE
THE OVERALL BENEFITS OF THE PROJECT
After we joined the institute for the internship program, we were able to develop different skills
in:

3.1 Theoretical knowledge:


We were able to gain a deep understanding of what photogrammetry is; the definition and types of
photogrammetry; the materials used in photogrammetry; the other disciplines with which
photogrammetry works; the preliminary stages for photogrammetric applications; the things that
shall be considered in aerial photograph corrections; the type of photogrammetry used in digital
photogrammetry, i.e. vertical photogrammetry; the purposes for which oblique photogrammetry is
used; the time it used to take in analogue and analytical photogrammetry to produce orthophotos
and maps; and why photogrammetry is used.

3.2 Practical knowledge:


• We were able to use ERDAS IMAGINE software in the production of Orthophoto

• We used ArcGIS software in extraction of Topographic features in the Orthophoto

• We were able to be familiar with the above two software


• Get knowledge about different sections of the software

• Be familiar with different types of shortcuts in the software

• Building 3D features

• Animating features and recording video of the animation using ArcGIS software

• We got new knowledge in creating new project area

• Defining the camera model

• Adding the imagery to the block file

• Generating automatic tie points

• Performing Interior and Exterior orientation

• Triangulating images

• DTM and Contour generation

• Orthorectification of images

• Performing Mosaic etc.

3.3 The benefits we got in terms of interpersonal communication skills, team playing skills,
leadership skills, understanding work ethics and entrepreneurship skills
Above all that is listed above, we were able to improve our interpersonal communication skills,
which the internship led us to do. In this internship program we were able to help each other
in every aspect of the program, such as software usage, knowledge sharing, etc.

This internship program made us know the role of individuals in a group, made us help each
other as a group member.

This internship made us know different work ethics such as punctuality, honesty, patience, self-
discipline, commitment to every work task, willingness to learn, taking initiative, loyalty, and
maximizing productivity.
Additionally, the internship program made us face the problems that arise in a working place and
helped us to bring solutions for each problem that may come up in any working place.

CHAPTER FOUR
THE DIGITAL PHOTOGRAMMETRY TASK

4.1 INTRODUCTION
Digital photogrammetry is photogrammetry applied to digital images that are stored and
processed on a computer.
Digital photogrammetry is sometimes called softcopy photogrammetry. Digital images can be
scanned from photographs or directly captured by digital cameras.

Many photogrammetric tasks can be highly automated in digital photogrammetry (e.g., automatic
DEM extraction and digital orthophoto generation).
The output products are in digital form, such as digital maps, DEMs, and digital orthophotos
saved on computer storage media.

Single or pairs of digital images are loaded into a computer with image processing capabilities.
Images may come from satellite or airborne scanners or CCD cameras, or they may be
conventional photographs digitized by a scanner.
Digital photogrammetric systems use digitized photographs or digital images as the primary
source of input.
Digital imagery can be obtained in various ways, including:
• digitizing existing hardcopy photographs

• using digital cameras to record imagery

• using sensors onboard satellites such as Landsat, SPOT, and IRS to record imagery

4.1.1 Types of photogrammetry


Based on the camera location photogrammetry can be divided into two
1. Aerial photogrammetry
2. Close-range photogrammetry

Aerial photogrammetry

The camera is mounted in an aircraft and is usually pointed vertically towards the ground
Multiple overlapping photos of the ground are taken as the aircraft flies along a flight path.

Close-range photogrammetry

The camera is close to the subject and is typically hand-held or mounted on a tripod. Usually this
type of photogrammetry work is non-topographic; that is, the output is not topographic products
like terrain models or topographic maps, but instead drawings and 3D models. Everyday cameras
are used to model buildings, engineering structures, vehicles, forensic and accident scenes, film
sets, etc.

4.1.2 Types of photographs


Vertical photograph

• Camera axis is < 3° from vertical

• Vertical photographs are commonly used for topographic and planimetric mapping projects, and
are commonly captured from an aircraft or satellite

• The camera is mounted in an aircraft and is usually pointed vertically towards the ground

• Multiple overlapping photos of the ground are taken as the aircraft flies along a flight path

Oblique Photograph

Camera axis is > 3° from vertical


Low oblique—no horizon

High oblique—includes horizon


Single frame orthorectification
These techniques orthorectify one image at a time using a technique known as space resection. In
this respect, a minimum of three GCPs is required for each image. For example, in order to
orthorectify 50 aerial photographs, a minimum of 150 GCPs is required. Since we have three
photographs, six GCP points were used in this project.

Block triangulation
Block triangulation is the process of establishing a mathematical relationship between the images
contained in a project, the camera or sensor model, and the ground. The information resulting from
aerial triangulation is required as input for the orthorectification, DEM creation, and stereo pair
creation processes. The term aerial triangulation is commonly used when processing aerial
photography and imagery, while the term block triangulation, or simply triangulation, is used when
processing satellite imagery.

4.2 Methodology
4.2.1 Type of Data used in the project
Aerial photographs
We used three aerial photographs, i.e. vertical photographs, of the Bole-Bulbula area from the
year 2002, scanned from diapositives (strip 1:3, photographs 4094, 4093 and 4092, with 60%
overlap). The flight direction was the reverse direction.
Figure 21:- Aerial photographs
Ground Surveying
Surveying levels, total stations and ground GPS units can be used for the measurement of 3D
information pertaining to the Earth's surface. All of the 3D points are used to interpolate a 3D
surface of the specific area of interest.

Table 2:-The three GPS points


GCP (Ground Control Point)

Ground Control Points (GCPs) are determined either by conventional survey, from published maps,
or by aerotriangulation. Depending on the type of algorithmic correction to be used, a minimum of
3 to 5 GCPs must be established. The relationship between x-y photo-coordinates and real-world
GCP coordinates is then used to determine the algorithm for resampling the image.
We have six GCP points for the three photographs, for the reason stated above.
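As a simplified sketch of how such a photo-to-ground relationship can be expressed, the Python example below fits a first-order (affine) transform between hypothetical image coordinates and ground coordinates by least squares. The GCP values are invented, and the actual orthorectification in this project was performed with the rigorous sensor model in ERDAS IMAGINE rather than with a simple polynomial like this.

import numpy as np

# Hypothetical GCPs: (image_x, image_y) in pixels and (ground_X, ground_Y) in metres
image_xy = np.array([[120, 450], [980, 430], [510, 1210], [1450, 1190]], dtype=float)
ground_XY = np.array([[473200.0, 987400.0], [473630.0, 987410.0],
                      [473395.0, 987020.0], [473865.0, 987030.0]])

# Design matrix for X = a0 + a1*x + a2*y (and likewise for Y)
A = np.column_stack([np.ones(len(image_xy)), image_xy])
coeff_X, *_ = np.linalg.lstsq(A, ground_XY[:, 0], rcond=None)
coeff_Y, *_ = np.linalg.lstsq(A, ground_XY[:, 1], rcond=None)

def image_to_ground(x, y):
    # Map an image coordinate to ground coordinates with the fitted transform
    return (coeff_X @ [1.0, x, y], coeff_Y @ [1.0, x, y])

print(image_to_ground(700, 800))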
Table 3:-The GCP points
4.2.2 Types of software used in the project
• We used ERDAS IMAGINE 15 software to produce the orthophoto, and

• ArcGIS software to produce the 2D map of the area by digitizing the orthophoto.

The procedures we have been using in performing the work task

Figure 22:- The procedures we have been using in performing the work task
Figure 23:- creation of new project area

Figure 24:- defined camera model


4.3 Interior Orientation

Interior orientation defines the internal geometry of a camera or sensor as it existed at the time of
image capture. The variables associated with image space are defined during the process of
defining interior orientation. It is primarily used to transform the image pixel coordinate system, or
another image coordinate measurement system, to the image space coordinate system.

The internal geometry of a camera is defined by specifying the following variables:

• Principal point

• Focal length

• Fiducial marks

• Lens distortion

4.3.1 Principal Point and Focal Length


The principal point is mathematically defined as the intersection of the perpendicular line
through the perspective center with the image plane. The distance from the principal point to the
perspective center is called the focal length.

4.3.2 Lens Distortion


Lens distortion deteriorates the positional accuracy of image points located on the image plane.
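
A common way to model symmetric radial lens distortion is with an odd-powered polynomial in the radial distance. The sketch below applies such a correction in Python; the coefficients are purely illustrative, and the real values and their sign convention come from the camera calibration report.

import numpy as np

def correct_radial_distortion(x, y, k1, k2):
    """Apply a symmetric radial correction of the form dr = k1*r^3 + k2*r^5
    to photo coordinates (in mm). k1 and k2 are calibration coefficients;
    whether this removes or re-applies the distortion depends on the sign
    convention used in the calibration report."""
    r2 = x ** 2 + y ** 2
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2
    return x * factor, y * factor

# Illustrative (non-calibrated) coefficients applied to one measured point
x_corr, y_corr = correct_radial_distortion(50.0, -40.0, k1=-2.0e-8, k2=1.0e-12)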

4.3.3 Fiducial Marks


One of the steps associated with calculating interior orientation involves determining the image
position of the principal point for each image in the project. The image positions of the fiducial
marks are measured on the image, and then compared to the calibrated coordinates of each
fiducial mark.

Since the image space coordinate system has not yet been defined for each image, the measured
image coordinates of the fiducial marks are referenced to a pixel or file coordinate system. The
pixel coordinate system has an x coordinate (column) and a y coordinate (row). The origin of the
pixel coordinate system is the upper left corner of the image having a row and column value of 0
and 0, respectively.

Fiducial orientation

Fiducial orientation defines the relationship between the image/photo coordinate system of a frame and the actual image orientation as it appears within a view. The image/photo coordinate system is defined by the camera calibration information. The orientation of the image is largely dependent on the way the photograph was scanned during the digitization stage. In this project the photographs were taken in the reverse direction, so the image is rotated 180° relative to the photo coordinate system.
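
The pixel-to-photo-coordinate step described above can be sketched as a small least-squares fit: the measured pixel positions of the fiducial marks are related to their calibrated photo coordinates by a 2D affine transformation. All the numbers below are hypothetical; the project measured the fiducials in ERDAS.

import numpy as np

# Measured fiducial positions in the scanned image: (column, row), hypothetical
pixel = np.array([[  212.0,   215.0],
                  [15788.0,   220.0],
                  [15784.0, 15790.0],
                  [  208.0, 15786.0]])

# Calibrated photo coordinates of the same fiducials in mm (from a camera report)
photo = np.array([[-106.0,  106.0],
                  [ 106.0,  106.0],
                  [ 106.0, -106.0],
                  [-106.0, -106.0]])

# Fit x_photo = a0 + a1*col + a2*row and y_photo = b0 + b1*col + b2*row
A = np.column_stack([np.ones(len(pixel)), pixel])
coeff_x, *_ = np.linalg.lstsq(A, photo[:, 0], rcond=None)
coeff_y, *_ = np.linalg.lstsq(A, photo[:, 1], rcond=None)

def pixel_to_photo(col, row):
    """Transform a pixel measurement into photo coordinates (mm). The same kind
    of transform also absorbs the 180-degree rotation noted for this project."""
    return (coeff_x[0] + coeff_x[1] * col + coeff_x[2] * row,
            coeff_y[0] + coeff_y[1] * col + coeff_y[2] * row)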

4.4 Exterior Orientation


Exterior orientation defines the position and angular orientation of the camera that captured an image.

The variables defining the position and orientation of an image are referred to as the elements of exterior orientation; they define the characteristics associated with an image at the time of exposure or capture. The positional elements of exterior orientation are Xo, Yo, and Zo. They define the position of the perspective center (O) with respect to the ground space coordinate system (X, Y, and Z). Zo is commonly referred to as the height of the camera above sea level, which is commonly defined by a datum. The angular or rotational elements of exterior orientation describe the relationship between the ground space coordinate system (X, Y, and Z) and the image space coordinate system (x, y, and z). Three rotation angles are commonly used to define angular orientation: omega (ω), phi (ϕ), and kappa (κ).
Omega is a rotation about the photographic x-axis, phi is a rotation about the photographic y-axis, and kappa is a rotation about the photographic z-axis. Rotations are defined as positive if they are counter-clockwise when viewed from the positive end of their respective axis. Different conventions are used to define the order and direction of the three rotation angles.
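
One common way to build the rotation matrix from omega, phi and kappa is the sequential convention sketched below in Python with NumPy. As noted above, other orderings and sign conventions exist, so this is only one possibility, and the angle values are invented.

import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix M = M_kappa @ M_phi @ M_omega from the three rotation
    angles (in radians): omega about x, phi about y, kappa about z."""
    Mo = np.array([[1.0, 0.0, 0.0],
                   [0.0,  np.cos(omega), np.sin(omega)],
                   [0.0, -np.sin(omega), np.cos(omega)]])
    Mp = np.array([[np.cos(phi), 0.0, -np.sin(phi)],
                   [0.0, 1.0, 0.0],
                   [np.sin(phi), 0.0,  np.cos(phi)]])
    Mk = np.array([[ np.cos(kappa), np.sin(kappa), 0.0],
                   [-np.sin(kappa), np.cos(kappa), 0.0],
                   [0.0, 0.0, 1.0]])
    return Mk @ Mp @ Mo

# Example with small, invented tilt angles and a near-90-degree kappa
M = rotation_matrix(np.radians(0.5), np.radians(-0.3), np.radians(92.0))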

Figure :- The exterior orientation

Figure :- Output of the added imagery


4.5 GCP (Ground Control Point)
GCPs are instrumental in establishing an accurate relationship between the images in a project, the camera/sensor, and the ground. They are identifiable features located on the Earth's surface whose ground coordinates in X, Y, and Z are known.

The following features on the Earth’s surface are commonly used as GCPs:

• Intersection of roads

• Utility infrastructure (e.g., fire hydrants and manhole covers)

• Intersection of agricultural plots of land

• Survey benchmarks

4.6 Tie Points


A tie point is a point whose ground coordinates are not known, but which is visually recognizable in the overlap area between two or more images. The corresponding image positions of tie points appearing in the overlap areas of multiple images are identified and measured.

Ground coordinates for tie points are computed during block triangulation. Nine tie points in each image are adequate for block triangulation. Based on this, we created 60 tie points.

4.7 Generate and Edit DTM


A digital terrain model is a topographic model of the bare earth (terrain relief) that can be manipulated by computer programs.

The data files contain the spatial elevation data of the terrain in a digital format, usually presented as a rectangular grid. Vegetation, buildings and other man-made (artificial) features are removed digitally.

A Digital Surface Model (DSM) is usually the main product produced from photogrammetry; it contains all the features, both man-made and natural, mentioned above, while a filtered DSM results in a DTM. A digital terrain model (DTM) is a 3D digital representation of the Earth's terrain or topography.
Automatic DTM extraction involves the automatic extraction of elevation information from imagery and the subsequent creation of a 3D digital representation of the Earth's surface. A DTM represents the elevation associated with the Earth's topography and not necessarily the human-made (e.g., buildings) or natural (e.g., trees) features located on the Earth's surface.

A digital surface model (DSM) represents the elevation associated with the Earth's surface
including topography and all natural or human-made features located on the Earth’s surface. The
primary difference between a DSM and a DTM is that the DTM represents the Earth’s terrain
whereas a DSM represents the Earth's surface.
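
The difference between the two surfaces can be made concrete with a tiny numeric example (the heights below are invented): subtracting the DTM from the DSM isolates the above-ground features.

import numpy as np

# Hypothetical 3 x 3 height grids in metres; the centre DSM cell is a building roof
dsm = np.array([[2105.0, 2105.5, 2106.0],
                [2105.2, 2113.4, 2106.3],
                [2105.5, 2106.0, 2106.5]])
dtm = np.array([[2105.0, 2105.5, 2106.0],
                [2105.2, 2105.8, 2106.3],
                [2105.5, 2106.0, 2106.5]])

# The difference (often called a normalized DSM) shows the building as about
# 7.6 m above the bare earth, while open ground stays at 0 m
ndsm = dsm - dtm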

DTMs can only be extracted if two or more overlapping images are available. Prior to the
automatic extraction of DTMs, sensor model information associated with an image must be
available. This includes internal and external sensor model information.
Figure :- DTM extraction and contour

4.8 Orthorectification
Orthorectification is the process of reducing geometric errors inherent in photography and imagery. An image or photograph with an orthographic projection is one for which every point looks as if an observer were looking straight down at it, along a line of sight that is orthogonal (perpendicular) to the Earth.
The resulting orthorectified image is known as a digital orthoimage.

A few points of geo-referencing are not sufficient for precise spatial GIS files, particularly in steep and undulating terrain, because topography can distort the photos. A digital elevation model (DEM) therefore has to be superimposed on the photos, which can be done only with specific software. Normally, this process requires a few ground control points (GCPs).
The outputs of this process are orthophotos, in which every location of the photo is geographically correct and located with its coordinates.
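
Conceptually, the software projects every ground cell of the output orthoimage back into the source photo through the collinearity equations quoted earlier, looks up the grey value there, and writes it into the output cell. The sketch below shows only that back-projection step; it is a simplified illustration, not the ERDAS algorithm, and all parameter values would come from the triangulation results.

import numpy as np

def ground_to_photo(X, Y, Z, X0, Y0, Z0, M, f, x_p=0.0, y_p=0.0):
    """Project a ground point (X, Y, Z) into photo coordinates using the
    collinearity equations. M is the 3x3 rotation matrix from omega, phi,
    kappa; (X0, Y0, Z0) is the perspective centre; f is the focal length in
    mm; (x_p, y_p) is the principal point."""
    dX, dY, dZ = X - X0, Y - Y0, Z - Z0
    den = M[2, 0] * dX + M[2, 1] * dY + M[2, 2] * dZ
    x = x_p - f * (M[0, 0] * dX + M[0, 1] * dY + M[0, 2] * dZ) / den
    y = y_p - f * (M[1, 0] * dX + M[1, 1] * dY + M[1, 2] * dZ) / den
    return x, y

# Orthorectification loop (conceptual): for every output ground cell, read the
# DEM height Z, call ground_to_photo, convert the photo coordinates to a pixel
# position with the interior orientation, and resample the grey value found there.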

In flat areas, it might be sufficient to use more GCPs without a full orthorectification process, but quality has to be carefully checked in all areas of the photo.
Satellite image
Images taken by satellites (QuickBird, GeoEye, IKONOS, etc.) follow a very similar process. They are always digital (acquired by scanners) and always georeferenced (sometimes called 'georectified' by the imagery providers).
Figure :- Orthorectification of the images

Figure :- Subsetting the orthophotos


4.9 Creation of the Mosaic
The process of mosaic construction involves:

• Assembling the mosaic on a mosaic board; the size of the board is limited only by convenience and the availability of space.

• Assembling the photos, starting from the center of the board, by pricking the radial control points.

• Slight touch-up or blending to achieve a happy medium of tones in the final mosaic.

• Cartographic treatment to annotate and describe information, including border information, grid designation, scale, author, title, north arrow, sheet layout, and body information.

A digital alternative to these manual steps is sketched after this list.
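
As a digital alternative to the board-assembly steps above, overlapping orthophotos can also be mosaicked programmatically. The sketch below uses the rasterio library; the file names are hypothetical, and the project itself produced its mosaic in ERDAS.

import rasterio
from rasterio.merge import merge

# Hypothetical file names for the three orthophotos produced earlier
paths = ["ortho_4092.tif", "ortho_4093.tif", "ortho_4094.tif"]
sources = [rasterio.open(p) for p in paths]

# Stitch the overlapping orthophotos into one array plus its georeferencing
mosaic, transform = merge(sources)

meta = sources[0].meta.copy()
meta.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)

with rasterio.open("bole_bulbula_mosaic.tif", "w", **meta) as dst:
    dst.write(mosaic)

for src in sources:
    src.close()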
Figure :- The Mosaic process

Figure :- Subsetting the mosaic


4.10 The Result
4.10.1 Orthophoto Maps
An orthophoto is a photographic re-projection, prepared from a photograph, in which displacements due to tilt and terrain relief have been removed, so that it has the same properties as a map, including a known scale. It is a photo that shows images of objects in their true orthographic positions.

4.10.2 Uses of Orthophoto


• Urban planning and management: orthophotos are much quicker to produce than conventional maps, which matters because urban data must frequently be updated.

• Engineering design and surveying: orthophotos can be used as maps for making direct measurements of distances, angles, positions, areas and volumes.

• Resources and environmental management: forest coverage and extent, agricultural practices (e.g. mode of farming), settlements and their direction of expansion, waste management, etc. are easily handled with orthophotos.

• Military use: orthophotos help in the planning of combat operations, since the nature of the topography, heights, coordinates, distances and angles for trajectories can be determined precisely.

• Regional and settlement planning

• Architectural planning

• Route location and planning, as a substitute for maps.

• Other specific uses include:

• Creation and updating of a Geographic Information System (GIS).

• Used as base maps for other features

• Parcel mapping: orthophotos make excellent bases for cadastral and tax mapping

• Dispute resolution

• Used to produce cadastral plans in areas where photogrammetric applications are preferred
• Property valuing

• Land registration

• Planning

• Detection of land use and land cover change

• Development of Land Information Systems (LIS)

Orthophoto production is based on the principle of differential rectification, which involves dividing the photograph into many tiny surface patches known as differential elements. Each of these elements is rectified independently to a common scale. As a result, image displacement due to photographic tilt and terrain relief is eliminated, and the perspective projection of the photograph is changed to an orthogonal projection, hence an orthophotograph.

Figure :- The Orthophoto


CHAPTER FIVE
CONCLUSION AND RECOMMENDATION

5.1 Conclusion
This internship was aimed at enabling students to practice in industry so that they gain technical and operational knowledge in the various sections of an industry. GII is one of the leading institutions for photogrammetry in Ethiopia, so we applied in order to gain practical knowledge of photogrammetry at the institute. As a result, we were accepted and gained good practical experience.

5.2 Recommendation
As trainees, we faced several problems in the institution during our internship period. We recommend that the institute provide up-to-date software for training students, instead of relying on manual exercises, and that it provide internet or Wi-Fi access for the interns.


Appendices
Definition of terms used in the document

GIS: A computer-based system to aid in the collection, maintenance, storage, analysis, output,
and distribution of spatial data and information.

DEM: a 3D representation of a terrain's surface, commonly of a planet.

DSM: used in landscape modeling, city modeling and visualization applications.

DTM: required for flood or drainage modeling, land use studies, geological applications, and other applications, including planetary science.

2D: Typically applied to GIS applications that view their phenomena in two-dimensional space
where coordinates are pairs (x, y).

3D: Typically applied to GIS applications that view their phenomena in three-dimensional space,
where coordinates are triplets (x, y, and z).
