A Proposed Method of Systematic Geometric Correction For LAPAN-A4 Satellite Data
Abstract. Indonesia's fourth microsatellite, LAPAN-A4, is planned to be launched at the end of 2022. The satellite will
carry nine payloads, two of which are line imager sensors named SLIM4 and ELLISA. The payloads have four
multispectral bands: R, G, B, and NIR. In order to obtain an accurate image, a correct and precise image
processing technique is required. One of these image processing steps is geometric correction. Geometric correction, or
georeferencing, is the process of associating the image geometry with the actual magnitudes and units on earth, which is done
by equating the position of the target location in the image with the precise location on earth. There are two types of
georeferencing: the rigorous sensor model (physical model) and the generalized sensor model (empirical model). The
rigorous sensor model (RSM) is believed to be the most reliable method that can be applied to the overall image
generated by the imaging system. It is the best choice for areas where in situ ground control point (GCP) measurements
are not possible. It has been applied to SPOT-4/5/6/7 and Landsat-8 satellite data. This paper proposes a method of
systematic geometric correction for LAPAN-A4 satellite data generated by the SLIM4 and ELLISA sensors.
INTRODUCTION
The development of remote sensing technology currently allows the relatively high availability of image data with
various formats and different processing levels [1]. The various types of sensors developed for satellite-based
remote sensing and the increased availability of the data have led to the growing use of satellite imagery data for
various applications, such as agriculture, forestry, environmental monitoring, urban planning, military,
transportation, etc. However, the utilization of this satellite image data depends on several factors, such as sensor
characteristics (geometric and radiometric qualities), the quality of the processing software used, the types of image
products available, and the quality of the final images. One of the primary obstacles that greatly affects the
utilization of satellite imagery is the sensor model, specifically whether the sensor model used can provide a high level of
geometric correction through the orientation of the resulting image.
The 9th International Seminar on Aerospace Science and Technology – ISAST 2022
AIP Conf. Proc. 2941, 030034-1–030034-8; https://doi.org/10.1063/5.0181438
Published by AIP Publishing. 978-0-7354-4755-4/$30.00
The sources of distortion that affect the quality of the image acquired by remote sensing satellites can generally
be categorized into two types, distortion from the acquisition system and distortion from the atmospheric refraction
[2]. The distortion of the acquisition system includes the orientation and the movement of the platform, as well as
the optical-geometric characteristics of the imaging sensor, whereas the distortion of atmospheric refraction is
related to the large number of particles in the atmosphere that refract the light. Distortions caused by the atmospheric
refraction are usually neglected for operation around nadir pointing because their effect is relatively small on the
geometric quality of the resulting image. Noerdlinger (1999) showed that atmospheric refraction causes ground
displacements of 0.55, 1.22, and 2.22 m for off-nadir angles of 10°, 20°, and 30°, respectively, using a very approximate
atmosphere model [3]. Meanwhile, the distortion caused by the acquisition system is quite influential on the
geometric quality of the image. For this reason, an appropriate orientation-based sensor model, also known as the
geometric model, is required for the systematic geometric correction of satellite images. This modeling is a
fundamental step in photogrammetric processing and very important for analysis and multi-source data fusion.
Geometric modeling describes the relationship between image and ground coordinates for a given sensor.
Various methods have been developed to formulate the mathematical relations between image and ground
coordinates [4]. In principle, there are two different types of orientation-based sensor model, namely the physical
sensor model (also called the rigorous sensor model) and the generalized sensor model (also called the empirical
model or the rational function model) [2]. In the physical model, the image is correlated with the ground coordinates
by using the collinearity equations, and each involved parameter has a physical meaning. This method requires
knowledge of the specific satellite, orbit, and sensor characteristics. On the contrary, the empirical model or rational
function model (RFM) links the image and the terrain coordinates by using the rational polynomial coefficients
(RPC). This model is usually based on the rational polynomial functions (RPF) and does not need knowledge about
sensor and acquisition features [2].
The empirical method formulates mathematical relations through statistical matching or fitting of several
coordinates of image points to known coordinates of their paired points on the earth's surface. The mathematical
relations built are generally in the form of polynomials, which are then used to calculate all other image points in the
have not been carried out further for the in-house LAPAN satellites. Previous research conducted by the former
Remote Sensing Technology and Data Center, LAPAN (now Research Center for Remote Sensing, BRIN) tried to
implement the RSM for its in-house aerial camera (Inderaku-2A) [12, 13]. Moreover, the former Satellite Technology
Center, LAPAN (now Research Center for Satellite Technology, BRIN) has conducted several studies on systematic
image preprocessing for LAPAN-A3 satellite data, whose pushbroom imager has a 15-meter resolution
and a 120-kilometer swath width. These studies yielded georeferencing methods with a mean
accuracy of 3.80 km in the across-track and 10.69 km in the along-track direction using GPS and satellite
attitude data, as well as a mean accuracy of 781 m in the across-track direction of the red-band image only using an image
matching method [14, 15, 16]. The results of the georeferencing methods developed for LAPAN-A3 satellite data
are not as good as the accuracy of SPOT satellite image data using the RSM method. Hence, for the LAPAN-A4
satellite, which is planned to carry several pushbroom sensors, a geometric processing model that can produce
better results than that of LAPAN-A3 is needed to accommodate the characteristics of these sensors.
This paper focuses only on the main pushbroom sensor, ELLISA, carried by the LAPAN-A4 satellite and the
proposed systematic geometric correction based on the RSM.
LAPAN-A4 SATELLITE
The LAPAN-A4 satellite is the fourth microsatellite developed by the former Satellite Technology Center, LAPAN (now
Research Center for Satellite Technology, BRIN), and is planned to be launched at the end of 2022. It is designed
to fly in a sun-synchronous polar orbit at an altitude of 500 km. The satellite has missions for earth observation,
ship traffic monitoring, space magnetic field measurements, and thermal infrared acquisition [17]. The payloads carried
by this satellite are of two types: non-optical payloads and optical payloads.
There are four non-optical payloads: the Automatic Identification System (AIS), the Hybrid Fluxgate
Magnetometer (HFGM), the Internet of Things (IoT), and a stepper motor for the payload. As for the optical payloads, there
are five instruments, namely the Medium Resolution Imager – STTL Line Imager 4-band (SLIM4), the Experimental
ELLISA is an imager developed by the former LAPAN (now BRIN). It comes in a medium resolution (ELLISA-MR) and a
high resolution (ELLISA-HR) variant. The ELLISA-HR has three bands: Red (R), Near Infrared
(NIR), and Panchromatic (PAN), while the ELLISA-MR, similar to SLIM4, has four multispectral bands, namely
Red (R), Green (G), Blue (B), and Near Infrared (NIR).
Based on the standard photogrammetric approach, the rigorous sensor model (RSM) uses collinearity equations to
describe the physical-geometrical image acquisition. This model tries to describe the physical properties of the
image acquisition through the relationship between the image and ground coordinate systems. Since the sensor is a
pushbroom type, the image is formed by many individual lines. Each line is acquired with its own position and
attitude values and is the result of a perspective projection. Therefore, the mathematical model is based on
collinearity equations. All positions during image acquisition are related by the orbital dynamics. Thus, the RSM is
based on the reconstruction of the orbital segment during image acquisition through the knowledge of the internal
orientation (the acquisition mode and the sensor parameters) and external orientation (the satellite position and
attitude parameters). All the parameter values can be approximated by computation using information contained in
the image metadata file and then corrected by least-squares estimation processes. In order to link the image to the
ground coordinates in an Earth-Centered Earth-Fixed (ECEF) reference frame, a set of translation and rotation
matrices as well as the sensor attitudes are required. To determine the object location on the ground, several
transformations are conducted: from the image coordinate system to the detector coordinate system, from the detector to the
camera coordinate system, from the camera to the earth coordinate system, and from the earth to the object
coordinate system.
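The chain of coordinate transformations listed above can be viewed as a composition of rotations applied to a pixel viewing vector. The following Python sketch is purely illustrative; the matrix names are hypothetical placeholders, since the paper does not define them:

```python
import numpy as np

def pixel_direction_to_ecef(u_cam, R_cam_to_body, R_body_to_orbit, R_orbit_to_ecef):
    """Carry a pixel viewing direction from the camera frame to ECEF by
    chaining the frame rotations: camera -> satellite body -> local
    orbital frame -> Earth-Centered Earth-Fixed (ECEF).
    All rotation-matrix arguments are hypothetical names, not values
    defined by the paper."""
    return R_orbit_to_ecef @ R_body_to_orbit @ R_cam_to_body @ u_cam
```

With all three rotations set to the identity, the direction is unchanged, which is a convenient sanity check when wiring such a chain together.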
In principle, direct georeferencing is conducted to estimate the ground coordinates of the homologous points
measured in the image through the forward intersection using the internal orientation (the results of
calibration/laboratory measurements) and the external orientation (position and attitude data provided by the
geoposition/geolocation system carried by the platform). The homologous points are the points that are
simultaneously formed in an image acquisition process (snapshot). For the pushbroom sensor, based on the detection
array arrangement, the homologous points are a row of pixels that form an image line. The direct georeferencing
approach does not require any ground control points (GCP), except for final validations, and does not need to
estimate any additional parameters that model the interior and exterior orientations. For this reason, the effectiveness
and reliability of direct georeferencing methods based on the RSM depend on the accuracy of the internal and
external orientation data provided. This method can be formulated in a vector relation that represents the
geometric formation of the center of the earth and the viewing center between the sensor and the object point (see
Fig. 1).
The viewing point is the intersection point of the extension of the image pixel's vector view with the earth
model's surface.
$$\vec{OM} = \vec{OP} + \vec{PM} \quad \text{or} \quad \mathbf{s} - \mathbf{g} = \mu\mathbf{i} \qquad (1)$$
In Eq. (1), g is the position vector of the ground coordinates, s is the position vector of the sensor
when acquiring the image, and i is the unit vector of the pixel viewing the earth. For computational reasons, Eq. (1) is
written in the form of a vector element matrix:
$$\begin{bmatrix} X_P - X \\ Y_P - Y \\ Z_P - Z \end{bmatrix} = \mu \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} \qquad (2)$$

with $\mu$ as a scaling factor that scales the pixel viewing vector up to the earth model.
As a point on the earth's ellipsoid, the elements of the position vector of point M (X, Y, Z) also satisfy the equation of the
earth's ellipsoid:
$$\frac{X^2 + Y^2}{(a+h)^2} + \frac{Z^2}{(b+h)^2} = 1 \qquad (3)$$
with $a$ as the earth's semi-major axis, $b$ as the earth's semi-minor axis, and $h$ as the height of point M above the
earth model surface.
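Substituting Eq. (2) into the ellipsoid equation (3) yields a quadratic in the scale factor $\mu$, whose near-side root gives the visible ground point. A minimal Python sketch, assuming WGS84 default axes and the sign convention of Eq. (2), i.e. g = s − μi:

```python
import numpy as np

def intersect_ellipsoid(s, i, a=6378137.0, b=6356752.3142, h=0.0):
    """Intersect the viewing ray g = s - mu*i (Eq. 2) with the ellipsoid
    (X^2+Y^2)/(a+h)^2 + Z^2/(b+h)^2 = 1 (Eq. 3).
    s: sensor ECEF position (m), i: unit viewing vector per Eq. (2).
    Default a, b are WGS84 values (an assumption, not from the paper).
    Returns the ground point g (ECEF, metres) nearest the sensor."""
    A, B = a + h, b + h
    p = np.array([1.0 / A**2, 1.0 / A**2, 1.0 / B**2])
    # Substitute g = s - mu*i into the ellipsoid equation -> quadratic in mu
    qa = np.sum(p * i * i)
    qb = -2.0 * np.sum(p * s * i)
    qc = np.sum(p * s * s) - 1.0
    disc = qb**2 - 4.0 * qa * qc
    if disc < 0:
        raise ValueError("viewing ray does not hit the ellipsoid")
    # smaller root = near-side intersection (the visible surface point)
    mu = (-qb - np.sqrt(disc)) / (2.0 * qa)
    return s - mu * i
```

For a sensor 500 km above the equator looking straight down, the routine recovers a point on the equatorial radius, which matches the geometry of Eq. (3) with h = 0.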
The relation above is a simplification that assumes P to be the satellite center point (the center of mass),
coinciding with the center of view of the sensor system. In practice, P cannot always coincide perfectly with
the center of view of the sensor, and the satellite position sensor cannot always coincide with the center of mass of
the satellite either. Therefore, the direct georeferencing model must carefully identify the locations of these points
and then determine the vectors precisely. A translation matrix connects the coordinate system of each of these points
to a coordinate system centered on the sensor's center of view (the origin of the view vector i), together with the
construction in the sensor system that builds the internal view vector of the image pixels.
Image Orientation
The systematic geometric correction based on the RSM uses image orientation to derive the collinearity equations.
Image orientation is a viewing direction, or pixel viewing direction, that consists of the internal orientation, the
external orientation, and the final orientation.
The internal orientation, or internal viewing direction, is the viewing direction from the image which is built by
the image formation mechanism of the optical system inside the camera.
$$\vec{g}_k = \begin{bmatrix} 0 \\ \left(v - \frac{N_p}{2}\right) y_p \\ f \end{bmatrix} \qquad (4)$$

$$\vec{g}_{bk} = -\vec{g}_k = \begin{bmatrix} 0 \\ \left(\frac{N_p}{2} - v\right) y_p \\ -f \end{bmatrix} \qquad (5)$$
In these equations, $\vec{g}_{bk}$ is the internal viewing direction of an image pixel to the object on the ground according to the
camera coordinate system, f is the focal length of the camera lens, Np is the number of detector cells in an array, yp is
the detector pitch, and v is the column number of the image line.
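The internal viewing direction of Eq. (5) can be computed per image column. The following Python sketch is illustrative only; the parameter values in the example are placeholders, as the paper does not give the LAPAN-A4 calibration data:

```python
import numpy as np

def internal_viewing_vector(v, Np, yp, f):
    """Unit internal viewing direction g_bk of image column v (Eq. 5),
    in the camera frame assumed here: x along-track, y across-track,
    z toward the scene.
    Np: detector cells per line, yp: detector pitch (m), f: focal length (m)."""
    g_bk = np.array([0.0, (Np / 2.0 - v) * yp, -f])
    return g_bk / np.linalg.norm(g_bk)  # normalized for use as i in Eq. (1)
```

For the center column (v = Np/2) the across-track component vanishes and the vector reduces to the boresight direction, as expected from Eq. (5).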
The external orientation is all actions that cause the internal orientation to change from the original direction,
such as the tilt of the camera when it was installed on the satellite body and the attitudes of the satellite body (pitch,
yaw, roll) during image acquisition [19].
$$\mathbf{T}_{object-satellite} = \begin{bmatrix}
\cos\Psi\cos K & \sin\Omega\sin\Psi\cos K + \cos\Omega\sin K & \sin\Omega\sin K - \cos\Omega\sin\Psi\cos K \\
-\cos\Psi\sin K & \cos\Omega\cos K - \sin\Omega\sin\Psi\sin K & \sin\Omega\cos K + \cos\Omega\sin\Psi\sin K \\
\sin\Psi & -\sin\Omega\cos\Psi & \cos\Omega\cos\Psi
\end{bmatrix} \qquad (6)$$

with $\Psi$, $K$, and $\Omega$ as the pitch, yaw, and roll values of the satellite. These values can be obtained from the
satellite ephemeris data. This is a transformation from the satellite orientation to the local orbital orientation.
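The rotation of Eq. (6) can be assembled directly from the three attitude angles. A Python sketch follows; the angle-order convention is taken from the matrix as printed, and the actual LAPAN-A4 convention may differ:

```python
import numpy as np

def attitude_matrix(pitch, yaw, roll):
    """Rotation matrix of Eq. (6) built from pitch (Psi), yaw (K), and
    roll (Omega), all in radians, mapping satellite-frame vectors into
    the local orbital frame."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    ck, sk = np.cos(yaw), np.sin(yaw)
    co, so = np.cos(roll), np.sin(roll)
    return np.array([
        [ cp * ck,  so * sp * ck + co * sk,  so * sk - co * sp * ck],
        [-cp * sk,  co * ck - so * sp * sk,  so * ck + co * sp * sk],
        [ sp,      -so * cp,                 co * cp               ],
    ])
```

Since Eq. (6) is a pure rotation, the matrix should always be orthonormal with determinant 1, which is a useful self-test when implementing it.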
The final orientation is the viewing direction of image pixel in the earth reference system (the earth coordinate
system), a transformation from the orbital frame to Earth-Centered Earth-Fixed (ECEF).
$$\hat{X}_{object} = \frac{\vec{Y}_{object} \times \vec{Z}_{object}}{\|\vec{Y}_{object} \times \vec{Z}_{object}\|} \qquad (8)$$

$$\hat{Y}_{object} = \frac{\vec{Z}_{object} \times \vec{V}(t)}{\|\vec{Z}_{object} \times \vec{V}(t)\|} \qquad (9)$$

$$\hat{Z}_{object} = \frac{\vec{S}(t)}{\|\vec{S}(t)\|} \qquad (10)$$
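Equations (8)–(10) define the local orbital frame from the satellite's position and velocity. A small Python sketch, assuming S(t) and V(t) are the ECEF position and velocity vectors in consistent units:

```python
import numpy as np

def orbital_frame(S, V):
    """Local orbital frame axes from satellite position S(t) and velocity
    V(t), following Eqs. (8)-(10): Z along the position vector, Y normal
    to the plane spanned by Z and V, and X completing the triad."""
    Z = S / np.linalg.norm(S)                    # Eq. (10)
    Y = np.cross(Z, V)
    Y /= np.linalg.norm(Y)                       # Eq. (9)
    X = np.cross(Y, Z)
    X /= np.linalg.norm(X)                       # Eq. (8)
    return X, Y, Z
```

For a circular equatorial orbit the three axes reduce to the familiar radial, orbit-normal, and along-track directions, which is an easy case to verify by hand.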
DEVELOPMENT OF DIRECT GEOREFERENCING MODEL OF LAPAN-A4
To develop the direct georeferencing model based on the rigorous sensor model (RSM), the collinearity equations
are required. The formulation of a mathematical relation that expresses the collinearity of the pixel position vector
element in the image coordinate system relating to the object position vector element on the earth in the earth
coordinate system is carried out through the following steps. The first step is the formulation of the viewing vector
(the viewing direction) of pixels according to the camera coordinate system (the formulation of the internal
orientation). The next step is performing sequential transformations of the internal orientation into the viewing
vector according to a reference frame centered at the center of the earth by involving camera attitude data and all the
related coordinate systems. The last step is to intersect (forward intersection) the image pixel viewing vector with
the earth’s ellipsoid model to determine the position vector that represents the location of the image pixels in the
earth coordinate system. All the steps should be arranged, and the necessary data should be provided in order to
implement this model in an algorithm flow.
The steps to formulate the collinearity equations are as follows. First step is to define all related coordinate
systems, which are the image system, the sensor/detector system, the body/camera system, the flight/satellite system,
the orbital system, the Earth Centered Inertial (ECI) system, the Earth-Centered Earth-Fixed (ECEF) system, and the
geodetic local system. The next step is to define orbital parameters of the satellite orbit, the Keplerian elements that
consist of semi-major axis, orbit inclination, right ascension of the ascending node, eccentricity, true anomaly,
argument of the perigee, and time of the perigee passage. The approximate values of the Keplerian elements can be
computed based on the ephemeris information in the metadata file. Afterwards, the attitude angles of the sensor
during acquisition are defined. The information required is the pitch, yaw, and roll angles, which can be approximated
from the data in the metadata file. The approximate values can be corrected by modeling the data with a
second-order polynomial or a polynomial of another order; the appropriate order can be determined by performing
experiments. Finally, the coordinate system transformations are defined.
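The polynomial correction of the attitude angles suggested above can be sketched as a least-squares fit; the sampling times and angle values below are placeholders, not mission data:

```python
import numpy as np

def fit_attitude(times, angles, order=2):
    """Fit a low-order polynomial to sampled attitude angles (e.g. pitch)
    over the acquisition time, as suggested for smoothing the approximate
    metadata values. Returns a callable model angle(t)."""
    coeffs = np.polyfit(times, angles, order)  # least-squares fit
    return np.poly1d(coeffs)
```

In practice the fit order would be chosen by the experiments the paper mentions, comparing residuals for second-order and higher-order polynomials.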
To apply a direct georeferencing model based on the RSM to the LAPAN-A4 satellite, several data or technical requirements must be provided.
CONCLUSION
The rigorous sensor model (RSM) is the most widely used direct georeferencing method because it does not require
any ground control points except for evaluating the final results. The RSM produces a one-to-one mapping
relationship that is defined with certainty thanks to the position and attitude information of the sensor at the time of
imaging. This method has been applied for standard geometric processing of various pushbroom-type remote
sensing sensors, such as SPOT-4/5/6/7 and Landsat-8. For SPOT-6/7, the model used can
provide a location accuracy of up to 20 meters, exceeding the design performance of 50 meters.
Some data are required to apply this method to the LAPAN-A4 satellite, i.e., the technical data for the
internal orientation, the technical data to determine the external orientation by design, the sensor attitude (pitch, yaw,
and roll), the satellite's ephemeris data, the images and metadata, and the scanning repetition.
The proposed method in this paper can be tested using LAPAN-A3 data. The following simulation procedures can
be carried out: 1) adapting the model to the physical sensor construction of LAPAN-A3, 2)
calculating the earth coordinates of several LAPAN-A3 image pixels using the developed model for coastal,
lowland, and highland regions, and 3) comparing the location coordinates between the computational model results, the
field measurements, and the SPOT-4/5 standard image product.
ACKNOWLEDGMENTS
The authors would like to thank Mr. Suhermanto as the Mentor of WBS1 for the support given during this research.
REFERENCES
1. A.P. Cracknell, Int. J. Remote Sens. 39, 8387 (2018). doi: 10.1080/01431161.2018.1550919
2. T. Toutin, Int. J. Remote Sens. 25, 1893 (2004). doi: 10.1080/0143116031000101611
3. P.D. Noerdlinger, ISPRS J. Photogramm. Remote Sens. 54, 360 (1999). doi: 10.1016/S0924-2716(99)00030-1
4. A. Habib, S.W. Shin, K. Kim, C. Kim, K. Bang, E. Kim, and D. Lee, Photogramm. Eng. Remote Sens. 73,
1241 (2007). doi: 10.14358/PERS.73.11.1241
5. Y. Hu, V. Tao, and A. Croitoru, “Understanding the Rational Function Model: Methods and Applications,” in
Int. Arch. Photogramm. Remote Sens. (2004), pp. 663–668.
https://www.isprs.org/proceedings/XXXV/congress/comm4/comm4.aspx
6. S. Liu and X. Tong, “Rational function model based geo-positioning accuracy evaluation and improvement of
QuickBird Imagery of Shanghai, China,” in Geoinformatics 2008 Jt. Conf. GIS Built Environ. Monit. Assess.
Nat. Resour. Environ. (2008), p. 71452T. doi: 10.1117/12.813089
7. H. Pan, C. Tao, and Z. Zou, ISPRS J. Photogramm. Remote Sens. 119, 259 (2016). doi:
10.1016/j.isprsjprs.2016.06.005
8. T.-A. Teo and L.-C. Chen, “Geometrical Comparisons Between Rigorous Sensor Model and Rational Function
Model for Quickbird Images,” in Proc. KSRS Conf. (2003), pp. 750–752.
https://koreascience.kr/article/CFKO200322941410060.page
9. K.W. Ahn, B.U. Park, G.G. Lee, and D.C. Seo, KSCE J. Civ. Eng. 6, 321 (2002). doi: 10.1007/bf02829154
10. L.C. Chen, T.A. Teo, and C.L. Liu, Photogramm. Eng. Remote Sensing 72, 573 (2006). doi:
10.14358/PERS.72.5.573
11. Airbus, SPOT Imagery User Guide (2006). https://earth.esa.int/eogateway/documents/20142/37627/SPOT-6-7-
imagery-user-guide.pdf
12. A. Maryanto, N. Widijatmiko, W. Sunarmodo, M. Soleh, and R. Arief, Int. J. Remote Sens. Earth Sci. 13, 27
(2016). doi: 10.30536/j.ijreses.2016.v13.a2701