
RESEARCH ARTICLE | DECEMBER 11 2023

AIP Conf. Proc. 2941, 030034 (2023)
https://doi.org/10.1063/5.0181438


A Proposed Method of Systematic Geometric Correction for
LAPAN-A4 Satellite Data
Musyarofah1, a), Dinari Nikken Sulastrie Sirin1, b), Hanna Afida3, c), Rise Hapshary Surayuda1, d),
Patria Rachman Hakim2, e), Satriya Utama2, f), Budhi Gustiandi1, g), Mohammad Mukhayadi2, h),
A. Hadi Syafrudin2, i), and Ahmad Maryanto1, j)

1Research Center for Remote Sensing, BRIN, Bogor, Indonesia.
2Research Center for Satellite Technology, BRIN, Bogor, Indonesia.
3Center for Data and Information, BRIN, Jakarta, Indonesia.

a)Corresponding author: musyarofah@brin.go.id
b)dinari.nikken.sulastrie.sirin@brin.go.id
c)hanna.afida@brin.go.id
d)rise.hapshary.surayuda@brin.go.id
e)patria.rachman.hakim@brin.go.id
f)satriya.utama@brin.go.id
g)budhi.gustiandi@brin.go.id
h)mohammad.mukhayadi@brin.go.id
i)a.hadi.syafrudin@brin.go.id
j)ahmad.maryanto@brin.go.id

Abstract. Indonesia's fourth microsatellite, LAPAN-A4, is planned to be launched at the end of 2022. The satellite will
carry nine payloads, two of which are line imager sensors named SLIM4 and ELLISA. The payloads have four
multispectral bands, which are R, G, B, and NIR. In order to get an accurate image, a correct and precise image
processing technique is required. One of these image processing steps is geometric correction. Geometric correction, or
georeferencing, is the process of associating the image geometry with actual magnitudes and units on earth, which is done
by equating the position of the target location in the image with the precise location on earth. There are two types of
georeferencing: the rigorous sensor model (physical model) and the generalized sensor model (empirical model). The
rigorous sensor model (RSM) is believed to be the most reliable method that can be applied to the overall image
generated by the imaging system. It is the best choice for areas where in situ ground control point (GCP) measurements
are not possible. It has been applied in SPOT-4,5,6,7 and Landsat-8 satellite data. This paper proposes a method of
systematic geometric correction for LAPAN-A4 satellite data generated by SLIM4 and ELLISA sensors.

INTRODUCTION
The development of remote sensing technology currently allows the relatively high availability of image data with
various formats and different processing levels [1]. The various types of sensors developed for satellite-based
remote sensing and the increased availability of the data have led to the growing use of satellite imagery data for
various applications, such as agriculture, forestry, environmental monitoring, urban planning, military,
transportation, etc. However, the utilization of this satellite image data depends on several factors, such as sensor
characteristics (geometric and radiometric qualities), the quality of the processing software used, the types of image
products available, and the quality of the final images. One of the primary obstacles that greatly affects the
utilization of satellite imagery is the sensor model, whether the sensor model used can provide a high level of
geometric correction through the orientation of the resulting image or not.

The 9th International Seminar on Aerospace Science and Technology – ISAST 2022
AIP Conf. Proc. 2941, 030034-1–030034-8; https://doi.org/10.1063/5.0181438
Published by AIP Publishing. 978-0-7354-4755-4/$30.00

The sources of distortion that affect the quality of the image acquired by remote sensing satellites can generally
be categorized into two types: distortion from the acquisition system and distortion from atmospheric refraction
[2]. The distortion of the acquisition system includes the orientation and the movement of the platform, as well as
the optical-geometric characteristics of the imaging sensor, whereas the distortion of atmospheric refraction is
related to the large number of particles in the atmosphere that refract the light. Distortions caused by the atmospheric
refraction are usually neglected for operation around nadir pointing because their effect is relatively small on the
geometric quality of the resulting image. Noerdlinger (1999) shows that atmospheric refraction causes ground
displacements of 0.55, 1.22, and 2.22 m for respective 10°, 20°, and 30° off-nadir angles using a very approximate
atmosphere model [3]. Meanwhile, the distortion caused by the acquisition system is quite influential on the
geometric quality of the image. For this reason, an appropriate orientation-based sensor model, also known as the
geometric model, is required for the systematic geometric correction of satellite images. This modeling is a
fundamental step in photogrammetric processing and very important for analysis and multi-source data fusion.
Geometric modeling describes the relationship between image and ground coordinates for a given sensor.
Various methods have been developed to formulate the mathematical relations between image and ground
coordinates [4]. In principle, there are two different types of orientation-based sensor models, namely the physical
sensor model (also called the rigorous sensor model) and the generalized sensor model (also called the empirical
model or the rational function model) [2]. In the physical model, the image is correlated with the ground coordinates
by using the collinearity equations, and each involved parameter has a physical meaning. This method requires
knowledge of the specific satellite, orbit, and sensor characteristics. On the contrary, the empirical model or rational
function model (RFM) links the image and the terrain coordinates by using the rational polynomial coefficients
(RPC). This model is usually based on the rational polynomial functions (RPF) and does not need knowledge about
sensor and acquisition features [2].
The empirical method formulates mathematical relations through statistical matching or fitting of several
coordinates of image points to known coordinates of their paired points on the earth's surface. The mathematical
relations built are generally in the form of polynomials, which are then used to calculate all other image points in the
image scene being reviewed. The empirical method does not require any information about the mechanism of image
formation, such as imaging time and how the sensor performs the imaging process. For users who do not have any
initial data on the construction of the imaging system, the external orientation, and the position of the sensor during
image acquisition, this method can be the most appropriate solution. Thus, the precision level of the mathematical
relations produced will depend on how precise and representative the field measurement data (the reference data)
are. This method has been applied to some satellite data, i.e., QuickBird PAN, IKONOS, FORMOSAT-2, and
ZiYuan-3 [5, 6, 7, 8, 9, 10].
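The shape of the rational-function computation can be sketched minimally. Operational RPC sets use 20-term cubic polynomials for the numerator and denominator of each image coordinate; the sketch below keeps only first-order terms and uses hypothetical coefficients, purely to illustrate the ratio-of-polynomials form.

```python
import numpy as np

def rfm_sample(lat, lon, h, num_coef, den_coef):
    # Evaluate one image coordinate as a ratio of polynomials in the
    # normalized ground coordinates. Operational RPCs use 20-term cubic
    # polynomials; only first-order terms are shown here for brevity.
    terms = np.array([1.0, lon, lat, h])
    return (terms @ num_coef) / (terms @ den_coef)

# Hypothetical coefficients for illustration: sample ~ lon.
num = np.array([0.0, 1.0, 0.0, 0.0])   # numerator picks out lon
den = np.array([1.0, 0.0, 0.0, 0.0])   # denominator is constant 1
print(rfm_sample(0.2, 0.5, 0.1, num, den))  # -> 0.5
```

In practice the coefficients are estimated by fitting image points to known ground coordinates, which is why the precision of the model depends on the quality of the reference data.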
However, in the physical model, the mathematical relation that links coordinate points of the image and the
ground control points on the earth is derived directly from the vector formation built by considering the viewing
geometry of the acquisition system (sensor), the earth model used, and the sensor position vector at the time of
imaging. Therefore, for some experts, the physical model has also been termed a direct georeferencing method. The
physical model or the rigorous sensor model (RSM) requires detailed data on the sensor system as a physical entity
that explains the origin of the viewing vector or the direction of the view of an image pixel in the sensor reference
frame, and the subsequent physical interface that connects the sensor to the platform (satellite) to the earth
coordinate system that becomes the final target of the vector transformation of the image’s pixel view.
The RSM requires no ground control points (GCP) except for inspection or evaluation of the final results.
However, it requires detailed data that describes the sensor’s position and attitude at the imaging time. The
mathematical relation built is a mapping relation (one-to-one relation) which is defined with the highest degree of
certainty. The RSM is thus the most reliable method that can be applied to the overall image generated by the
imaging system and is the best choice for areas where in situ ground control point measurements are not
possible. In addition, the RSM has higher accuracy than the RFM for digital elevation model (DEM) height accuracy
[9].
At the implementation level, the RSM has been applied as a standard method for geometric processing of various
pushbroom-type remote sensing sensors including SPOT-4,5,6,7 and Landsat-8. In the SPOT-6/7 upright image raw
product, which is fully processed using the rigorous sensor method without a ground control point (GCP), the
performance of the model used is reported to be able to provide location accuracy of up to 20 meters, exceeding the
design performance requirement of 50 meters [11].
In Indonesia, the RSM has been operationally applied to the geometric data processing system for international
remote sensing satellite data retrieved by the former National Institute of Aeronautics and Space of Indonesia
(LAPAN), particularly for SPOT-2,3,4,5,6,7 satellite data. However, the studies and the development of the method

have not been carried out further for the in-house LAPAN satellites. Previous research conducted by the former
Remote Sensing Technology and Data Center, LAPAN (now Research Center for Remote Sensing, BRIN) tried to
implement the RSM for its in-house aerial camera (Inderaku-2A) [12, 13]. Moreover, the former Satellite Technology
Center, LAPAN (now Research Center for Satellite Technology, BRIN) has conducted several studies on systematic
image preprocessing for LAPAN-A3 satellite data, whose pushbroom imager has 15-meter resolution
and a 120-kilometer swath width. These studies have yielded some georeferencing methods resulting in a mean
accuracy of 3.80 km in the across-track and 10.69 km in the along-track direction using GPS data and satellite
attitude data as well as the mean accuracy of 781 m in the across-track direction of red-band image only using image
matching method [14, 15, 16]. The results of the georeferencing methods developed for LAPAN-A3 satellite data
are not as good as the accuracy of SPOT satellite image data using the RSM method. Hence, for the LAPAN-A4
satellite, which is planned to carry several pushbroom sensors, a geometric processing model that can produce
better results than that of LAPAN-A3 is needed to accommodate the characteristics of these sensors.
This paper focuses only on the main pushbroom sensor, ELLISA, carried by the LAPAN-A4 satellite and on the
proposed systematic geometric correction based on the RSM.

LAPAN-A4 SATELLITE
The LAPAN-A4 satellite is the fourth microsatellite developed by the former Satellite Technology Center, LAPAN (now
Research Center for Satellite Technology, BRIN), which is planned to be launched at the end of 2022. It is designed
to be in the sun synchronous polar orbit at an altitude of 500 km. The satellite has missions for earth observation,
ship traffic monitoring, space magnetic field measurements, and thermal infrared acquisition [17]. Payloads carried
by this satellite consist of two types, non-optical payloads and optical payloads.
There are four non-optical payloads consisting of Automatic Identification System (AIS), Hybrid Fluxgate
Magnetometer (HFGM), Internet of Things (IoT), and stepper motor for payload. As for the optical payloads, there
are five instruments, namely Medium Resolution Imager – STTL Line Imager 4-band (SLIM4), Experimental
LAPAN Line Imager Space Application (ELLISA), Short Wave Infrared (SWIR), Long Wave Infrared A4
(LWIR-A4), and Long Wave Infrared BPPT/Hokkaido A4 (LWIR BPPT/Hokkaido-A4) [18]. The optical payloads carried
by this satellite are listed in Table 1.

TABLE 1. Specification of LAPAN-A4 optical payloads

Payload Sensors   Spatial Resolution (m)   Swath Width (km)   Revisit Time (day)
ELLISA            5-10                     33                 79
SLIM4             16                       230                11
SWIR              20                       34                 77
LWIR-A4           56                       34                 77
LWIR BPPT         146                      47                 56

ELLISA is an imager developed by the former LAPAN (now BRIN). It comes in a medium-resolution variant
(ELLISA-MR) and a high-resolution variant (ELLISA-HR). The ELLISA-HR has three bands, which are Red (R),
Near Infrared (NIR), and Panchromatic (PAN), while the ELLISA-MR, similar to SLIM4, has four multispectral
bands, namely Red (R), Green (G), Blue (B), and Near Infrared (NIR).

THE RIGOROUS SENSOR MODEL OF PUSHBROOM CAMERA

Based on the standard photogrammetric approach, the rigorous sensor model (RSM) uses collinearity equations to
describe the physical-geometrical image acquisition. This model tries to describe the physical properties of the
image acquisition through the relationship between image and ground coordinate system. Since the sensor is a
pushbroom type, the image is formed by many individual lines. Each line is acquired with its own position and
attitude values and is the result of a perspective projection. Therefore, the mathematical model is based on
collinearity equations. All positions during image acquisition are related by the orbital dynamics. Thus, the RSM is
based on the reconstruction of the orbital segment during image acquisition through the knowledge of the internal
orientation (the acquisition mode and the sensor parameters) and external orientation (the satellite position and
attitude parameters). All the parameter values can be approximated by computation using information contained in
the image metadata file and then corrected by Least Square estimation processes. In order to link the image to the
ground coordinates in an Earth-Centered Earth-Fixed (ECEF) reference frame, a set of translation and a set of
rotation matrices as well as sensor attitudes are required. To determine the object location on the ground, several
transformations are conducted, from image coordinate system to detector coordinate system, from detector to
camera coordinate systems, from camera to the Earth coordinate systems, and from the Earth to the object
coordinate systems.

Direct Georeferencing of Pushbroom Image Based on the Rigorous Sensor Model

In principle, direct georeferencing is conducted to estimate the ground coordinates of the homologous points
measured in the image through the forward intersection using the internal orientation (the results of
calibration/laboratory measurements) and the external orientation (position and attitude data provided by the
geoposition/geolocation system carried by the platform). The homologous points are the points that are
simultaneously formed in an image acquisition process (snapshot). For the pushbroom sensor, based on the detection
array arrangement, the homologous points are a row of pixels that form an image line. The direct georeferencing
approach does not require any ground control points (GCP), except for final validations, and does not need to
estimate any additional parameters that model the interior and exterior orientations. For this reason, the effectiveness
and reliability of direct georeferencing methods based on the RSM are dependent on the accuracy of the internal and
external orientations of data provided. This method can be formulated in a vector relation that represents the
geometric formation of the center of the earth and the viewing center between the sensor and the object point (see
Fig. 1).
The viewing point is the intersection of the extension of the image pixel's viewing vector with the earth
model's surface.



FIGURE 1. Illustration of the viewing point from the satellite to the earth.

\vec{OM} = \vec{OP} + \vec{PM} \quad \text{or} \quad \mathbf{s} - \mathbf{g} = \mu \mathbf{i} \qquad (1)

Eq. (1) consists of g, which is the position vector of the ground coordinate; s, which is the position vector of the sensor
when acquiring the image; and i, which is the unit vector of the pixel viewing to earth. For computational reasons, Eq. (1)
is written in the form of a vector element matrix:

\begin{bmatrix} X_P - X \\ Y_P - Y \\ Z_P - Z \end{bmatrix} = \mu \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} \qquad (2)

with \mu as a scaling factor that scales the pixel viewing vector up to the earth model.
As a point on the earth's ellipsoid, the elements of the position vector of point M (X, Y, Z) also satisfy the
equation of the earth's ellipsoid:

\frac{X^2 + Y^2}{(a+h)^2} + \frac{Z^2}{(b+h)^2} = 1 \qquad (3)

with a as the earth's semi-major axis, b as the earth's semi-minor axis, and h as the height of point M above the
earth model surface.
The relation mentioned above is a simplification by assuming P as the satellite center point (the center of mass)
which coincides with the center of view of the sensor system. In practice, P cannot always coincide perfectly with
the center of view of the sensor, and the satellite position sensor cannot always coincide with the center of mass of
the satellite either. Therefore, the direct georeferencing model must carefully identify the location of these points
and then determine the vector precisely. A translation matrix connects the coordinate system of each of these points
to a coordinate system centered on the sensor's center of view (the origin of the view vector i), as well as the
construction in the sensor system that builds the internal view vector of the image pixels.
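The forward intersection described by Eqs. (1)–(3) can be sketched numerically. The sketch below is illustrative, not the operational LAPAN-A4 processor: it assumes the simplified case where P coincides with the sensor's center of view, uses h = 0, and takes WGS84 semi-axes for the earth model.

```python
import numpy as np

# WGS84 semi-axes (m); h = 0 assumed (target on the ellipsoid surface).
A_AXIS, B_AXIS, H = 6378137.0, 6356752.3142, 0.0

def forward_intersection(s, i):
    # Solve Eqs. (1)-(3): find mu such that g = s - mu*i satisfies
    # (X^2 + Y^2)/(a+h)^2 + Z^2/(b+h)^2 = 1, then return g.
    d = np.array([A_AXIS + H, A_AXIS + H, B_AXIS + H])
    ss, ii = s / d, i / d          # scale so the ellipsoid becomes a unit sphere
    a_q = ii @ ii
    b_q = -2.0 * (ss @ ii)
    c_q = ss @ ss - 1.0
    disc = b_q * b_q - 4.0 * a_q * c_q
    if disc < 0.0:
        raise ValueError("viewing ray misses the ellipsoid")
    mu = (-b_q - np.sqrt(disc)) / (2.0 * a_q)   # near-side intersection
    return s - mu * i

# Example: satellite 500 km above the equator looking straight down.
sat = np.array([A_AXIS + 500e3, 0.0, 0.0])
view = np.array([1.0, 0.0, 0.0])               # unit viewing vector i
print(forward_intersection(sat, view))         # -> approx [6378137, 0, 0]
```

The quadratic has two roots (the near and far sides of the ellipsoid); the smaller positive root is the surface actually seen by the sensor.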

Image Orientation
The systematic geometric correction based on the RSM uses image orientation to derive the collinearity equations.
Image orientation is a viewing direction or pixel viewing direction that consists of the internal orientation, the
external orientation, and the final orientation.

FIGURE 2. Internal orientation of the optical system.

The internal orientation or the internal viewing direction is a viewing direction from the image which is built by
the image formation mechanism of the optical system inside the camera.

\vec{g}_k = \begin{bmatrix} 0 \\ \left(v - \frac{N_p}{2}\right) y_p \\ f \end{bmatrix} \qquad (4)

\vec{g}_{bk} = -\vec{g}_k = \begin{bmatrix} 0 \\ \left(\frac{N_p}{2} - v\right) y_p \\ -f \end{bmatrix} \qquad (5)
In the equations, g_bk is the internal viewing direction of an image pixel to the object on the ground according to the
camera coordinate system, f is the focal length of the camera lens, N_p is the number of detector cells in an array, y_p is
the detector pitch, and v is the column number of the image line.
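Equation (5) can be evaluated directly. In the sketch below, the detector values (array size, pitch, focal length) are hypothetical placeholders, not ELLISA's actual specifications.

```python
import numpy as np

def internal_view(v, Np, yp, f):
    # Eq. (5): internal viewing direction g_bk of image column v in the
    # camera frame, for a line of Np detector cells with pitch yp (same
    # length units as the focal length f).
    return np.array([0.0, (Np / 2.0 - v) * yp, -f])

# Hypothetical detector values (NOT ELLISA's actual specifications):
# 2000-cell array, 7-micron pitch, 0.4 m focal length.
g_bk = internal_view(v=0, Np=2000, yp=7e-6, f=0.4)
print(g_bk / np.linalg.norm(g_bk))  # unit viewing vector of the first column
```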
The external orientation is all actions that cause the internal orientation to change from the original direction,
such as the tilt of the camera when it was installed on the satellite body and the attitudes of the satellite body (pitch,
yaw, roll) during image acquisition [19].

\mathbf{T}_{object\text{-}satellite} = \begin{bmatrix}
\cos\Psi \cos K & \sin\Omega \sin\Psi \cos K + \cos\Omega \sin K & -\cos\Omega \sin\Psi \cos K + \sin\Omega \sin K \\
-\cos\Psi \sin K & -\sin\Omega \sin\Psi \sin K + \cos\Omega \cos K & \cos\Omega \sin\Psi \sin K + \sin\Omega \cos K \\
\sin\Psi & -\sin\Omega \cos\Psi & \cos\Omega \cos\Psi
\end{bmatrix} \qquad (6)

with , K, and Ω are the values of pitch, yaw, and roll of the satellite. These values can be obtained from the
satellite ephemeris data. It is a transformation from the satellite orientation to the local orbital orientation.
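Eq. (6) can be sketched as a small function, assuming the attitude angles are given in radians; a quick orthogonality check confirms the result is a proper rotation.

```python
import numpy as np

def object_to_satellite(psi, kappa, omega):
    # Eq. (6): rotation between the local orbital (object) frame and the
    # satellite frame; psi = pitch, kappa = yaw, omega = roll, in radians.
    cP, sP = np.cos(psi), np.sin(psi)
    cK, sK = np.cos(kappa), np.sin(kappa)
    cO, sO = np.cos(omega), np.sin(omega)
    return np.array([
        [ cP * cK,  sO * sP * cK + cO * sK, -cO * sP * cK + sO * sK],
        [-cP * sK, -sO * sP * sK + cO * cK,  cO * sP * sK + sO * cK],
        [ sP,      -sO * cP,                 cO * cP],
    ])

T = object_to_satellite(np.radians(1.0), np.radians(0.5), np.radians(-0.2))
print(np.allclose(T @ T.T, np.eye(3)))  # a proper rotation: T^T = T^-1 -> True
```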



FIGURE 3. Satellite coordinate system.

The final orientation is the viewing direction of image pixel in the earth reference system (the earth coordinate
system), a transformation from the orbital frame to Earth-Centered Earth-Fixed (ECEF).

\mathbf{T}_{Earth\text{-}object} = \begin{bmatrix} \hat{X}_{object} & \hat{Y}_{object} & \hat{Z}_{object} \end{bmatrix} \qquad (7)

\hat{X}_{object} = \frac{\vec{Y}_{object} \times \vec{Z}_{object}}{\lVert \vec{Y}_{object} \times \vec{Z}_{object} \rVert} \qquad (8)

\hat{Y}_{object} = \frac{\vec{Z}_{object} \times \vec{V}(t)}{\lVert \vec{Z}_{object} \times \vec{V}(t) \rVert} \qquad (9)

\hat{Z}_{object} = \frac{\vec{S}(t)}{\lVert \vec{S}(t) \rVert} \qquad (10)
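Eqs. (7)–(10) can be computed from a position/velocity state vector. The circular-orbit state below is illustrative only, not actual LAPAN-A4 ephemeris.

```python
import numpy as np

def orbital_frame(S, V):
    # Eqs. (8)-(10): unit axes of the local orbital frame from the satellite
    # position S(t) and velocity V(t); returned as the columns of Eq. (7).
    Z = S / np.linalg.norm(S)                     # radial axis, Eq. (10)
    Y = np.cross(Z, V)
    Y /= np.linalg.norm(Y)                        # cross-track axis, Eq. (9)
    X = np.cross(Y, Z)
    X /= np.linalg.norm(X)                        # roughly along-track, Eq. (8)
    return np.column_stack([X, Y, Z])             # T_Earth-object, Eq. (7)

# Illustrative state vector: 500 km altitude over the equator, near-polar motion.
S = np.array([6878137.0, 0.0, 0.0])
V = np.array([0.0, 0.0, 7612.0])
T = orbital_frame(S, V)
print(np.allclose(T.T @ T, np.eye(3)))  # orthonormal frame -> True
```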

DEVELOPMENT OF DIRECT GEOREFERENCING MODEL OF LAPAN-A4

To develop the direct georeferencing model based on the rigorous sensor model (RSM), the collinearity equations
are required. The formulation of a mathematical relation that expresses the collinearity of the pixel position vector
element in the image coordinate system relating to the object position vector element on the earth in the earth
coordinate system is carried out through the following steps. The first step is the formulation of the viewing vector
(the viewing direction) of pixels according to the camera coordinate system (the formulation of the internal
orientation). The next step is performing sequential transformations of the internal orientation into the viewing
vector according to a reference frame centered at the center of the earth by involving camera attitude data and all the
related coordinate systems. The last step is to intersect (forward intersection) the image pixel viewing vector with
the earth’s ellipsoid model to determine the position vector that represents the location of the image pixels in the
earth coordinate system. All the steps should be arranged, and the necessary data should be provided in order to
implement this model in an algorithm flow.
The steps to formulate the collinearity equations are as follows. First step is to define all related coordinate
systems, which are the image system, the sensor/detector system, the body/camera system, the flight/satellite system,
the orbital system, the Earth Centered Inertial (ECI) system, the Earth-Centered Earth-Fixed (ECEF) system, and the
geodetic local system. The next step is to define orbital parameters of the satellite orbit, the Keplerian elements that
consist of semi-major axis, orbit inclination, right ascension of the ascending node, eccentricity, true anomaly,
argument of the perigee, and time of the perigee passage. The approximate values of the Keplerian elements can be
computed based on the ephemeris information in the metadata file. Afterwards, the attitude angles of the sensor
during acquisition are defined. The information required are the pitch, yaw, and roll angles that can be approximated
by calculating the data in the metadata file. The approximate values can be corrected by modeling the data with a
second-order polynomial or a polynomial of another order, which can be determined by performing experiments.
Finally, the coordinate system transformations are defined.
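The polynomial correction of the attitude angles described above can be sketched as a least-squares fit. The telemetry values below are synthetic, for illustration only.

```python
import numpy as np

# Synthetic pitch telemetry (radians) sampled during an acquisition;
# the coefficients and noise level are illustrative only.
t = np.linspace(0.0, 4.0, 9)                 # seconds since scene start
pitch_true = 1e-3 + 2e-4 * t - 5e-5 * t**2   # underlying attitude profile
noise = np.random.default_rng(0).normal(0.0, 1e-6, t.size)
pitch_meas = pitch_true + noise              # "measured" samples

# Fit a second-order polynomial to smooth the measurements; the order
# can be tuned experimentally, as proposed in the text.
coef = np.polyfit(t, pitch_meas, deg=2)
pitch_model = np.polyval(coef, t)

print(np.max(np.abs(pitch_model - pitch_true)))  # residual vs. the true curve
```

The same fit would be applied per attitude channel (pitch, yaw, roll), and the polynomial order chosen by comparing residuals across experiments.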
To enforce a direct georeferencing model based on the RSM for LAPAN-A4 satellite, several data or
information are required. They are the technical data for the internal orientation determination (f, px, Np), the
technical data to determine the external orientation by design (definition of the camera satellite coordinate systems,
the alignment of the camera satellite coordinate systems, and the origin of the camera satellite coordinate systems),
the technical data of the calculation model of the external orientation by nature (the sensor attitudes), the technical
data of the measurement models for determining the point of view of the viewing direction and the local orbiting
coordinate system (the position and velocity vector of the satellite), the technical data of synchronization time
between data (images and metadata), and the scanning repetition data.

CONCLUSION

The rigorous sensor model (RSM) is the most widely used direct georeferencing method because it does not require
any ground control points except for evaluating the final results. The RSM produces a one-to-one mapping
relationship that is defined with certainty owing to the position and attitude information of the sensor at the time of
imaging. This method has been applied for standard geometric processing of various pushbroom-type remote
sensing sensors, such as SPOT-4,5,6,7 and Landsat-8. In the SPOT-6/7, the performance of the model used can
provide location accuracy of up to 20 meters, exceeding the design performance of 50 meters.
Applying this method to the LAPAN-A4 satellite requires some data, i.e., the technical data for
internal orientation, the technical data to determine the external orientation by design, the sensor attitudes (pitch, yaw,
and roll), the satellite's ephemeris data, images and metadata, and also the scanning repetition data.
The proposed method in this paper can be tested using LAPAN-A3 data. There are simulation procedures that
can be carried out: 1) performing model adaptation to the physical sensor construction of the LAPAN-A3, 2)
calculating the earth coordinates of several LAPAN-A3 image pixels using the developed model for coastal,
lowland, and highland regions, and 3) comparing the location coordinates between computational model results, the
field measurements, and the SPOT-4/5 standard image product.

ACKNOWLEDGMENTS

The authors would like to thank Mr. Suhermanto as the Mentor of WBS1 for the support given during this research.

REFERENCES

1. A.P. Cracknell, Int. J. Remote Sens. 39, 8387 (2018). doi: 10.1080/01431161.2018.1550919
2. T. Toutin, Int. J. Remote Sens. 25, 1893 (2004). doi: 10.1080/0143116031000101611
3. P.D. Noerdlinger, ISPRS J. Photogramm. Remote Sens. 54, 360 (1999). doi: 10.1016/S0924-2716(99)00030-1
4. A. Habib, S.W. Shin, K. Kim, C. Kim, K. Bang, E. Kim, and D. Lee, Photogramm. Eng. Remote Sens. 73,
1241 (2007). doi: 10.14358/PERS.73.11.1241
5. Y. Hu, V. Tao, and A. Croitoru, “Understanding the Rational Function Model: Methods and Applications,” in
Int. Arch. Photogramm. Remote Sens. (2004), pp. 663–668.
https://www.isprs.org/proceedings/XXXV/congress/comm4/comm4.aspx
6. S. Liu and X. Tong, “Rational function model based geo-positioning accuracy evaluation and improvement of
QuickBird Imagery of Shanghai, China,” in Geoinformatics 2008 Jt. Conf. GIS Built Environ. Monit. Assess.
Nat. Resour. Environ. (2008), p. 71452T. doi: 10.1117/12.813089
7. H. Pan, C. Tao, and Z. Zou, ISPRS J. Photogramm. Remote Sens. 119, 259 (2016). doi:
10.1016/j.isprsjprs.2016.06.005
8. T.-A. Teo and L.-C. Chen, “Geometrical Comparisons Between Rigorous Sensor Model and Rational Function
Model for Quickbird Images,” in Proc. KSRS Conf. (2003), pp. 750–752.
https://koreascience.kr/article/CFKO200322941410060.page
9. K.W. Ahn, B.U. Park, G.G. Lee, and D.C. Seo, KSCE J. Civ. Eng. 6, 321 (2002). doi: 10.1007/bf02829154
10. L.C. Chen, T.A. Teo, and C.L. Liu, Photogramm. Eng. Remote Sensing 72, 573 (2006). doi:
10.14358/PERS.72.5.573
11. Airbus, SPOT Imagery User Guide (2006). https://earth.esa.int/eogateway/documents/20142/37627/SPOT-6-7-imagery-user-guide.pdf
12. A. Maryanto, N. Widijatmiko, W. Sunarmodo, M. Soleh, and R. Arief, Int. J. Remote Sens. Earth Sci. 13, 27
(2016). doi: 10.30536/j.ijreses.2016.v13.a2701



13. D. Monica, A. Maryanto, W. Sunarmodo, and N. Widijatmiko, “Geometric correction model for dual sensor
pushbroom aerial camera,” in IOP Conf. Ser. Earth Environ. Sci. (2020), p. 012058. doi: 10.1088/1755-1315/500/1/012058
14. P.R. Hakim, A.H. Syafrudin, S. Salaswati, S. Utama, and W. Hasbi, IJASCSE Int. J. Adv. Stud. Comput. Sci.
Eng. 7, (2018). doi: 10.48550/arXiv.1901.09189
15. P.R. Hakim, A.H. Syafrudin, S. Utama, and W. Hasbi, “Satellite Attitude Determination Based on Pushbroom
Image Band Coregistration,” in ICARES 2018 - Proc. 2018 IEEE Int. Conf. Aerosp. Electron. Remote Sens.
Technol. (IEEE, 2018), pp. 96–102. doi: 10.1109/ICARES.2018.8547145
16. P.R. Hakim, A.P.S. Jayani, A. Sarah, and W. Hasbi, “Autonomous Image Georeferencing Based on Database
Image Matching,” in ICARES 2018 - Proc. 2018 IEEE Int. Conf. Aerosp. Electron. Remote Sens. Technol.
(IEEE, 2018), pp. 90–95. doi: 10.1109/ICARES.2018.8547086
17. M.A. Saifudin, A. Karim, and Mujtahid, “LAPAN-A4 Concept and Design for Earth Observation and
Maritime Monitoring Missions,” in ICARES 2018 - Proc. 2018 IEEE Int. Conf. Aerosp. Electron. Remote Sens.
Technol. (IEEE, 2018), pp. 44–48. doi: 10.1109/ICARES.2018.8547143
18. E.A. Anggari, A. Herawan, S. Salaswati, P.R. Hakim, and A.H. Syafrudin, “Preliminary Study of the Potential
Utilization of Imagery from LAPAN-A4 Satellites,” in AIP Conf. Proc. (AIP Publishing, 2021), p. 060005.
doi: 10.1063/5.0060383
19. Musyarofah, D.N.S. Sirin, and A. Maryanto, “Pengembangan Relasi Matematik Sensor Pushbroom untuk
Koreksi Geometrik Sistematik Data Citra Satelit LAPAN A3,” in Pros. Siptekgan (2011), pp. 577–588.
https://karya.brin.go.id/id/eprint/11903/1/Prosiding_Musyarofah_Pustekdata_2011.pdf
