Verification of Stereo Camera Calibration Using Laser Stripers For A Lunar Micro Rover
ABSTRACT
Despite advancements in LiDAR technology, stereo cameras are still preferred for space applications [1], due to heritage and advantages in weight, power, and complexity of moving parts. The accuracy of stereo vision systems is affected by their calibration, and even a small change in position or orientation, for example from vibration or thermal expansion, can cause significant errors in feature matching or distance estimation. These errors can lead to the failure to identify obstacles and, potentially, the failure of the mission. Elaborate effort is taken to calibrate these systems under various settings before launch [3]. It is essential to ensure that the calibration parameters still hold after deployment on a planetary surface. However, this is not an easy task due to the procedures and fixtures needed in the process. In this work, we develop a method to verify the stereo calibration in situ using associated stereo pixels produced by a line striping laser. We also show in simulation the sensitivity of the pixel associations to the estimated measurements.

Figure 1: Artistic rendering of MoonRanger
1 INTRODUCTION
Bell et al. [3] detail the meticulous effort taken to calibrate the sensors both pre-flight and in-flight. This includes characterizing the sensors for different environmental changes, as the calibration done for one environment may not hold for another. Various factors such as vibration during launch, structural stress during landing, and thermal expansion during operation can contribute to this issue. Therefore it becomes essential to develop procedures that enable calibration in the field.
To calibrate the stereo extrinsic parameters, an object with precisely known dimensions can be used. This could be the lander itself or any other payload on the lander. This becomes challenging if the stereo cameras are not articulated, as is the case for micro-rovers, and do not face the lander, as is the case for navigation cameras. For this new breed of micro-rovers designed to operate for brief missions of a few Earth days, time is another concern, and a real-time calibration solution is preferred. We propose a calibration procedure which estimates the stereo extrinsic parameters in real time, in situ, and we propose using laser line stripers to solve this problem. A laser line striper is a laser diode with a Powell lens [4] which projects a light plane. The light plane intersects with the ground surface and produces lines of points which are observed by the cameras. Laser line stripers were used by the Sojourner rover [3] for obstacle avoidance and have space heritage.

We use a light plane to verify calibration, which is similar to the traditional checkerboard method. When a checkerboard is used for calibration, the planar nature of the checkerboard is exploited to solve for the camera intrinsic parameters [2]. The same idea can be extended to the light plane. The major difference is that in the checkerboard method the position of the corners is known precisely, whereas with the light plane the corners are not known, as no assumptions about the terrain can be made.

In this paper we look into the aspects involved in using the line striping laser to verify the calibration of stereo cameras, including a sensitivity analysis to quantify our results. We perform these experiments in simulation because:

• we can obtain absolute ground truth to quantify our results
• simulation is unaffected by the accuracy of the pixel extraction or association algorithms which operate in the image domain
• experiments are repeatable

We do recognize the importance and difficulty of evaluating real data. The efficiency of extraction and association ...
In the experiments section we will show the performance differences between these two methods.

3.1 Ray Tracing

Ray tracing is the process of projecting a ray from the camera center through a pixel out into the world. Let us consider the pixel coordinate $(u^i_{c_1}, v^i_{c_1})$. The superscript $i$ refers to the $i$-th pixel in the correspondence pairs. To create the ray, we need two points. The starting point is the camera center, which in the camera frame is all zeros,

$$S^{c_1} = \begin{bmatrix} s^{c_1}_x & s^{c_1}_y & s^{c_1}_z \end{bmatrix}^T = \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}^T$$

The ending point is obtained from the pixel coordinate, which is first normalized and then transformed to the world coordinate,

$$E^{c_1} = \begin{bmatrix} e^{c_1}_x & e^{c_1}_y & e^{c_1}_z \end{bmatrix}^T = \begin{bmatrix} \dfrac{u^i_{c_1} - c_x}{f_x} & \dfrac{v^i_{c_1} - c_y}{f_y} & 1 \end{bmatrix}^T$$

The transform which takes points from the camera frame to the world frame, in homogeneous form, is given by

$$H^{w}_{c_1} = \begin{bmatrix} R_{c_1} & T_{c_1} \\ 0 & 1 \end{bmatrix}$$

The start and end points of the ray in the world frame are then

$$S^w = H^{w}_{c_1} S^{c_1} = \begin{bmatrix} s^w_x & s^w_y & s^w_z \end{bmatrix}^T, \qquad E^w = H^{w}_{c_1} E^{c_1}$$

Using the start and end points of the ray we form a direction vector,

$$V^w = \begin{bmatrix} v^w_x & v^w_y & v^w_z \end{bmatrix}^T = E^w - S^w$$

The intersection of this ray with the light plane $ax + by + cz + d = 0$ is the point

$$I^{w}_{3 \times 1} = S^w + t V^w = \begin{bmatrix} s^w_x + t v^w_x & s^w_y + t v^w_y & s^w_z + t v^w_z \end{bmatrix}^T$$

where

$$t = -\frac{a s^w_x + b s^w_y + c s^w_z + d}{a v^w_x + b v^w_y + c v^w_z}$$
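To make these steps concrete, the following is a minimal Python/NumPy sketch of projecting a pixel onto a plane. The function name, argument layout, and the representation of the plane as $(a, b, c, d)$ coefficients in the world frame are illustrative choices for this sketch, not code from the flight software.

```python
import numpy as np

def intersect_pixel_with_plane(u, v, fx, fy, cx, cy, H_w_c, plane):
    """Project pixel (u, v) onto the plane a*x + b*y + c*z + d = 0.

    H_w_c : 4x4 homogeneous transform taking camera-frame points to the world frame.
    plane : (a, b, c, d) plane coefficients expressed in the world frame.
    """
    a, b, c, d = plane
    # Start point of the ray: the camera center (origin of the camera frame).
    S_c = np.array([0.0, 0.0, 0.0, 1.0])
    # End point: the pixel back-projected onto the normalized image plane (depth 1).
    E_c = np.array([(u - cx) / fx, (v - cy) / fy, 1.0, 1.0])
    # Express both points in the world frame.
    S_w = (H_w_c @ S_c)[:3]
    E_w = (H_w_c @ E_c)[:3]
    # Ray direction and the plane-intersection parameter t.
    V_w = E_w - S_w
    t = -(a * S_w[0] + b * S_w[1] + c * S_w[2] + d) / (a * V_w[0] + b * V_w[1] + c * V_w[2])
    return S_w + t * V_w  # 3D intersection point in the world frame
```

Each stereo correspondence traced this way yields one 3D point on the light plane per camera; comparing the two points is the basis of the depth-difference comparison used later.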
The second method is based on the Perspective-n-Point (PnP) formulation, which solves for the (6 degrees of freedom) pose of the camera with respect to the 3D points. We start with the camera projection equation (2) and use a property of the cross product to solve it. The cross product between two 3D vectors is given by

$$\vec{a} \times \vec{b} = [a]_{\times} \vec{b} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}$$

and the cross product of a vector with itself is zero,

$$\vec{a} \times \vec{a} = 0$$

Starting from the projection equation (2),

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \alpha P \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where $P = K[R|t]$ is the projection matrix of dimension $3 \times 4$. Note that we introduce $\alpha$ to account for the fact that the projection matrix is only defined up to scale. The subscript $c_2$ is dropped for brevity. $P \begin{bmatrix} X & Y & Z & 1 \end{bmatrix}^T$ is a $3 \times 1$ vector, which we call $M$,

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \alpha M$$

Taking the cross product of both sides with $\begin{bmatrix} u & v & 1 \end{bmatrix}^T$ and using the property above,

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \times M = \alpha M \times M = 0$$

Replacing $M$,

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \times M = \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \times P \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = 0$$

Expanding $P$,

$$\begin{bmatrix} 0 & -1 & v \\ 1 & 0 & -u \\ -v & u & 0 \end{bmatrix} \begin{bmatrix} P_{11} & P_{12} & P_{13} & P_{14} \\ P_{21} & P_{22} & P_{23} & P_{24} \\ P_{31} & P_{32} & P_{33} & P_{34} \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = 0$$

$$\begin{bmatrix} 0 & -1 & v \\ 1 & 0 & -u \\ -v & u & 0 \end{bmatrix} \begin{bmatrix} P_{11} X + P_{12} Y + P_{13} Z + P_{14} \\ P_{21} X + P_{22} Y + P_{23} Z + P_{24} \\ P_{31} X + P_{32} Y + P_{33} Z + P_{34} \end{bmatrix} = 0$$

Figure 3: Simulation with stereo cameras and a line striper with varying size blocks
Rearranging and pulling the $P_{ij}$ entries out into a vector, we get 12 unknowns. Each pixel point gives two equations, from its $u$ and $v$ values, so we need at least 6 points to solve for $P$. However, in practice we use many more points to account for inaccuracies in the pixel locations and solve using a least squares approach.
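As a sketch of how these constraints can be stacked and solved, the snippet below implements a generic direct linear transform for $P$ from point correspondences; it is not the authors' implementation, and it resolves the scale ambiguity by taking the SVD null-space vector, the usual way to solve the homogeneous system $Ap = 0$ in a least-squares sense.

```python
import numpy as np

def estimate_projection_matrix(world_pts, pixels):
    """Direct linear transform for the 3x4 projection matrix P.

    world_pts : (N, 3) array of 3D points, N >= 6.
    pixels    : (N, 2) array of corresponding (u, v) pixel coordinates.
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixels):
        Xh = np.array([X, Y, Z, 1.0])
        zero = np.zeros(4)
        # Two independent equations per correspondence from [u v 1]_x P [X Y Z 1]^T = 0.
        rows.append(np.hstack([zero, -Xh, v * Xh]))
        rows.append(np.hstack([Xh, zero, -u * Xh]))
    A = np.asarray(rows)                 # (2N, 12) design matrix
    # Least-squares solution of A p = 0 with ||p|| = 1: the right singular
    # vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)          # P, defined only up to scale
```

When outliers are expected, a robust variant such as OpenCV's solvePnPRansac [10] would typically be used instead of this plain least-squares solve.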
We set up an optimization that solves for the pose of camera $c_2$ by minimising the sum of distances between the corresponding world points $p^w$ and $q^w$:

$$\underset{R, t_{c_2}}{\arg\min} \; \sum \sqrt{(p^w_x - q^w_x)^2 + (p^w_y - q^w_y)^2 + (p^w_z - q^w_z)^2}$$
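Below is a sketch of how such an optimization could be set up, reusing the intersect_pixel_with_plane helper from the earlier sketch. The axis-angle pose parameterization [7] and the Levenberg-Marquardt solver [8] follow the cited references, but the residual definition, helper names, and parameterization here are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pose_to_H(pose):
    """6-vector (axis-angle rotation, translation) -> 4x4 camera-to-world transform."""
    H = np.eye(4)
    H[:3, :3] = Rotation.from_rotvec(pose[:3]).as_matrix()
    H[:3, 3] = pose[3:]
    return H

def refine_c2_pose(pixels_c1, pixels_c2, intrinsics_c1, intrinsics_c2,
                   H_w_c1, plane, pose0):
    """Estimate the pose of camera c2 by minimising the distance between the
    plane points traced from c1 and from c2 for each pixel correspondence.

    intrinsics_* : (fx, fy, cx, cy); pose0 : initial 6-vector guess for the c2 pose.
    """
    # Points traced from c1 are fixed: its pose is taken as the reference.
    p_w = np.array([intersect_pixel_with_plane(u, v, *intrinsics_c1, H_w_c1, plane)
                    for u, v in pixels_c1])

    def residuals(pose):
        H_w_c2 = pose_to_H(pose)
        q_w = np.array([intersect_pixel_with_plane(u, v, *intrinsics_c2, H_w_c2, plane)
                        for u, v in pixels_c2])
        return np.linalg.norm(p_w - q_w, axis=1)  # one distance per correspondence

    result = least_squares(residuals, pose0, method="lm")  # Levenberg-Marquardt
    return pose_to_H(result.x)
```

The recovered transform can then be compared against the extrinsics obtained from the pre-launch calibration to decide whether they still hold.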
Figure 5: For a vertical line striping laser, comparing the error of the estimated poses using the depth-difference and the PnP method. The PnP method performs with significantly lower error.

Figure 6: For a horizontal line striping laser, comparing the error of the estimated poses using the depth-difference and the PnP method. The depth-difference method performs better.
6 CONCLUSION

We proposed a method to verify the stereo extrinsics using point correspondences from a line striping laser. We compared the widely used PnP algorithm with our depth-difference algorithm. We also showed how pixel noise affects the estimated camera poses. Future work should focus on considering the error in the estimated angles along with the positions, as the angles will have a much greater impact. We modelled the pixel noise as Gaussian distributions in our simulation for simplicity; however, in the real world this is not true, and it should be modelled using more realistic error distributions. We also did not address the association problem, that is, how a pixel in one image is associated with the corresponding pixel in the other image, usually by feature matching techniques such as SIFT, SURF, or ORB; recently, deep learning techniques have also been used to get better results. Data association for laser striping will be investigated. Finally, we need to test and validate these findings on real data.

Acknowledgment

The technology, rover development, and flight are supported by NASA LSITP contract 80MSFC20C0008 MoonRanger.
References
[1] J. Maki et al. (May 2012) "The Mars Science Laboratory Engineering Cameras," 1-17.
[2] Z. Zhang (2000) "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22(11), pp. 1330-1334.
[3] J. F. Bell III et al. (2017) "The Mars Science Laboratory Curiosity rover Mastcam instruments."
[4] Laserlineoptics, "Powell lens." Accessed September 18, 2020. www.laserlineoptics.com/powell primer.html
[5] Roth, Scott D. (February 1982) "Ray Casting for Modeling Solids," Computer Graphics and Image Processing, 18(2): 109-144.
[6] Wikipedia, "Perspective-n-Point." Accessed September 18, 2020. en.wikipedia.org/wiki/Perspective-n-Point
[7] Wikipedia, "Axis-angle representation." Accessed September 18, 2020. en.wikipedia.org/wiki/Axis
[8] Ananth Ranganathan (2004) "The Levenberg-Marquardt Algorithm," tutorial. Accessed September 18, 2020. ananth.in/docs/lmtut.pdf
[9] Sorensen, D. C. (1982) "Newton's Method with a Model Trust Region Modification," SIAM J. Numer. Anal., 19(2): 409-426. doi:10.1137/0719026
[10] OpenCV, "PnP RANSAC." Accessed September 18, 2020. docs.opencv.org/3.4/d9/d0c/group calib3d.html