Weighted Total Least Squares for the Visual Localization of a Planetary Rover
Ma You Qing, Liu Shao Chuang, Wen Bo, Zhang Shuo, Li Ming Lei, and Peng Song
Abstract
In this paper, a new visual localization method for a planetary rover's stereo vision system is presented, based on the weighted total least squares (WTLS) algorithm and Procrustes analysis. The main advantage of the new method is that the solution requires neither initial values of the planetary rover's pose parameters nor any linearization procedure. Moreover, the new method accounts for the fact that both the coefficient matrix and the observation vector of the relative orientation model carry errors and should be treated as such during optimization. Results from China's first lunar rover and a simulated stereo camera system demonstrate that the new method enables the planetary rover to run in real time and, compared with some classical methods, achieves equal localization accuracy (a mean relative localization accuracy of 4.59%), higher efficiency (a computation time of 1 ms), and direct convergence.
Introduction
To achieve autonomous navigation on a planet or another celestial body, a planetary rover must know its location at all times and be able to plan a reasonable route based on the current environment. This autonomous navigation capability involves three parts: visual simultaneous localization and mapping (SLAM), path planning, and rover control. Visual SLAM, wherein a camera is localized within an unknown scene, has been well studied in the photogrammetry, computer vision, and robotics communities. It has major applications in robotic navigation (Mouragnon et al., 2006; Achtelik et al., 2011; Kim and Eustice, 2015) and place recognition (Milford, 2013). SLAM can solve the problem of estimating camera poses and feature positions from a sequence of scene images. When visual SLAM is applied to a planetary rover's navigation and obstacle avoidance systems, the visual localization problem (Wang et al., 2014; Wan et al., 2014; Liu et al., 2015) is the key issue of SLAM and is important in tracking moving objects, robot navigation, and map building. However, there are still unresolved issues in a planetary rover's visual localization process.
• Accuracy: Visual localization demands accurate relative orientation parameters, which bring the right camera frame into the left camera frame. In general, these parameters can be calibrated using control points on the ground (Fraser, 1997) or a calibration grid tablet (Bouguet, 2000). During the probe's flight and the rover's long-term movement, however, the relative orientation parameters change because of mechanical oscillation and environmental impact. The relative orientation method of photogrammetry (Stewenius et al., 2006) or the epipolar rectification method of computer vision (Lin et al., 2010) can be invoked to compute the relative orientation parameters from stereo images when the rover lands or is in operation. However, those methods, which are based on ordinary least squares (OLS), share a shortcoming: they assume that during optimization the coefficient matrix is error-free and that only the observation vector carries error. In reality, the coefficient matrix, built from matched image points, should also be viewed as observations with errors. Thus, a new relative orientation method that considers the random errors in both the observation vector and the coefficient matrix is required; a generic sketch of this distinction follows below.
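To make the distinction concrete, consider the following generic sketch (the notation here is illustrative and not necessarily that used later in this paper). OLS solves the Gauss-Markov model

$$\mathbf{y} = A\boldsymbol{\xi} + \mathbf{e}_y,$$

in which the coefficient matrix $A$ is assumed error-free. The errors-in-variables model underlying (weighted) total least squares instead perturbs both sides,

$$\mathbf{y} + \mathbf{e}_y = (A + E_A)\,\boldsymbol{\xi},$$

and WTLS minimizes $\mathbf{e}_y^{\mathsf{T}} Q_y^{-1} \mathbf{e}_y + \operatorname{vec}(E_A)^{\mathsf{T}} Q_A^{-1} \operatorname{vec}(E_A)$ subject to this constraint, where $Q_y$ and $Q_A$ are the cofactor matrices of the observation vector and the coefficient matrix. In the unweighted special case, the TLS estimate even has a closed form via the singular value decomposition of the augmented matrix $[A \mid \mathbf{y}]$ (the classical Golub-Van Loan solution). A minimal Python sketch of that special case, purely illustrative and not the paper's relative orientation model:

```python
import numpy as np

def tls(A, y):
    """Unweighted total least squares via SVD (classical closed form).

    Solves y + e_y = (A + E_A) x, treating A and y as equally noisy.
    """
    n = A.shape[1]
    Z = np.hstack([A, y.reshape(-1, 1)])  # augmented matrix [A | y]
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]                 # right singular vector of the smallest singular value
    if np.isclose(v[n], 0.0):  # degenerate case: no TLS solution
        raise ValueError("TLS solution does not exist")
    return -v[:n] / v[n]

# Usage: noise in both A and y; OLS would be biased here, TLS is consistent.
rng = np.random.default_rng(0)
A_true = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
A_obs = A_true + 0.01 * rng.normal(size=A_true.shape)
y_obs = A_true @ x_true + 0.01 * rng.normal(size=100)
print(tls(A_obs, y_obs))  # approximately [1.0, -2.0, 0.5]
```

Note that, like the method presented in this paper, this closed-form estimate needs no initial value and no linearization; the weighted case treated here additionally carries the cofactor matrices $Q_y$ and $Q_A$.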
• Efficiency: In photogrammetry, bundle adjustment (BA) by means of OLS is a repeated process of space resection and forward intersection that refines a visual reconstruction to produce jointly optimal camera poses and 3D structure by minimizing the robustified squared sum of all possible re-projection errors (Jeong et al., 2012); a schematic form of this objective is given after this list. To accomplish high-precision camera pose estimation, improved algorithms based on BA have been proposed, such as sparse BA (Lourakis and Argyros, 2009), sparse sparse BA (Konolige and Garage, 2010), g2o (Kümmerle et al., 2011), relative BA (Blanco et al., 2013), and incremental BA (Indelman et al., 2013), all of which use Euclidean XYZ coordinates to describe the locations of 3D natural features. To avoid an ill-conditioned situation, new BA algorithms that describe the locations of 3D natural features with non-XYZ coordinates, such as angles (Zhao et al., 2015), inverse depth (Kwok and Dissanayake, 2004; Sola et al., 2005), and rays (Davison et al., 2007), have also been proposed. Strasdat et al. (2010) show that SLAM based on BA outperforms an extended Kalman filter (EKF) in terms of consistency and accuracy. Zhao et al. (2015) indicate that ParallaxBA outperforms BA, sparse BA, sparse sparse BA, and g2o in terms of convergence, efficiency, and accuracy when the normal equation is ill-conditioned. Li et al. (2016) present a binocular visual localization algorithm based on the Levenberg-Marquardt (LM) method using cross-site images. Although BA can be considered the ideal approach to estimating camera poses and features once feature correspondence across the image sequence has been established by matching natural features, the normal equation matrix becomes ill-conditioned when all camera poses are aligned with the natural features or when the features are far from the rover. The above BA-based algorithms then require many iterations, and computing efficiency is drastically reduced.
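For reference, the BA objective discussed above can be written schematically (in generic notation, not the exact formulation of any one cited work) as

$$\min_{\{C_i\},\{X_j\}} \; \sum_{i} \sum_{j \in \mathcal{V}(i)} \rho\!\left( \left\lVert \mathbf{x}_{ij} - \pi(C_i, X_j) \right\rVert^2 \right),$$

where $C_i$ are the camera poses, $X_j$ the 3D feature positions, $\mathcal{V}(i)$ the set of features visible in image $i$, $\pi(\cdot)$ the projection function, $\mathbf{x}_{ij}$ the measured image coordinates, and $\rho(\cdot)$ a robust kernel. The normal matrix $J^{\mathsf{T}}J$ of this nonlinear problem becomes ill-conditioned precisely in the low-parallax configurations described above, which is what the non-XYZ feature parameterizations are designed to avoid.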
Ma You Qing, Liu Shao Chuang, Zhang Shuo, and Li Ming Lei are with the Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, No. 9 Dengzhuang South Road, Haidian District, Beijing 100094, China.

Wen Bo and Peng Song are with the Beijing Institute of Spacecraft System Engineering, No. 104 Youyi Rd., Haidian District, Beijing, China.