Tight Integration of INS/Stereo VO/Digital Map
for Land Vehicle Navigation
Fei Liu, Yashar Balazadegan, and Yang Gao
Abstract
This paper proposes a method for tight integration of an IMU (Inertial Measurement Unit), stereo VO (Visual Odometry), and a digital map for land vehicle navigation, which effectively limits the rapid drift of DR (Dead Reckoning) navigation systems. In this method, the INS provides the dynamic information of the land vehicle, which is used to predict the position and attitude of the cameras and thereby the predicted pixel coordinates of features on the image. The difference between the measured and predicted pixel coordinates is used to reduce the accumulated errors of the INS. To implement the proposed method, an Extended Kalman Filter (EKF) is first used to integrate the inertial and visual sensor data. The integrated solution of position, velocity, and azimuth is then applied by fuzzy logic map matching (MM) to project the vehicle location onto the correct road link. The projected position on the road link and the road link azimuth can finally be used to reduce the dead reckoning drifts. In this way, the accumulated system errors can be significantly reduced. The testing results indicate that the horizontal RMSE (root-mean-square error) of the proposed method is less than 20 meters over a traveled distance of five kilometers and that the relative horizontal error is below 0.4 percent.
Introduction
The development of alternative navigation solutions for situations where GNSS (Global Navigation Satellite System) signals are unavailable has drawn increasing research and industrial attention, with the goal of achieving seamless navigation. The most widely applied alternative navigation system is the INS (Inertial Navigation System), in which an IMU (Inertial Measurement Unit) integrates the rotation rates to obtain orientation changes and doubly integrates the accelerations to obtain velocity and position increments (Jekeli, 2001).
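The dead-reckoning principle behind this, and the drift it causes, can be illustrated with a minimal planar sketch (the function name, state layout, and the gyro-bias value below are hypothetical, not from the paper):

```python
import math

def dead_reckon(pose, yaw_rate, accel, dt):
    """One 2D dead-reckoning step: integrate the gyro yaw rate into
    heading, then integrate the along-track acceleration into speed
    and position. Errors in yaw_rate accumulate without bound."""
    x, y, heading, speed = pose
    heading += yaw_rate * dt        # orientation change from rotation rate
    speed += accel * dt             # velocity increment from acceleration
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return (x, y, heading, speed)

# A tiny uncorrected gyro bias (0.001 rad/s) steadily bends the
# estimated track sideways, even though the vehicle drives straight:
pose = (0.0, 0.0, 0.0, 10.0)        # origin, heading east, 10 m/s
for _ in range(100):                # 10 s at 100 ms steps
    pose = dead_reckon(pose, yaw_rate=0.001, accel=0.0, dt=0.1)
```

After only ten seconds, the cross-track error is already about half a meter, which is why low-cost inertial sensors need external aiding.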
The INS can work independently in most cases. However, its main drawback is that errors accumulate with time, leading to rapid drift in the positioning results, especially for low-cost MEMS (Micro-Electro-Mechanical Systems)-based IMUs (Sukkarieh, 2000; Petovello, 2003; Shin, 2005). Besides the INS, visual odometry (VO) has become another widely applied technique for bridging GNSS outages. Similar to the INS, VO can be self-contained, requiring no external positioning information when there is sufficient illumination and enough features can be detected and tracked (or matched) in consecutive images.
VO utilizes the change of static features across images to calculate the rotation and translation of the camera (Nistér et al., 2004). Stereo VO is more widely applied than monocular VO because the depth ambiguity can be eliminated by triangulation between the two cameras, whereas a single camera provides only the orientation of each feature (Badino, 2007). The integration of stereo VO and INS to reduce the rapid drift of each individual sensor has been studied by many researchers (Strelow, 2004; Veth and Raquet, 2007; Konolige et al., 2010; Tardif et al., 2010; Sirtkaya et al., 2013; Asadi and Bottasso, 2014).
Since each individual system can independently provide navigation solutions, stereo VO and INS can be integrated in a loosely coupled scheme, in which a Kalman filter fuses the outputs of the two sensors to achieve optimal results (Sirtkaya et al., 2013; Konolige et al., 2010; Tardif et al., 2010). Apart from loosely coupled integration, several tightly coupled methods have been explored as well. In tightly coupled integration, the pixel coordinates of features are applied in a centralized filter to optimally estimate the platform ego-motion (Strelow, 2004; Asadi and Bottasso, 2014; Liu et al., 2015).
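The core of a tightly coupled measurement model is the reprojection innovation: the INS-predicted pose projects a known feature into the image, and the difference from the tracked pixel feeds the filter update. A minimal sketch, assuming a simple pinhole model and made-up intrinsics and pose values:

```python
import numpy as np

def project(K, R_wc, t_w, p_world):
    """Predict the pixel coordinates of a world point given the
    INS-predicted camera pose (R_wc: world-to-camera rotation,
    t_w: camera position in the world frame) and intrinsics K."""
    p_cam = R_wc @ (p_world - t_w)
    u = K[0, 0] * p_cam[0] / p_cam[2] + K[0, 2]
    v = K[1, 1] * p_cam[1] / p_cam[2] + K[1, 2]
    return np.array([u, v])

K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
predicted = project(K, np.eye(3), np.zeros(3), np.array([1.0, 0.5, 20.0]))
measured = np.array([356.0, 258.0])   # pixel tracked in the current image
innovation = measured - predicted     # residual that drives the EKF update
```

A nonzero innovation indicates accumulated INS error (or feature noise); the EKF distributes this residual back onto the position, velocity, and attitude states.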
However, as mentioned above, both INS and stereo VO adopt Dead Reckoning (DR) algorithms to provide the platform ego-motion, resulting in inevitable drift in the absence of external absolute positioning information. In addition to GNSS signals, another way to provide external positioning information is map matching. Map matching projects the estimated position onto the correct road link, on which the vehicle is moving, based on the outputs of the integrated system (e.g., position, velocity, and azimuth) and the road network data (e.g., nodes, line segments, and link azimuths) (Ochieng et al., 2003; Taylor et al., 2006; Quddus et al., 2007; Ren and Karimi, 2009). The map-matched points on the digital map can further be used to force the estimated trajectory back onto the road links.
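The projection step itself is an orthogonal projection of the estimated position onto a road-link segment, clamped to the segment's endpoints. A minimal sketch in a local metric frame (coordinates and link geometry are illustrative):

```python
def project_to_link(p, a, b):
    """Orthogonally project position p onto the road link from node a
    to node b (all 2D (x, y) tuples in meters), clamped to the link."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # Parameter of the foot of the perpendicular along the segment
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))          # stay between the link's nodes
    return (ax + t * dx, ay + t * dy)

# A drifted fix 3 m off a straight east-west link snaps back onto it:
matched = project_to_link((50.0, 3.0), (0.0, 0.0), (100.0, 0.0))
```

Feeding the matched point (and the link azimuth) back as a pseudo-measurement is what bounds the dead-reckoning error.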
In this paper, map matching is applied to mitigate the unbounded error drift of the INS and stereo VO systems. In this way, more accurate solutions can be provided in GNSS-denied environments. Floros et al. (2013) adopted map matching for stereo visual odometry based on chamfer matching and a particle filter. In their method, the particles representing the vehicle position are generated based on the VO results, and chamfer matching is applied to find the best-matched traveled trajectory. The optimal vehicle localization is then determined by a particle filter. With this method, mismatching may happen if the vehicle is traveling in an area where the shapes of the road links are very similar to those around them (e.g., mesh-like road networks). In contrast, our map matching method is based on the vehicle moving state obtained by INS/Stereo VO and the digital map information, using fuzzy logic map matching algorithms. In this way, such mismatching can be avoided.
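The paper's fuzzy logic map matching is detailed later; as a toy illustration of the idea, candidate links can be scored by fuzzy memberships of the distance and heading agreement, combined with a fuzzy AND (the membership shapes, thresholds, and link names below are invented for illustration only):

```python
def link_score(dist_m, heading_diff_deg):
    """Toy fuzzy score for a candidate road link: 'near' and 'aligned'
    triangular memberships combined by min (a common fuzzy AND)."""
    near = max(0.0, 1.0 - dist_m / 30.0)                 # 1 at 0 m, 0 past 30 m
    aligned = max(0.0, 1.0 - abs(heading_diff_deg) / 45.0)
    return min(near, aligned)

# (distance to link, |vehicle azimuth - link azimuth|) per candidate:
candidates = {"link_A": (6.0, 2.0), "link_B": (4.0, 80.0)}
best = max(candidates, key=lambda k: link_score(*candidates[k]))
```

Here link_B is slightly closer, but the 80-degree heading mismatch drives its score to zero, so link_A wins; using the vehicle's moving state this way is what avoids mismatches among geometrically similar links.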
Brubaker et al. (2013) proposed using visual odometry and a digital map to determine the probabilistic location on the digital map, which could effectively maintain good accuracy for a long time. The method proposed in this work is based
Fei Liu is with the University of Calgary, 2911 Brentwood Blvd., NW, Calgary, AB, Canada, T2L 1J6 (Fliu.uc@hotmail.com).
Yashar Balazadegan is with the University of Calgary, 2544
Morley Trail, NW, Calgary, AB, Canada, T2M 4G5.
Yang Gao is with the University of Calgary, 2500 University
Dr., NW, Calgary, AB, Canada, T2N 1N4.
Photogrammetric Engineering & Remote Sensing, Vol. 84, No. 1, January 2018, pp. 15–23.
0099-1112/17/15–23
© 2017 American Society for Photogrammetry and Remote Sensing
doi: 10.14358/PERS.84.1.15