PE&RS January 2018 Full - page 16

on the IMU/VO solutions and fuzzy logic map matching, and sample training is not needed. Additionally, since correct map matching can be guaranteed, the map-matched point is used directly in the dead-reckoning system in the next epoch. The map-matching results obtained in this paper are better than those of the particle filter mentioned above.
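As a minimal illustration of the geometric core of this map-matching step, the following sketch snaps an integrated position to the nearest point on a set of road links; the planar local frame, the link representation as vertex pairs, and the azimuth convention (clockwise from north) are our illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def project_to_segment(p, a, b):
    """Project point p onto segment a-b; return (matched point, distance)."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    q = a + t * ab
    return q, np.linalg.norm(p - q)

def map_match(position, road_links):
    """Snap a position to the closest point on the closest road link.

    road_links: list of (start, end) vertex pairs in a local planar frame.
    Returns (matched point, distance, link azimuth); the azimuth is measured
    clockwise from north, an assumed convention.
    """
    best = None
    for a, b in road_links:
        a, b = np.asarray(a, float), np.asarray(b, float)
        q, d = project_to_segment(position, a, b)
        if best is None or d < best[1]:
            azimuth = np.arctan2(b[0] - a[0], b[1] - a[1])  # east over north
            best = (q, d, azimuth)
    return best
```

For an east-running link from (0, 0) to (10, 0), a position at (5, 1) matches to (5, 0) at distance 1 with a link azimuth of 90 degrees.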
The tightly coupled integration scheme of INS and stereo VO adopted in this paper was proposed by Liu et al. (2015). Specifically, the outputs of the IMU provide the initial camera position and attitude, which can be applied to predict the pixel coordinates of the features on the images. With detected and matched features available, the measured pixel coordinates can be applied to calculate the optimal camera pose using a centralized Kalman filter. This paper adopts the conventional 15-parameter error-state system model derived from the linearization of the IMU mechanization equations. The measurement update in the EKF is the difference between the pixel coordinates of the features predicted by re-projection and the measured pixel coordinates on the images. In this way, the quick drift of each individual DR sensor can be limited. Then, the outputs of the integrated system (e.g., position, velocity, and azimuth) can be applied in map matching to find the corresponding point on the correct road link. The map-matched point position and the azimuth of the road link can be further applied in the IMU mechanization in the next epoch. With this method, the accumulated errors can be largely reduced, since the system trajectory is forced onto the digital map whenever matched points on the digital map can be found.
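The predict-update cycle described above can be sketched as follows. The pinhole projection model, the matrix names, and the noise settings are illustrative assumptions for a generic EKF step, not the authors' implementation.

```python
import numpy as np

def project(K, R_cb, t, X):
    """Predict pixel coordinates of 3D points X (Nx3) from a camera pose.

    K: 3x3 intrinsics; R_cb: world-to-camera rotation; t: camera position.
    """
    Xc = (R_cb @ (X - t).T).T        # points expressed in the camera frame
    uv = (K @ Xc.T).T
    return uv[:, :2] / uv[:, 2:3]    # perspective division

def ekf_update(x, P, innovation, H, R):
    """Standard EKF measurement update; the innovation is the difference
    between measured and re-projected pixel coordinates."""
    S = H @ P @ H.T + R              # innovation covariance
    Kg = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + Kg @ innovation
    P = (np.eye(len(x)) - Kg @ H) @ P
    return x, P
```

In the integrated system, `project` would use the IMU-predicted pose, and the innovation fed to `ekf_update` would be the measured minus predicted pixel coordinates of the matched features.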
The paper is organized as follows. In the next section, the implementation of the tightly coupled integration of INS/stereo VO is introduced, followed by the map-matching algorithms adopted in this paper. The section after that describes the method for land vehicle localization using INS, stereo VO, and map matching, and presents the test results. Finally, conclusions and discussion are given in the last section.
Tightly Coupled Integration of INS/Stereo VO
In this section, the theories related to the inertial navigation,
visual odometry, and the tightly coupled integration scheme
are introduced.
Inertial Navigation
The mechanization of the IMU provides the dynamic information of the land vehicle. In this paper, the IMU mechanization is conducted in the local-level frame (LLF), with the IMU body frame defined as right-forward-up and the LLF defined as east-north-up (ENU). The mechanization equations can be given as (Noureldin et al., 2012):
\[
\begin{bmatrix} \dot{r}^{l} \\ \dot{V}^{l} \\ \dot{R}_{b}^{l} \end{bmatrix}
=
\begin{bmatrix}
D^{-1} V^{l} \\
R_{b}^{l} f^{b} - \left( 2\Omega_{ie}^{l} + \Omega_{el}^{l} \right) V^{l} + g^{l} \\
R_{b}^{l} \left( \Omega_{ib}^{b} - \Omega_{il}^{b} \right)
\end{bmatrix}
\tag{1}
\]
where '$l$' and '$b$' denote the local-level frame and the IMU body frame; $r$ is the position and $V$ is the velocity; $R_{b}^{l}$ is the rotation matrix from the IMU body frame to the local-level frame; the dot denotes the time derivative; $\Omega_{ab}^{c}$ is the skew-symmetric matrix of the rotation rate $\omega_{ab}^{c}$, which represents the rotation rate of frame '$b$' relative to frame '$a$' expressed in frame '$c$'; $f^{b}$ is the acceleration measured by the IMU in its body frame; $\omega_{ib}^{b}$ is the angular rate measured by the gyros; $\omega_{ie}^{l}$ and $\omega_{el}^{l}$ are, respectively, the Earth rotation rate projected into the local-level frame and the transport rate caused by the change of orientation of the local-level frame; and $\omega_{il}^{l}$ is the sum of $\omega_{ie}^{l}$ and $\omega_{el}^{l}$. Since the position is expressed in latitude, longitude, and height while the velocity is expressed in east, north, and up components, the geographic coordinate update with velocity in the LLF can be given as (Noureldin et al., 2012):
\[
\dot{r}^{l} = \begin{bmatrix} \dot{\varphi} & \dot{\lambda} & \dot{h} \end{bmatrix}^{T}
=
\begin{bmatrix}
0 & \dfrac{1}{R_m + h} & 0 \\
\dfrac{1}{\left( R_n + h \right)\cos\varphi} & 0 & 0 \\
0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} V_e & V_n & V_u \end{bmatrix}^{T}
\tag{2}
\]
where $R_m$ and $R_n$ are the meridian radius of curvature and the radius of curvature of the prime vertical, respectively; $\varphi$, $\lambda$, and $h$ represent latitude, longitude, and height, respectively; and $V_e$, $V_n$, and $V_u$ are the velocities in the east, north, and up directions in the local-level frame. The acceleration and rotation sensed by the IMU are in the IMU body frame and need to be transformed to the LLF. The third equation in Equation 1 gives the update of the rotation matrix from the body frame to the LLF. The acceleration in the LLF is obtained by transforming the specific force to the LLF and removing the local gravity, the Coriolis force due to Earth rotation, and the effect caused by the change of orientation of the local-level frame; this explains the second equation in Equation 1. With the acceleration in the LLF, the velocity and position updates can be carried out. The 15-parameter error-state model for the INS can be obtained by linearization of the mechanization equations, given as:
\[
\begin{bmatrix} \delta\dot{r}^{l} \\ \delta\dot{V}^{l} \\ \dot{\varepsilon}^{l} \\ \delta\dot{f}^{b} \\ \delta\dot{\omega}^{b} \end{bmatrix}
= F^{l}
\begin{bmatrix} \delta r^{l} \\ \delta V^{l} \\ \varepsilon^{l} \\ \delta f^{b} \\ \delta\omega^{b} \end{bmatrix}
+ G \cdot w,
\qquad
F^{l} =
\begin{bmatrix}
F_{rr} & F_{rV} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\
F_{Vr} & F_{VV} & F_{V\varepsilon} & R_{b}^{l} & 0_{3\times3} \\
F_{\varepsilon r} & F_{\varepsilon V} & F_{\varepsilon\varepsilon} & 0_{3\times3} & R_{b}^{l} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & -\alpha & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & -\beta
\end{bmatrix}
\tag{3}
\]
where $\delta$ denotes the error of the corresponding parameter; $\varepsilon^{l}$ represents the attitude errors in the LLF, namely the pitch, roll, and yaw errors; $\alpha$ and $\beta$ are 3 × 3 matrices containing the reciprocals of the correlation times of the accelerometers and gyros, respectively; $G$ is the shaping matrix; and $w$ represents the driving white noise of the state vector. The inertial sensor errors are modeled as first-order Gauss-Markov processes. $F^{l}$ is the dynamics matrix used to derive the transition matrix in the EKF of the tightly coupled integration. The sub-matrices in Equation 3 are detailed in Noureldin et al. (2012).
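To make the bias part of the error-state model concrete, the sketch below assembles a 15 × 15 dynamics matrix with the first-order Gauss-Markov blocks and forms the first-order transition matrix used to propagate the EKF. The correlation times, the block signs, and the zeroed position/velocity/attitude sub-matrices (which in practice depend on the current navigation solution) are illustrative assumptions.

```python
import numpy as np

def build_error_state_F(R_b_l, tau_acc=100.0, tau_gyro=300.0):
    """Assemble a 15x15 dynamics matrix for the state [dr, dV, eps, df, dw].

    The navigation coupling blocks (F_rr, F_rV, ...) depend on position and
    velocity and are left zero here for brevity; only the bias structure is
    shown. tau_acc and tau_gyro are assumed correlation times in seconds.
    """
    F = np.zeros((15, 15))
    F[3:6, 9:12] = R_b_l                     # accel bias drives velocity error
    F[6:9, 12:15] = R_b_l                    # gyro bias drives attitude error
    F[9:12, 9:12] = -np.eye(3) / tau_acc     # df: first-order Gauss-Markov
    F[12:15, 12:15] = -np.eye(3) / tau_gyro  # dw: first-order Gauss-Markov
    return F

def transition_matrix(F, dt):
    """First-order discretization Phi = I + F*dt used in the EKF prediction."""
    return np.eye(F.shape[0]) + F * dt
```

Over a 0.01 s epoch with a 100 s correlation time, an accelerometer bias decays by the factor 1 - dt/tau = 0.9999, matching the first-order Gauss-Markov model.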
Visual Odometry Related Formulations in Tight Integration
In visual odometry, the ego-motion can be estimated from the tracked or matched corresponding features of two consecutive epochs. The approaches to ego-motion estimation fall into three main categories, namely 2D-to-2D, 3D-to-3D, and 3D-to-2D. In the 2D-to-2D method, the rotation and translation are decomposed from the essential matrix between two consecutive epochs (Nistér, 2004). The 3D-to-3D method computes the ego-motion from the two sets of points obtained by triangulation of two pairs of stereo images (Arun et al., 1987). The 3D-to-2D method minimizes the re-projection error of the 3D feature coordinates obtained by previous triangulation, which is also called PnP (perspective from n points) (Moreno-Noguer et al., 2007).
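The 3D-to-3D alignment of Arun et al. (1987) admits a closed-form least-squares solution via the singular value decomposition. The sketch below illustrates it under the assumption of noise-free, already-matched point sets; the function name and inputs are ours.

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Closed-form least-squares R, t with Q = R @ P + t (Arun et al., 1987).

    P, Q: Nx3 arrays of matched 3D points (e.g., triangulated stereo features
    from two consecutive epochs).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```

Applied to the triangulated feature sets of two consecutive stereo epochs, the returned rotation and translation are the ego-motion estimate of the 3D-to-3D method.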