
[Equation 6: threshold conditions identifying road lamps (highest points) and road lanes (lowest points)]
The Registration Model and Mismatching Feature Points Elimination
Panoramic Image Registration Model
The panoramic camera is composed of multiple lenses, and the
images from different lenses are stitched together based on a
fixed projection (Zhu 2019). Equation 7 shows the panoramic
camera model using the spherical projection; the Y axis is perpendicular to the image plane (XOZ) (Zhu et al. 2018):
\[
\begin{cases}
r = \tan^{-1}\!\left(Z \big/ \sqrt{X^{2}+Y^{2}}\right)\cdot D\\[4pt]
c = \tan^{-1}\!\left(X / Y\right)\cdot D
\end{cases}
\tag{7}
\]
where [X Y Z]^T = R·[x − X_S  y − Y_S  z − Z_S]^T, R is the rotation matrix formed from the rotation angles (r_x r_y r_z), (x y z) are the coordinates of an object point, D converts between radians and image size (unit: pixels/rad), and (r c) are the row and column coordinates of the panoramic image, respectively.
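As an illustration of Equation 7, a minimal Python sketch follows; the function name, the Euler-angle order used to build R from (r_x, r_y, r_z), and the use of scipy are our assumptions rather than the authors' implementation:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def project_to_panorama(p, t, angles, D):
    """Project an object point p = (x, y, z) onto the panoramic image (Equation 7).

    t      -- sensor position (X_S, Y_S, Z_S)
    angles -- rotation angles (r_x, r_y, r_z) in radians (x-y-z order assumed)
    D      -- scale converting radians to pixels (pixels/rad)
    """
    R = Rotation.from_euler("xyz", angles).as_matrix()   # assumed angle convention
    X, Y, Z = R @ (np.asarray(p, float) - np.asarray(t, float))
    r = np.arctan2(Z, np.hypot(X, Y)) * D                # row: elevation angle * D
    c = np.arctan2(X, Y) * D                             # column: azimuth angle * D
    return r, c                                          # (row, column) in pixels
```

Here arctan2 is used so the azimuth covers the full circle; for points with Y > 0 it reduces to the tan⁻¹(X/Y) of Equation 7.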
The registration parameters can be solved accurately using
feature points. Equation 8 is transformed from Equation 7; (v h) are the vertical and horizontal angles, respectively. Equation 8 can be solved by the direct linear transformation method:
\[
\begin{cases}
Z/Y = \tan v / \cos h\\[2pt]
X/Y = \tan h
\end{cases}
\tag{8}
\]
where v = r/row·π, h = c/col·2π, and row and col are the rows and columns of the panoramic image, respectively.
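The first line of Equation 8 follows from Equation 7 (with tan v = Z/√(X²+Y²) and tan h = X/Y) by eliminating the square root; spelled out (our restatement of the algebra):
\[
\tan h = \frac{X}{Y},\quad
\tan v = \frac{Z}{\sqrt{X^{2}+Y^{2}}}
\;\Rightarrow\;
\sqrt{X^{2}+Y^{2}} = Y\sqrt{1+\tan^{2}h} = \frac{Y}{\cos h}\ (Y>0)
\;\Rightarrow\;
\frac{Z}{Y} = \frac{\tan v}{\cos h}.
\]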
Let pr = tan v/cos h and pc = tan h. Equation 9 is transformed from Equation 8:
\[
\begin{cases}
\dfrac{x\cdot l_{1} + y\cdot l_{2} + z\cdot l_{3} - l_{4}}{x\cdot l_{9} + y\cdot l_{10} + z\cdot l_{11} - 1} = pr\\[12pt]
\dfrac{x\cdot l_{5} + y\cdot l_{6} + z\cdot l_{7} - l_{8}}{x\cdot l_{9} + y\cdot l_{10} + z\cdot l_{11} - 1} = pc
\end{cases}
\tag{9}
\]
where t = [X_S Y_S Z_S]^T and, with R_1, R_2, R_3 denoting the rows of R,
\[
\begin{bmatrix} l_{1} & l_{2} & l_{3}\\ l_{5} & l_{6} & l_{7}\\ l_{9} & l_{10} & l_{11} \end{bmatrix}
= \frac{1}{\mathbf{R}_{2}\cdot \mathbf{t}}
\begin{bmatrix} \mathbf{R}_{3}\\ \mathbf{R}_{1}\\ \mathbf{R}_{2} \end{bmatrix},\qquad
\begin{bmatrix} l_{4}\\ l_{8}\\ 1 \end{bmatrix}
= \frac{1}{\mathbf{R}_{2}\cdot \mathbf{t}}
\begin{bmatrix} \mathbf{R}_{3}\\ \mathbf{R}_{1}\\ \mathbf{R}_{2} \end{bmatrix}\cdot \mathbf{t}
\]
Then the adjustment model can be expressed as
\[
\mathbf{A}_{2m\times 11}\cdot \mathbf{X}_{11\times 1} = \mathbf{B}_{2m\times 1}
\tag{10}
\]
where m denotes the number of feature points, X = [l1 l2 l3 l4 l5 l6 l7 l8 l9 l10 l11]^T, and
\[
\mathbf{A} =
\begin{bmatrix}
-x_{1} & -y_{1} & -z_{1} & 1 & 0 & 0 & 0 & 0 & pr_{1}x_{1} & pr_{1}y_{1} & pr_{1}z_{1}\\
0 & 0 & 0 & 0 & -x_{1} & -y_{1} & -z_{1} & 1 & pc_{1}x_{1} & pc_{1}y_{1} & pc_{1}z_{1}\\
\vdots & & & & & & & & & & \vdots\\
-x_{m} & -y_{m} & -z_{m} & 1 & 0 & 0 & 0 & 0 & pr_{m}x_{m} & pr_{m}y_{m} & pr_{m}z_{m}\\
0 & 0 & 0 & 0 & -x_{m} & -y_{m} & -z_{m} & 1 & pc_{m}x_{m} & pc_{m}y_{m} & pc_{m}z_{m}
\end{bmatrix},
\qquad
\mathbf{B} =
\begin{bmatrix}
pr_{1}\\ pc_{1}\\ \vdots\\ pr_{m}\\ pc_{m}
\end{bmatrix}
\]
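Assembling A and B from the feature points and solving Equation 10 by least squares might look like the following sketch; the function name and the use of numpy.linalg.lstsq are our assumptions, not the authors' implementation:

```python
import numpy as np

def solve_registration(points, pr, pc):
    """Solve A.X = B (Equation 10) for X = [l1 ... l11] by least squares.

    points -- (m, 3) array of object coordinates (x, y, z)
    pr, pc -- length-m arrays of tan(v)/cos(h) and tan(h) per feature point
    """
    m = len(points)
    A = np.zeros((2 * m, 11))
    B = np.zeros(2 * m)
    for i, (x, y, z) in enumerate(points):
        # Row from the pr equation of Equation 9
        A[2 * i] = [-x, -y, -z, 1, 0, 0, 0, 0, pr[i] * x, pr[i] * y, pr[i] * z]
        B[2 * i] = pr[i]
        # Row from the pc equation of Equation 9
        A[2 * i + 1] = [0, 0, 0, 0, -x, -y, -z, 1, pc[i] * x, pc[i] * y, pc[i] * z]
        B[2 * i + 1] = pc[i]
    X, *_ = np.linalg.lstsq(A, B, rcond=None)
    return X  # [l1, ..., l11]
```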
The matrix (X) to be solved includes 11 unknown parameters in the adjustment model; therefore, at least six feature points are needed. Then, substituting X into Equation 9 to calculate (pr pc), all LiDAR points can be projected onto the panoramic image, as shown in Equation 11:
\[
\begin{cases}
r' = \operatorname{atan}\!\left(pr \big/ \sqrt{pc^{2}+1}\right)\cdot row/\pi\\[4pt]
c' = \operatorname{atan}\!\left(pc\right)\cdot col/(2\pi)
\end{cases}
\tag{11}
\]
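Combining Equations 9 and 11, the forward projection of all LiDAR points can be sketched as follows (a vectorized numpy sketch; the function name is ours):

```python
import numpy as np

def project_lidar(points, L, rows, cols):
    """Project LiDAR points to panoramic image coordinates using X = [l1..l11].

    points     -- (n, 3) array of LiDAR coordinates (x, y, z)
    L          -- the 11 solved parameters l1 ... l11
    rows, cols -- number of rows and columns of the panoramic image
    """
    x, y, z = np.asarray(points, float).T
    denom = x * L[8] + y * L[9] + z * L[10] - 1.0              # x*l9 + y*l10 + z*l11 - 1
    pr = (x * L[0] + y * L[1] + z * L[2] - L[3]) / denom       # Equation 9, first line
    pc = (x * L[4] + y * L[5] + z * L[6] - L[7]) / denom       # Equation 9, second line
    r = np.arctan(pr / np.sqrt(pc ** 2 + 1.0)) * rows / np.pi  # Equation 11
    c = np.arctan(pc) * cols / (2.0 * np.pi)                   # full 0-2*pi azimuth would need arctan2
    return r, c
```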
The registration error (δ), calculated by Equation 12, is regarded as the precision index, and (r_i c_i) and (r′_i c′_i), the coordinates of feature points, are from the panoramic image and Equation 11, respectively:
\[
\delta = \sum_{i=1}^{m}\delta_{i}\big/m,\qquad
\delta_{i} = \sqrt{\left(r_{i}-r'_{i}\right)^{2}+\left(c_{i}-c'_{i}\right)^{2}}
\tag{12}
\]
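Equation 12 is simply the mean Euclidean pixel distance between measured and projected feature points; in numpy (names are ours):

```python
import numpy as np

def registration_error(rc_image, rc_projected):
    """Per-point errors delta_i and their mean delta (Equation 12).

    rc_image, rc_projected -- (m, 2) arrays of (r, c) coordinates
    """
    delta_i = np.linalg.norm(np.asarray(rc_image) - np.asarray(rc_projected), axis=1)
    return delta_i, delta_i.mean()
```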
Mismatching Feature Point Elimination
It is necessary to automatically eliminate mismatching feature points in the registration. The process is as follows (a code sketch of the full procedure is given after the list):
1. All feature points within the distance range are used to solve the registration model; then δ_i is calculated according to Equation 12. If all δ_i (i = 1, 2, …, m) < 20 pixels, the registration is completed; otherwise,
2. eliminating one feature point at a time and solving with the rest, giving m sets of parameters; calculating δ and δ_i (i = 1, 2, …, m − 1) with the m − 1 feature points and δ′ and δ′_i (i = 1, 2, …, m) with all feature points in each set. If δ_i < 20 pixels and δ′_i < 200 pixels, the set with the smallest δ and δ′ is selected as the registration result; otherwise,
3. eliminating two feature points and having m·(m − 1)/2 sets of parameters; the optimal parameters are selected according to step 2;
4. and so on. Different numbers of mismatching feature
points can be eliminated automatically in the registration
of the panoramic image sequence.
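Under our reading of steps 1–3, the procedure is a leave-k-out search over feature-point subsets; the sketch below reuses the helper functions from the earlier sketches, and the exact pairing of the 20- and 200-pixel thresholds, as well as the cap on eliminated points, are our interpretation:

```python
from itertools import combinations
import numpy as np

def eliminate_mismatches(points, pr, pc, rc_image, rows, cols, max_drop=3):
    """Leave-k-out elimination of mismatching feature points (steps 1-4).

    All inputs are numpy arrays; relies on solve_registration, project_lidar,
    and registration_error from the sketches above.
    """
    m = len(points)
    for k in range(max_drop + 1):              # k = number of feature points dropped
        best = None
        for drop in combinations(range(m), k):
            keep = [i for i in range(m) if i not in drop]
            X = solve_registration(points[keep], pr[keep], pc[keep])
            # errors with the kept (m - k) points and with all m points
            d_keep, mean_keep = registration_error(
                rc_image[keep],
                np.column_stack(project_lidar(points[keep], X, rows, cols)))
            d_all, _ = registration_error(
                rc_image,
                np.column_stack(project_lidar(points, X, rows, cols)))
            if d_keep.max() < 20 and d_all.max() < 200:   # thresholds from steps 1-2 (pairing assumed)
                if best is None or mean_keep < best[0]:
                    best = (mean_keep, X, keep)
        if best is not None:
            return best                         # (mean error, parameters, kept indices)
    return None                                 # no acceptable subset within max_drop
```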
Results and Analysis
Registration with Different Methods
The original registration method (method I), based on Equation 7, was used for comparison with our registration method (method II); the registration parameters (X_S, Y_S, Z_S, r_x, r_y, r_z) were obtained by GPS/IMU. Figure 9 shows the registration results
of the no. 12 panoramic image with the two methods. Figure
10 shows four local images from Figure 9 to compare the
registration effect.
Figures 9 and 10 show the registration effect of the panoramic image with the two methods. There are obvious dislocations in the billboards, road lamps, trees, and buildings with method I, whereas method II achieves the desired registration effect, with all objects coinciding closely. To quantitatively compare the accuracy of methods I and II, Table 1 lists δ_i for each feature point, and Table 2 shows the analysis of δ_i with the two methods.
As shown in Tables 1 and 2, 14 feature points (m) were extracted from the LiDAR points and the no. 12 panoramic image, and 13 feature points (n) were used for registration after mismatching-point filtering. Methods I and II have an accuracy of 0.74% and 0.11% (δ/image diagonal), respectively; the accuracy of our method is substantially higher than that of method I. The minimum error is 43.91 pixels (δ_2) in method I, and the maximum error is 19.31