Bound-Constrained Multiple-Image Least-Squares
Matching for Multiple-Resolution Images
Han Hu and Bo Wu
Abstract
Satellite images from multiple sources with different resolutions can now observe the same region. Reliable image matching between these images is the first step in their integrated use. Matching of multiple-resolution images is not trivial because of the large geometric differences among the images, which can cause matching failures and loss of matching accuracy. This paper presents a bound-constrained, multiple-image, least-squares matching (LSM) method that extends the classical LSM in two ways for better performance. First, the a priori metadata of the images, including the georeferencing and scale information, are used for initial matching and to provide bound constraints in the LSM to improve its stability. Second, multiple images are matched in a single optimization rather than through traditional pairwise matching. This brings additional observations into the least-squares optimization, which makes the matching aware of both the larger and the local context and improves matching quality even with inaccurate initializations for the high-resolution images. Experimental analysis using multiple-source satellite images with multiple resolutions, collected on Mars and in Hong Kong, reveals that the proposed method can effectively obtain reliable multiple-fold matches, even in challenging cases with resolution differences of as much as 20-fold. The proposed method has significance for the synergistic use of multiple-source satellite images in various applications.
Introduction
Recent advances in satellite imaging have provided great opportunities for the same region to be covered by images from multiple sources with multiple resolutions, both on Earth (Elaksher and Alharthy, 2011; Tang et al., 2016) and on other planets (Wu et al., 2014; Tao et al., 2016). These images can be integrated for synergistic use in a variety of applications. For example, images with sub-meter-level resolution are suitable for detailed topographic mapping (Qiao et al., 2010) or hazard monitoring (Corbane et al., 2011; Voigt et al., 2011), and images with relatively coarse resolution feature wide coverage and are suitable for topographic mapping of large areas (Xue et al., 2015; Zhang et al., 2015). However, geometric inconsistencies at different levels among these images are inevitable because they were collected by different sensors with different characteristics onboard different platforms. These multiple-source images must be co-registered before their synergistic use, and reliable image matching between them is a key step in this process.
Despite the milestone image-matching algorithm SIFT (Scale-Invariant Feature Transform) (Lowe, 2004), matching of multiple-resolution images is still an open problem that is actively being studied (Sedaghat and Ebadi, 2015). The problems are caused by the scale differences between the images, which have two consequences: (a) a failure to find matches, limited by the performance of existing feature detectors and descriptors; and (b) a loss of matching accuracy, introduced in the sampling steps when matching images with different scales. A greater scale difference between the images leads to more serious problems.
Lerma et al. (2013) presented a general workflow for multiple-resolution image matching that comprises two steps. The first step is initial matching between the images using SIFT-like descriptors, and the second is to locate the initial matches at the sub-pixel level through the well-established least-squares matching (LSM) method (Gruen, 1985). Theoretically, LSM is fully invariant to affine deformations, of which scale difference is a special case; in practice, however, LSM sometimes suffers from convergence issues when the scale difference is larger than 30 percent. Furthermore, the nonlinear nature of its solver means that it requires similar image quality, or the solver will diverge (Gruen, 2012). These two drawbacks have limited the use of LSM for matching images with multiple resolutions, where the scale differences between the images may be as much as 20-fold.
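For reference, the classical pairwise LSM model (Gruen, 1985) relates a template patch g_1 to a search patch g_2 through six affine and two radiometric parameters. The following is the standard formulation only; it does not anticipate the extended multiple-image model developed in this paper:

$$ g_1(x, y) - e(x, y) = h_0 + h_1\, g_2\big(a_0 + a_1 x + a_2 y,\; b_0 + b_1 x + b_2 y\big), $$

where e(x, y) is the image noise, (a_0, a_1, a_2, b_0, b_1, b_2) are the affine parameters, and (h_0, h_1) are the radiometric offset and gain. The equation is linearized by a Taylor expansion, and the parameters are estimated iteratively by minimizing the sum of squared residuals.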
To surmount the widely acknowledged disadvantages of LSM described above for reliable matching of multiple-resolution images, this paper presents a bound-constrained multiple-image LSM method that extends the traditional LSM in two ways: (a) following previous work on improving the stability of LSM using bound constraints (Hu et al., 2016), the image-resolution information is used to extend the bound-constrained solver to multiple-image matching, performed directly in the original image space rather than on rectified images. The bound constraints prevent the affine parameters from exceeding a physically meaningful range and thus increase the reliability of the iterative solver; (b) instead of the traditional pairwise formulation of LSM, the method combines all of the images (typically four images from two stereo pairs with different resolutions) into a single optimization. Using carefully designed parameter sets and the minimization formulation of LSM, we simultaneously achieve matching reliability between images of different resolutions and matching accuracy between the high-resolution images.
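As an illustration only, the following Python sketch shows how a bound-constrained pairwise LSM step could be set up with an off-the-shelf bounded least-squares solver. It is a minimal sketch under assumed window sizes and bound values (max_shift, max_scale, and the radiometric bounds are illustrative), and it is not the multiple-image formulation or the actual implementation used in this paper.

    import numpy as np
    from scipy.ndimage import map_coordinates
    from scipy.optimize import least_squares

    def lsm_residuals(p, template, search, x0, y0, half):
        """Residuals of a pairwise LSM model: six affine + two radiometric parameters.

        The template window is centred at (x0, y0) in the search image under the
        initial alignment; p = (a0, a1, a2, b0, b1, b2, h0, h1).
        """
        a0, a1, a2, b0, b1, b2, h0, h1 = p
        ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
        # Affine mapping from template coordinates into the search image.
        xm = x0 + a0 + a1 * xs + a2 * ys
        ym = y0 + b0 + b1 * xs + b2 * ys
        resampled = map_coordinates(search, [ym.ravel(), xm.ravel()], order=1)
        # Radiometric gain/offset applied to the resampled search patch.
        return (h0 + h1 * resampled) - template.ravel()

    def bounded_lsm(template, search, x0, y0, half=10, max_shift=5.0, max_scale=0.3):
        """Bound-constrained pairwise LSM with illustrative bound values.

        The bounds keep the affine parameters within a physically plausible
        range around their initial values, which is the role of the bound
        constraints discussed above.
        """
        p0 = np.array([0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0])
        lo = np.array([-max_shift, 1 - max_scale, -max_scale,
                       -max_shift, -max_scale, 1 - max_scale, -50.0, 0.5])
        hi = np.array([max_shift, 1 + max_scale, max_scale,
                       max_shift, max_scale, 1 + max_scale, 50.0, 2.0])
        return least_squares(lsm_residuals, p0, bounds=(lo, hi),
                             args=(template, search, x0, y0, half))

In the method proposed here, the residuals of all participating images would instead enter one joint optimization with shared parameters, whereas this sketch handles only a single image pair.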
The remainder of this paper is organized as follows. The next section presents a literature review of matching between images collected by different sensors and with multiple resolutions, followed by a revisit of the classical LSM; the proposed bound-constrained multiple-image LSM method is then presented.
Han Hu is with the Department of Land Surveying and Geo-Informatics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong; and the Faculty of Geosciences and Environmental Engineering, Southwest Jiaotong University, Chengdu, P.R. China.
Bo Wu is with the Department of Land Surveying and Geo-Informatics, The Hong Kong Polytechnic University.