
A Two-Stage Spatiotemporal Fusion Method for
Remote Sensing Images
Yue Sun and Hua Zhang
Abstract
This paper presents a two-stage spatiotemporal fusion
method for obtaining dense remote sensing images with both
high spatial and temporal resolution. Considering the large
resolution differences between fine- and coarse-resolution
images, the proposed method is implemented in two stages.
In the first stage, the input fine- and coarse-resolution images
are resampled to a common intermediate resolution. A linear
interpolation model is then introduced to fuse these resampled
images into a preliminary fusion result. In the second stage, a
residual dense network is used to learn the nonlinear mapping
between the preliminary fusion result and the real fine-resolution
data to reconstruct the final fine-resolution image. Two data sets
with different land surface types are employed to test the
performance of the proposed method. Experimental results show
that the proposed method is advantageous in areas with
phenological changes, and even for data sets dominated by land
cover changes it retains a good ability to predict the spatial
structure of images.
Introduction
Remote sensing images with frequent coverage and high spatial
resolution are highly desired for Earth observation tasks such as
real-time monitoring of vegetation phenology, urban construction,
and geological hazards (Zhu et al. 2018; Zheng et al. 2016;
Walker, De Beurs, and Wynne 2014; Xian and Crane 2005; Li et al.
2016). However, due to limitations of sensor hardware and budgets,
it remains difficult to acquire remote sensing data with both high
spatial and temporal resolution from currently available satellite
images. For example, Landsat satellites have provided remote
sensing images with a spatial resolution of 30 m for long-term
surface observation since 1972. However, the 16-day revisit period
of Landsat satellites greatly limits their application to rapidly
changing surface processes, such as the plant growing season, crop
growth monitoring, and seasonal ecosystem disturbance detection,
and persistent cloud contamination further limits the availability
of Landsat data (Gao et al. 2006; Townshend et al. 1991). The
Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra
satellite can realize high-frequency observation of the surface,
but the low spatial resolution of MODIS data (250 m to 1000 m)
makes it inadequate for quantitative surface monitoring, so it is
mostly used in studies of large-scale areas (Guenther et al.
2002). Spatiotemporal fusion technology has the advantage of
integrating remote sensing data from different sensors, scales,
and time phases without changing the existing observation
conditions. It offers a feasible way to produce data with both
high spatial and temporal resolution (Chen, Huang, and Xu 2015).
Recently, many spatiotemporal fusion methods have been proposed.
In general, they can be classified into three categories: 1)
filter based, 2) unmixing based, and 3) learning based (Song et
al. 2018).
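The premise shared by these fusion families can be illustrated with a minimal sketch (an illustrative simplification, not any published algorithm's exact formulation): given a fine-resolution image at a base date t0 and coarse-resolution images at t0 and the prediction date tp, the coarse-resolution temporal change is transferred onto the known fine image. The function name and the assumption that all inputs are co-registered on one grid are ours:

```python
import numpy as np

def temporal_linear_fusion(fine_t0, coarse_t0, coarse_tp):
    """Predict a fine-resolution image at time tp by adding the
    coarse-resolution temporal change (tp minus t0) to the known
    fine image at t0. All inputs are assumed co-registered and
    resampled to the same grid."""
    return fine_t0 + (coarse_tp - coarse_t0)
```

Real fusion methods refine this base prediction with neighborhood weighting, unmixing, or learned mappings, which is exactly where the three categories above differ.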
The filter-based fusion algorithms mainly incorporate
neighborhood information in the calculation. One of the most
popular filter-based algorithms is the well-known spatial and
temporal adaptive reflectance fusion model (STARFM) (Gao et al.
2006). Though STARFM can accurately capture the surface
reflectance for homogeneous landscapes, its prediction accuracy
is reduced in cases of high surface heterogeneity or abrupt
change. STARFM's performance for land surface changes was later
improved by the spatial temporal adaptive algorithm for mapping
reflectance changes (STAARCH) (Hilker et al. 2009). However, this
algorithm is still sensitive to surface heterogeneity. To deal
with this limitation, Zhu et al. (2010) developed an enhanced
spatial and temporal adaptive reflectance fusion model (ESTARFM)
to improve the fusion accuracy of STARFM in complex and
heterogeneous surface regions. Cheng et al. (2017) proposed a
spatial and temporal nonlocal filter-based fusion model
(STNLFFM), which makes full use of the high spatial and temporal
redundancy information in remote sensing images; it achieves
higher prediction ability and accuracy than STARFM and ESTARFM in
areas with high surface heterogeneity. To further improve fusion
algorithms' performance for different surface changes, Wang and
Atkinson (2018) proposed a new three-stage fusion method
comprising regression model fitting (RM fitting), spatial
filtering (SF), and residual compensation (RC). This model has a
good ability to capture the temporal changes of images acquired
in different periods. Zhao, Huang, and Song (2018) established a
robust adaptive spatial and temporal fusion model (RASTFM) for
complex land surface change, which classifies land surface change
into surface cover change with shape change and surface cover
change without shape change for prediction. Results show that
this model achieves good accuracy and robustness in capturing
complex land surface change.
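The filter-based idea can be sketched in simplified form (illustrative only; STARFM's actual weighting also uses temporal differences and similar-pixel selection, which are omitted here): each predicted fine pixel adds a neighborhood-weighted coarse temporal change to the base fine image, where weights favor neighbors that are spectrally similar to the center pixel and spatially close. The function name and weighting formula are our assumptions:

```python
import numpy as np

def starfm_like_predict(fine_t0, coarse_t0, coarse_tp, win=3):
    """Simplified STARFM-style prediction over a moving window.
    Weights combine spectral similarity to the center fine pixel
    with spatial distance; the weighted coarse change is added to
    fine_t0. All images are assumed co-registered on one grid."""
    h, w = fine_t0.shape
    r = win // 2
    pad = lambda img: np.pad(img, r, mode="reflect")
    f0, c0, cp = pad(fine_t0), pad(coarse_t0), pad(coarse_tp)
    out = np.empty_like(fine_t0)
    # spatial-distance factor, fixed for every window position
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    d_spatial = 1.0 + np.hypot(yy, xx) / r
    for i in range(h):
        for j in range(w):
            nf = f0[i:i + win, j:j + win]
            # spectral distance of neighbors to the center fine pixel
            d_spec = np.abs(nf - fine_t0[i, j]) + 1e-6
            wgt = 1.0 / (d_spec * d_spatial)
            wgt /= wgt.sum()  # normalize so weights sum to 1
            change = cp[i:i + win, j:j + win] - c0[i:i + win, j:j + win]
            out[i, j] = fine_t0[i, j] + np.sum(wgt * change)
    return out
```

Because the weights are normalized, a spatially uniform coarse change is passed through unchanged, while heterogeneous changes are distributed according to similarity, which is why such filters degrade when the landscape is highly heterogeneous.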
The unmixing-based fusion algorithms assume there are no land
cover changes between the input image and the predicted image,
and prior classification results from a fine-resolution image are
required. Generally, spatial unmixing is applied to compute the
endmembers of coarse pixels and estimate the fine pixels using
weighted endmembers (Zhukov et al. 1999; Zurita-Milla, Clevers,
and Schaepman 2008; Amorós-López et al. 2013). More recently,
unmixing-based methods have been modified and improved for more
accurate prediction. Gevaert and García-Haro (2015) combined the
advantages of STARFM and unmixing-based algorithms to propose a
novel spatial and temporal reflectance unmixing model (STRUM),
which directly decomposes the difference between two coarse
images to estimate endmember changes and applies a Bayesian
formulation to optimize the estimation.
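The core unmixing step can be sketched as a least-squares problem (a simplification of the cited algorithms, with our own function name): each coarse pixel is modeled as the fraction-weighted sum of per-class endmember reflectances, where the class fractions come from a prior fine-resolution classification, and the endmembers are recovered by solving the resulting linear system:

```python
import numpy as np

def unmix_endmembers(coarse_values, class_fractions):
    """Illustrative spatial-unmixing step.

    coarse_values   : (n_pixels,) coarse-resolution reflectances
    class_fractions : (n_pixels, n_classes) fraction of each land
                      cover class inside each coarse pixel (rows
                      sum to 1), taken from a prior classification
    Returns the (n_classes,) estimated endmember reflectances,
    i.e. the least-squares solution of fractions @ em = coarse.
    """
    em, *_ = np.linalg.lstsq(class_fractions, coarse_values, rcond=None)
    return em
```

The estimated endmembers are then assigned back to the fine pixels according to their class labels, which is why these methods need the no-land-cover-change assumption: a class map that is stale at the prediction date breaks the linear mixing model.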
In order to improve STARFM's performance in heterogeneous surface
areas, an unmixing-based STARFM (USTARFM) was
School of Environment Science and Spatial Informatics, China
University of Mining and Technology, Xuzhou, Jiangsu, China.
Photogrammetric Engineering & Remote Sensing
Vol. 85, No. 12, December 2019, pp. 907–914.
0099-1112/19/907–914
© 2019 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.85.12.907