Author: Wang, Jingtao
Title: A simple and effective method for fusing satellite time-series data with different resolutions
Advisors: Zhu, Xiaolin (LSGI)
Degree: M.Sc.
Year: 2019
Subject: Image processing -- Digital techniques
Remote sensing
Hong Kong Polytechnic University -- Dissertations
Department: Faculty of Construction and Environment
Pages: viii, 88 pages : color illustrations
Language: English
Abstract: Satellite remote sensors provide consistent and repeatable measurements of the ground surface, capturing changes caused by nature (e.g., fire, flood) and human activities (e.g., deforestation, urbanization, agriculture). Owing to hardware constraints, no existing satellite sensor can provide data with high resolution in both space and time. The spatiotemporal fusion of remote sensing images is a highly feasible, low-cost, and flexible solution to generate images with both high temporal and high spatial resolution. Spatiotemporal fusion combines two types of satellite images: one with low spatial resolution but high temporal resolution (e.g., MODIS / VIIRS / GOES-R) and another with high spatial resolution but low temporal resolution (e.g., Landsat / Sentinel-2). However, existing spatiotemporal fusion methods were designed to generate a single image from multiple input images. As more and more free satellite images become available, there is a need to fuse time-series data with different resolutions into a new time series with both high spatial and temporal resolution. Existing spatiotemporal fusion methods are inefficient for such time-series fusion.
To address this limitation, this study proposes an automated spatiotemporal time-series fusion (ASTF) method. First, a harmonic model was used to build a time-series model for each coarse pixel by fitting all coarse observations and the upscaled fine observations. This time-series model was then used to reconstruct all the coarse-resolution images; experiments show that the reconstruction improves the quality of the original coarse-resolution time series. Next, the difference between each fine image and the reconstructed coarse image acquired on the same day was calculated. This difference represents the spatial detail contained in the fine images. Assuming that this spatial-detail information changes gradually over time, a polynomial model was used to build a spatial-detail time-series model. Finally, combining the coarse-resolution time-series model with the spatial-detail time-series model can generate a high-spatial-resolution image on any given day. Real MODIS and Sentinel-2 time-series data were used to test the performance of the proposed ASTF method. The experiments show that the fusion results are accurate when clear fine images are sufficient and evenly distributed across the fusion period. Compared with existing single-pair or multi-pair fusion methods, ASTF does not require manual selection of input image pairs, so the efficiency of synthesizing fine images is greatly improved. ASTF is simple in principle and imposes no specific requirements on spectral bands, so it can be easily applied to fuse various raw satellite time series or derived products (e.g., vegetation indices and land surface temperature).
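The workflow described above can be illustrated at the level of a single pixel. The following Python/NumPy sketch is only an illustration of that general idea, under assumptions not taken from the thesis (two annual harmonics, a degree-2 polynomial, and hypothetical function names such as fit_coarse_model and predict_fine): a harmonic model is fitted to the coarse time series, a polynomial is fitted to the fine-minus-coarse differences, and the two models are summed to predict a fine-resolution value on an arbitrary day.

import numpy as np

def harmonic_design(t, period=365.0, n_harmonics=2):
    # Design matrix of a harmonic (Fourier) regression: intercept, linear
    # trend, and sine/cosine pairs at the annual frequency and its multiples.
    t = np.asarray(t, dtype=float)
    cols = [np.ones_like(t), t]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / period
        cols.extend([np.sin(w), np.cos(w)])
    return np.column_stack(cols)

def fit_coarse_model(t_obs, coarse_values):
    # Least-squares fit of the harmonic model to one coarse pixel's time
    # series (all coarse observations plus upscaled fine observations).
    X = harmonic_design(t_obs)
    coef, *_ = np.linalg.lstsq(X, np.asarray(coarse_values, float), rcond=None)
    return coef

def fit_detail_model(t_fine, detail_values, degree=2):
    # Polynomial fit to the fine-minus-reconstructed-coarse differences
    # (the spatial detail) observed on days with clear fine images.
    return np.polyfit(np.asarray(t_fine, float), np.asarray(detail_values, float), degree)

def predict_fine(day, coarse_coef, detail_coef):
    # Fine-resolution prediction on an arbitrary day: the reconstructed
    # coarse value plus the modelled spatial detail.
    coarse_val = harmonic_design([day]) @ coarse_coef
    detail_val = np.polyval(detail_coef, day)
    return float(coarse_val[0] + detail_val)

# Example with synthetic per-pixel series (day of year vs. reflectance-like values).
t_coarse = np.arange(1, 366, 8)                      # coarse observations every 8 days
coarse = 0.3 + 0.1 * np.sin(2 * np.pi * t_coarse / 365.0)
t_fine = np.array([20, 110, 200, 290])               # days with clear fine images
detail = np.array([0.02, 0.03, 0.025, 0.02])         # fine minus reconstructed coarse

coarse_coef = fit_coarse_model(t_coarse, coarse)
detail_coef = fit_detail_model(t_fine, detail)
print(predict_fine(150, coarse_coef, detail_coef))   # fine-resolution estimate for day 150

In practice this would be applied band by band and pixel by pixel across the full image stacks; the sketch omits the upscaling of fine images to the coarse grid, cloud screening, and other preprocessing steps described in the thesis.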
Rights: All rights reserved
Access: restricted access

Files in This Item:
File: 991022385444803411.pdf
Description: For All Users (off-campus access for PolyU Staff & Students only)
Size: 5.23 MB
Format: Adobe PDF

Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/10565