|Title:||Accuracy assessment for spatiotemporal data fusion|
|Advisors:||Zhu, Xiaolin (LSGI)|
|Subject:||Online social networks -- China -- Hong Kong|
Spatial behavior -- Social aspects -- Data processing
Hong Kong Polytechnic University -- Dissertations
|Department:||Faculty of Construction and Environment|
Department of Land Surveying and Geo-Informatics
|Pages:||x, 125 pages : color illustrations|
|Abstract:||Dense time-series satellite imagery with high spatial resolution is of great value in remote sensing studies. However, it is difficult for existing satellite sensors to acquire images with both high temporal and high spatial resolution. With the increasing demand for such imagery, spatiotemporal data fusion has developed rapidly in recent years. Accuracy assessment is an essential part of spatiotemporal data fusion, but there is still no agreement on which spatiotemporal image fusion method performs best. In general, accuracy assessment of spatiotemporal data fusion evaluates both the spectral reflectance accuracy and the spatial information similarity between a fused image and the corresponding true image. In existing studies, the accuracy of fused images and the performance of fusion methods are evaluated mainly by indices that focus on spectral reflectance accuracy, which cannot assess the spatial information of the fused images or the algorithms' ability to preserve spatial details. As different metrics have different strengths, it is of great importance to find a combination of indices that can evaluate the accuracy of fused images in both the spectral and spatial domains. This study first simulated six types of fused images based on the categories of existing spatiotemporal data fusion methods, and then tested the performance of 13 metrics for assessing the accuracy of the simulated fused images.
These 13 metrics include five widely used ones (correlation coefficient, structural similarity index (SSIM), root mean square error (RMSE), average absolute difference (AAD), and average difference (AD)); a simple yet flexible index (SIFI); four textural indices (the differences in Moran's I, contrast, entropy, and standard deviation between the fused and true images); and the F1-score, producer's accuracy, and user's accuracy of edge detection between the fused and true images. The results suggest that (1) the existing widely used indices and SIFI are better suited to assessing spectral reflectance error, while the textural and edge-based indices capture spatial information better; (2) the optimal set consists of four indices, namely correlation coefficient, RMSE, difference of contrast, and F1-score, which together cover accuracy assessment in both the spectral and spatial aspects; and (3) when the proposed optimal set was applied to compare four typical spatiotemporal fusion methods, the four indices could well distinguish the strengths of the different methods. The findings of this study can help the data fusion field develop a standard accuracy assessment approach and facilitate the further development of new spatiotemporal data fusion algorithms.|
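The four indices in the proposed optimal set can be illustrated with a minimal NumPy sketch. This is not the thesis's implementation: the grey-level quantization (8 levels, horizontal co-occurrence only) used for the contrast difference and the gradient threshold used for the edge maps are illustrative assumptions.

```python
import numpy as np


def correlation_coefficient(fused, true):
    # Pearson correlation between the flattened fused and true images
    return float(np.corrcoef(fused.ravel(), true.ravel())[0, 1])


def rmse(fused, true):
    # Root mean square error of per-pixel reflectance differences
    return float(np.sqrt(np.mean((fused - true) ** 2)))


def glcm_contrast(img, levels=8):
    # Contrast from a simple horizontal grey-level co-occurrence matrix;
    # 8 quantization levels are an illustrative choice, not a standard
    q = np.floor((img - img.min()) / (np.ptp(img) + 1e-12) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    return float(np.sum(glcm * (i - j) ** 2))


def contrast_difference(fused, true, levels=8):
    # Difference of contrast between fused and true images
    return glcm_contrast(fused, levels) - glcm_contrast(true, levels)


def edge_f1(fused, true, thresh=0.1):
    # F1-score of thresholded gradient-magnitude edge maps;
    # the 0.1 threshold is an illustrative assumption
    def edges(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy) > thresh

    ef, et = edges(fused), edges(true)
    tp = np.sum(ef & et)
    fp = np.sum(ef & ~et)
    fn = np.sum(~ef & et)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return float(2 * precision * recall / (precision + recall))
```

For a fused image identical to the true image, the sketch yields a correlation coefficient of 1, an RMSE and contrast difference of 0, and an edge F1-score of 1, so deviations from these values quantify spectral error (correlation coefficient, RMSE) and spatial-detail loss (contrast difference, edge F1) respectively.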
|Rights:||All rights reserved|
Files in This Item:
|991022385340503411.pdf||For All Users (off-campus access for PolyU Staff & Students only)||6.73 MB||Adobe PDF||View/Open|