
Combining spatiotemporal fusion and object-based image analysis for improving wetland mapping in complex and heterogeneous urban landscapes


Abstract: Remote sensing has proven promising for wetland mapping. However, conventional methods in complex and heterogeneous urban landscapes usually rely on mono-temporal Landsat TM/ETM+ images, which carry great uncertainty due to the spectral similarity of different land covers, and pixel-based classifications may not meet accuracy requirements. This paper proposes an approach that combines spatiotemporal fusion and object-based image analysis: the spatial and temporal adaptive reflectance fusion model is used to generate a time series of Landsat 8 OLI images on dates critical for sedge swamp and paddy rice, and a MODIS NDVI time series is used to calculate phenological parameters for identifying wetlands with an object-based method. The results of a case study indicate that different types of wetlands can be successfully identified, with an overall accuracy of 92.38%, a Kappa coefficient of 0.85, and user's accuracies of 85% and 90% for sedge swamp and paddy rice, respectively.
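The accuracy figures reported above (overall accuracy, Kappa coefficient, and per-class user's accuracy) are standard confusion-matrix statistics. As a minimal sketch of how they are derived, the snippet below computes all three from a small, entirely hypothetical confusion matrix; the class labels and counts are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical confusion matrix (rows = reference, columns = classified)
# for three illustrative classes: sedge swamp, paddy rice, other.
cm = np.array([
    [85,  5, 10],
    [ 4, 90,  6],
    [ 6,  5, 89],
])

total = cm.sum()

# Overall accuracy: proportion of correctly classified samples (matrix diagonal).
overall_accuracy = np.trace(cm) / total

# Kappa coefficient: agreement beyond what chance marginals would predict.
row_marginals = cm.sum(axis=1)   # reference totals per class
col_marginals = cm.sum(axis=0)   # classified totals per class
expected = (row_marginals * col_marginals).sum() / total**2
kappa = (overall_accuracy - expected) / (1 - expected)

# User's accuracy per class: correct / all samples classified as that class.
users_accuracy = np.diag(cm) / col_marginals

print(f"Overall accuracy: {overall_accuracy:.2%}")
print(f"Kappa: {kappa:.2f}")
print(f"User's accuracies: {users_accuracy.round(3)}")
```

User's accuracy is computed over matrix columns (commission errors seen by a map user), which is why it can differ sharply from overall accuracy for individual wetland classes.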

Keywords: heterogeneous urban; object based; complex heterogeneous; wetland mapping; fusion; spatiotemporal fusion

Journal Title: Geocarto International
Year Published: 2019



