Due to the complexity of backscattering mechanisms in built-up areas, synthetic aperture radar (SAR)-based mapping of floodwater in urban areas remains challenging. Open areas affected by flooding exhibit low backscatter because calm water surfaces reflect the signal specularly, away from the sensor. Floodwater within built-up areas instead produces double-bounce effects, whose strength depends on the position of the floodwater with respect to the facades of the surrounding buildings. It has been shown that analyzing interferometric SAR coherence reduces the underdetection of floods in urbanized areas, and the potential of deep convolutional neural networks for advancing SAR-based flood mapping is widely acknowledged. We therefore introduce an urban-aware U-Net model that uses dual-polarization Sentinel-1 multitemporal intensity and coherence data to map the extent of flooding in urban environments. The model incorporates a priori information (i.e., an SAR-derived probabilistic urban mask) through the proposed urban-aware module, which consists of channel-wise attention and urban-aware normalization submodules that calibrate features and improve the final predictions. In this study, Sentinel-1 single-look complex data acquired over four study sites on three continents are considered. Qualitative evaluation and quantitative analysis were carried out on six urban flood cases. A comparison with previous methods reveals a significant improvement in urban flood mapping accuracy: with our method, the F1 score for flooded urban areas increased from 0.3 to 0.6 with few false alarms in urban areas. Experimental results indicate that the proposed model, trained with limited datasets, has strong potential for near-real-time urban flood mapping.
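The abstract does not specify the internals of the urban-aware module, but its two named submodules follow well-known patterns. The following is a minimal NumPy sketch, under the assumption that channel-wise attention is a squeeze-and-excitation-style gate and that urban-aware normalization is an instance normalization whose scale and shift are modulated per pixel by the probabilistic urban mask; all function names, weight shapes, and the simple mask modulation are hypothetical, not the authors' implementation.

```python
import numpy as np


def channel_attention(x, w1, w2):
    """Squeeze-and-excitation-style channel gate (assumed form).

    x:  (C, H, W) feature map
    w1: (C // r, C) bottleneck weights, w2: (C, C // r) expansion weights
    """
    s = x.mean(axis=(1, 2))                  # squeeze: global average per channel
    h = np.maximum(w1 @ s, 0.0)              # bottleneck + ReLU
    g = 1.0 / (1.0 + np.exp(-(w2 @ h)))      # sigmoid gate in (0, 1) per channel
    return x * g[:, None, None]              # recalibrate each channel


def urban_aware_norm(x, urban_mask, eps=1e-5):
    """Instance-normalize features, then modulate by the urban mask (assumed form).

    x:          (C, H, W) feature map
    urban_mask: (H, W) probabilistic urban mask in [0, 1]
    """
    mu = x.mean(axis=(1, 2), keepdims=True)
    sd = x.std(axis=(1, 2), keepdims=True)
    xn = (x - mu) / (sd + eps)               # per-channel instance normalization
    # Hypothetical modulation: scale and shift grow where the urban mask is high,
    # so features inside built-up areas are calibrated differently from open areas.
    return xn * (1.0 + urban_mask) + urban_mask


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    C, H, W, r = 4, 8, 8, 2
    feats = rng.standard_normal((C, H, W))   # e.g. intensity/coherence features
    mask = rng.uniform(0.0, 1.0, (H, W))     # SAR-derived urban probability
    w1 = rng.standard_normal((C // r, C))
    w2 = rng.standard_normal((C, C // r))
    out = urban_aware_norm(channel_attention(feats, w1, w2), mask)
    print(out.shape)
```

In this reading, the attention gate reweights channels globally, while the mask-conditioned normalization injects the spatial prior locally, which matches the abstract's description of calibrating features with a priori urban information.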