ABSTRACT This paper focuses on the spatial quality assessment of pan-sharpened imagery, which contains valuable information from both input images. Its aim is to show that fusion functions respond differently to different types of landscapes. It compares an object-level quality assessment procedure with a conventional pixel-level procedure that assigns uniform quality scores to all pixels of a pan-sharpened image. To do so, after a series of pan-sharpening evaluations, a weighted procedure for the spatial quality assessment of pan-sharpening products is proposed, which allocates spatially varying weight factors to image pixels in proportion to their spatial information content. All experiments are performed on five high-resolution image datasets, acquired from WorldView-2, QuickBird, and IKONOS, with fusion products generated by three common pan-sharpening algorithms. Experimental results show that the spatial distortion of fused images over vegetation cover exceeds that over man-made structures, by more than 4% in some cases. The proposed procedure precludes the illogical fidelity estimates that arise when pan-sharpened images contain different land covers. Since particular image structures are of high importance in remote sensing applications, the procedure provides a purpose-oriented estimation of the spatial quality of pan-sharpened images in comparison with conventional procedures.
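The sketch below illustrates the general idea of a spatially weighted quality score, not the authors' exact formulation: local spatial information content is approximated by the gradient magnitude of the panchromatic band, per-pixel fidelity by the windowed correlation of high-pass details between a fused band and the panchromatic band, and the final score is the weight-averaged fidelity rather than a uniform per-pixel average. The proxy measures and window size are assumptions made for illustration only.

```python
# Minimal sketch of a spatially weighted spatial-quality score.
# Assumptions (not taken from the paper): gradient magnitude as the proxy for
# spatial information content, local high-pass correlation as per-pixel fidelity.
import numpy as np
from scipy import ndimage


def spatial_weights(pan: np.ndarray) -> np.ndarray:
    """Per-pixel weights proportional to spatial information content (gradient magnitude)."""
    gx = ndimage.sobel(pan.astype(float), axis=0)
    gy = ndimage.sobel(pan.astype(float), axis=1)
    grad = np.hypot(gx, gy)
    return grad / (grad.sum() + 1e-12)  # normalise so the weights sum to 1


def weighted_spatial_quality(fused_band: np.ndarray, pan: np.ndarray, size: int = 7) -> float:
    """Weighted spatial fidelity: local correlation of high-pass details,
    averaged with spatially varying weights instead of a uniform score."""
    # High-pass details of the fused band and of the panchromatic band.
    hp_f = fused_band.astype(float) - ndimage.uniform_filter(fused_band.astype(float), size)
    hp_p = pan.astype(float) - ndimage.uniform_filter(pan.astype(float), size)

    # Windowed correlation between the two high-pass images.
    mean_f = ndimage.uniform_filter(hp_f, size)
    mean_p = ndimage.uniform_filter(hp_p, size)
    cov = ndimage.uniform_filter(hp_f * hp_p, size) - mean_f * mean_p
    var_f = ndimage.uniform_filter(hp_f ** 2, size) - mean_f ** 2
    var_p = ndimage.uniform_filter(hp_p ** 2, size) - mean_p ** 2
    local_corr = cov / np.sqrt(np.maximum(var_f * var_p, 1e-12))

    # Weighted average: pixels with richer spatial content contribute more.
    w = spatial_weights(pan)
    return float(np.sum(w * local_corr))


# Example with synthetic stand-ins for one fused multispectral band and the PAN band.
pan = np.random.rand(256, 256)
fused = pan + 0.05 * np.random.rand(256, 256)
print(weighted_spatial_quality(fused, pan))
```

Because the weights concentrate on edge-rich regions such as man-made structures, a score of this form responds to the land-cover-dependent distortions the abstract describes, whereas a uniform per-pixel average would dilute them.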
               