Depth-image-based-rendering (DIBR) techniques are significant for 3D video applications, e.g., 3D television and free viewpoint video (FVV). Unfortunately, the DIBR-synthesized image suffers from various distortions, which induce an annoying viewing experience for the entire FVV. Proposing a quality evaluator for DIBR-synthesized images is therefore fundamental to the design of perceptually friendly FVV systems. Since the associated reference image is usually not accessible, full-reference (FR) methods cannot be directly applied to quality evaluation of the synthesized image. In addition, most traditional no-reference (NR) methods fail to effectively measure the distortions specific to DIBR. In this paper, we propose a novel NR quality evaluation method accounting for two categories of DIBR-related artifacts, i.e., geometric distortions and sharpness degradation. First, the disoccluded regions, as one of the most obvious geometric distortions, are captured by analyzing local similarity. Then, another typical geometric distortion (i.e., stretching) is detected and measured by calculating the similarity between the stretched region and its equal-size adjacent region. Second, exploiting the property of scale invariance, the global sharpness is measured as the distance between the distorted image and its downsampled version. Finally, the perceptual quality is estimated by linearly pooling the scores of the two geometric distortions and sharpness. Experimental results verify the superiority of the proposed method over prevailing FR and NR metrics. More specifically, it is superior to all competing methods except APT in terms of effectiveness, but greatly outmatches APT in terms of implementation time.
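The sharpness and pooling steps described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the block-averaging downsampler, the mean-absolute-difference distance, and the equal weights are all assumptions standing in for the operators the paper defines.

```python
import numpy as np

def global_sharpness(img, scale=2):
    """Sketch of the scale-invariance idea: compare an image against its
    downsampled version (here, downsampled then upsampled back for a
    same-size comparison). The exact distance used in the paper may differ."""
    h, w = img.shape
    h2, w2 = h // scale * scale, w // scale * scale
    # Downsample by simple block averaging (a stand-in operator).
    blocks = img[:h2, :w2].reshape(h2 // scale, scale, w2 // scale, scale)
    down = blocks.mean(axis=(1, 3))
    # Upsample back by nearest-neighbour repetition.
    up = np.repeat(np.repeat(down, scale, axis=0), scale, axis=1)
    # Sharper images lose more high-frequency detail under downsampling,
    # so the mean absolute distance is larger for them.
    return float(np.mean(np.abs(img[:h2, :w2] - up)))

def pooled_quality(s_disocclusion, s_stretching, s_sharpness,
                   weights=(1 / 3, 1 / 3, 1 / 3)):
    """Linear pooling of the two geometric-distortion scores and the
    sharpness score; the weights here are hypothetical placeholders."""
    return (weights[0] * s_disocclusion
            + weights[1] * s_stretching
            + weights[2] * s_sharpness)
```

As a sanity check, a textured image scores higher on `global_sharpness` than a flat one, since block averaging removes detail only where detail exists.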