This work focuses on the generation of three-dimensional (3D) scene information, as well as the fusion of real and virtual 3D scene information, for the full-parallax holographic stereogram based on the effective perspective images' segmentation and mosaicking (EPISM) method. An improved depth-image-based rendering (DIBR) method was used to generate virtual viewpoint images of the real 3D scene, and regularization and densification models of the degraded light field were established; as a result, the sampled real light field was reconstructed. Combined with the computer-rendered virtual 3D scene information, a "real + virtual" light-field fusion method based on pixel affine projection was proposed to fuse the real and virtual 3D scenes. The fused information was then processed by EPISM encoding and holographically printed. The optical experiment results showed that the full-parallax holographic stereogram of the fused real and virtual 3D scene could be correctly printed and reconstructed, which validated the effectiveness of the proposed method.
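For orientation, the sketch below illustrates the core DIBR step the abstract refers to: forward-warping a captured reference view to a nearby virtual viewpoint using its depth map. It is a minimal illustration, not the authors' implementation; the function name, the pinhole camera model with a purely horizontal baseline shift, and the hole value are all assumptions, and the hole-filling/densification described in the paper is not included.

```python
# Minimal DIBR forward-warp sketch (illustrative only, not the paper's code).
import numpy as np

def dibr_forward_warp(ref_rgb, ref_depth, focal_px, baseline_m, hole_val=0):
    """Warp a reference view to a horizontally shifted virtual viewpoint.

    ref_rgb    : (H, W, 3) reference color image
    ref_depth  : (H, W) depth in meters (larger = farther)
    focal_px   : focal length in pixels
    baseline_m : signed horizontal shift of the virtual camera in meters
    """
    h, w, _ = ref_rgb.shape
    virt = np.full_like(ref_rgb, hole_val)   # virtual view, holes marked with hole_val
    zbuf = np.full((h, w), np.inf)           # z-buffer for occlusion handling

    # Per-pixel horizontal disparity from the pinhole model: d = f * b / z
    disparity = focal_px * baseline_m / np.maximum(ref_depth, 1e-6)

    ys, xs = np.indices((h, w))
    xs_virt = np.round(xs - disparity).astype(int)
    valid = (xs_virt >= 0) & (xs_virt < w)

    # Forward splat; nearer surfaces overwrite farther ones.
    for y, x, xv in zip(ys[valid], xs[valid], xs_virt[valid]):
        z = ref_depth[y, x]
        if z < zbuf[y, xv]:
            zbuf[y, xv] = z
            virt[y, xv] = ref_rgb[y, x]

    # Remaining holes (== hole_val) would be handled by the paper's
    # regularization/densification step before EPISM encoding.
    return virt
```

In the workflow described by the abstract, virtual viewpoint images produced this way would be combined with computer-rendered views of the virtual scene and fused before EPISM encoding and holographic printing.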
               