
Image Inpainting Based on Patch-GANs


In this paper, we propose a novel image inpainting framework that exploits both the holistic and the structural information of the damaged input image. Unlike existing models that complete damaged pictures using only the holistic features of the input, our method adopts patch-generative adversarial networks (Patch-GANs) equipped with multi-scale discriminators and an edge-processing function to extract holistic and structural features and restore the damaged images. After pre-training the Patch-GANs, the proposed network encourages the generator to find the best encoding of the damaged input images in the latent space using a combination of a reconstruction loss, an edge loss, and global and local guidance losses. The reconstruction and global guidance losses ensure the pixel-level reliability of the generated images, while the remaining losses guarantee content consistency between the local and global parts. Qualitative and quantitative experiments on multiple public datasets show that our approach produces more realistic images than several existing methods, demonstrating its effectiveness and superiority.
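To make the loss combination concrete, the following is a minimal sketch of how a generator objective of this form could be assembled in PyTorch. The weight values, the choice of L1 for the reconstruction and edge terms, and the binary cross-entropy form of the guidance losses are assumptions for illustration only and are not taken from the paper.

```python
import torch
import torch.nn.functional as F

def total_generator_loss(gen_img, gt_img, gen_edges, gt_edges,
                         d_global_fake, d_local_fake,
                         w_rec=1.0, w_edge=0.1, w_global=0.01, w_local=0.01):
    """Hypothetical combined objective: reconstruction + edge +
    global/local guidance losses. Weights and loss forms are assumed."""
    # Pixel-wise reconstruction loss over the whole image (L1 as a stand-in).
    loss_rec = F.l1_loss(gen_img, gt_img)

    # Edge loss: compares edge maps of the generated and ground-truth images.
    loss_edge = F.l1_loss(gen_edges, gt_edges)

    # Guidance losses from the global and local discriminators,
    # written here as non-saturating GAN losses (a placeholder choice).
    loss_global = F.binary_cross_entropy_with_logits(
        d_global_fake, torch.ones_like(d_global_fake))
    loss_local = F.binary_cross_entropy_with_logits(
        d_local_fake, torch.ones_like(d_local_fake))

    return (w_rec * loss_rec + w_edge * loss_edge
            + w_global * loss_global + w_local * loss_local)
```

In such a setup, `gen_img` and `gt_img` would be the inpainted and ground-truth images, `gen_edges`/`gt_edges` their edge maps, and `d_global_fake`/`d_local_fake` the discriminator logits for the generated image and its local patch; all of these names are illustrative.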

Keywords: image inpainting; Patch-GANs

Journal Title: IEEE Access
Year Published: 2019


