
Fine Tuning of Deep Contexts Toward Improved Perceptual Quality of In-Paintings.


In recent years, a number of deep learning approaches have been successfully introduced to tackle the problem of image in-painting and achieve better perceptual effects. However, obvious hole-edge artifacts still remain in these deep learning-based approaches and need to be rectified before they become useful for practical applications. In this article, we propose an iteration-driven in-painting approach, which combines the deep context model with the backpropagation mechanism to fine-tune the learning-based in-painting process and hence achieves further improvement over the existing state of the art. Our iterative approach fine-tunes the image generated by a pretrained deep context model via backpropagation using a weighted context loss. Extensive experiments on publicly available test sets, including the CelebA, Paris Streets, and PASCAL VOC 2012 datasets, show that our proposed method achieves better visual perceptual quality in terms of hole-edge artifacts compared with state-of-the-art in-painting methods using various context models.
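The abstract does not spell out implementation details, but the core idea of backpropagating a weighted context loss to refine a pretrained generator's output can be illustrated with a minimal PyTorch-style sketch. The generator interface, the latent-input optimisation, and the boundary-emphasised weighting below are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn.functional as F

def weighted_context_loss(generated, corrupted, mask, boundary_weight=5.0):
    """L1 loss over known (context) pixels, up-weighted near the hole edge.

    mask: 1 on known pixels, 0 inside the hole. The boundary emphasis is an
    assumed weighting scheme intended to suppress hole-edge artifacts; the
    paper's exact weighting is not given in the abstract.
    """
    hole = 1.0 - mask
    # Known pixels within a few pixels of the hole form the boundary band.
    near_hole = F.max_pool2d(hole, kernel_size=7, stride=1, padding=3) * mask
    weight = mask + boundary_weight * near_hole
    return (weight * (generated - corrupted).abs()).sum() / weight.sum().clamp(min=1.0)

def iterative_refine(generator, z, corrupted, mask, steps=200, lr=0.05):
    """Iteratively fine-tune the in-painting by backpropagating the weighted
    context loss into the generator input z, keeping the pretrained deep
    context model frozen (hypothetical interface)."""
    z = z.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        generated = generator(z)                     # candidate in-painting
        loss = weighted_context_loss(generated, corrupted, mask)
        loss.backward()                              # fine-tuning via backprop
        opt.step()
    with torch.no_grad():
        refined = generator(z)
    # Keep the observed context and paste in the optimised hole content.
    return mask * corrupted + (1.0 - mask) * refined
```

In this sketch only the generator's input is optimised while its weights stay fixed; whether the paper updates the latent input, intermediate activations, or other quantities is not stated in the abstract.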

Keywords: quality; tuning deep; deep contexts; perceptual quality; fine tuning; contexts toward

Journal Title: IEEE transactions on cybernetics
Year Published: 2021
