EMOCGAN: a novel evolutionary multiobjective cyclic generative adversarial network and its application to unpaired image translation

Generative adversarial networks (GANs) have been accepted as powerful models in computer vision, speech and language processing, and related fields. However, a major concern regarding GANs is their requirement of paired images for image-to-image translation, which is not always available in real-world applications. They also suffer from training instability and the mode collapse problem. These concerns remain open challenges for GANs and become more complex in the case of cyclic GANs. Motivated by evolutionary GANs, we propose a novel evolutionary multiobjective cyclic GAN (EMOCGAN) to address these challenges in cyclic GAN training for image-to-image translation. We also introduce a new approach to model training that integrates evolutionary computation, multiobjective optimization, and the cyclic GAN together with different selection mechanisms. To overcome stagnation in local optima, the Metropolis acceptance criterion and Pareto-based selection on two scores (objective functions) are utilized; these evolutionary concepts help to address the instability and mode collapse problems. Extensive experiments on real-world image datasets show that EMOCGAN outperforms the state-of-the-art method in producing visually realistic results while retaining background information and salient objects. Quantitative comparisons with the cyclic GAN also show significantly better scores for our model, as measured by the structural similarity index (SSIM) and the universal quality index (UQI). The model achieves its best SSIM on Apple $$\leftrightarrow$$ Orange, while it shows higher UQI values on the Monet $$\leftrightarrow$$ Picture and Summer $$\leftrightarrow$$ Winter datasets.
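
The abstract does not give the algorithmic details of the selection step, but the idea of combining Pareto-based selection over two objective scores with a Metropolis acceptance rule can be illustrated with a minimal sketch. The function names, the "higher is better" scoring convention, and the fallback of ranking by the summed score are assumptions made for illustration, not the paper's actual procedure.

```python
import math
import random

# Hypothetical illustration (not the paper's code): Pareto-based survivor
# selection over two objective scores, with a Metropolis-style acceptance
# rule that lets occasionally worse candidates survive to escape local optima.
# Scores are assumed to follow a "higher is better" convention.

def dominates(a, b):
    """True if candidate score tuple a Pareto-dominates b:
    at least as good on every objective, strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def metropolis_accept(new_score, old_score, temperature):
    """Accept a non-improving candidate with probability
    exp((new - old) / T), as in simulated annealing."""
    if new_score >= old_score:
        return True
    return random.random() < math.exp((new_score - old_score) / temperature)

def select_survivors(population, scores, num_survivors, temperature=1.0):
    """Keep non-dominated candidates first; fill any remaining slots by
    Metropolis acceptance of the summed score against the current worst survivor."""
    indices = list(range(len(population)))
    non_dominated = [i for i in indices
                     if not any(dominates(scores[j], scores[i]) for j in indices if j != i)]
    survivors = non_dominated[:num_survivors]
    rest = [i for i in indices if i not in survivors]
    rest.sort(key=lambda i: sum(scores[i]), reverse=True)
    for i in rest:
        if len(survivors) >= num_survivors:
            break
        worst = min(survivors, key=lambda j: sum(scores[j]))
        if metropolis_accept(sum(scores[i]), sum(scores[worst]), temperature):
            survivors.append(i)
    return [population[i] for i in survivors]

# Example with dummy generator variants and two per-variant scores
variants = ["g1", "g2", "g3", "g4"]
scores = [(0.8, 0.6), (0.7, 0.7), (0.5, 0.9), (0.4, 0.4)]
print(select_survivors(variants, scores, num_survivors=2))
```

In an evolutionary GAN setting such a selection step would be applied to mutated generator variants after each training round, with the two scores playing the role of the paper's objective functions.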

Keywords: generative adversarial; novel evolutionary; image; cyclic gan; image translation

Journal Title: Neural Computing and Applications
Year Published: 2021
