SCGN: novel generative model using the convergence of latent space by training

Generative models such as variational autoencoders (VAEs) and generative adversarial networks (GANs) have recently been applied to a wide range of fields. However, VAEs suffer from blurry outputs and GANs from mode collapse. Here, the authors propose a novel generative model, the self-converging generative network (SCGN), to address these issues. Self-converging refers to latent vectors converging to fixed values as they are trained in pairs with the training data, which allows the SCGN to reconstruct every training sample. In the authors' model, the latent vectors and the weights of the generator are trained alternately. The latent vectors are trained to follow a normal distribution using a loss function derived from the Kullback-Leibler divergence together with a pixel-wise loss, while the generator weights are adjusted so that the generator reproduces the training data under a pixel-wise loss. As a result, the SCGN does not suffer from the mode collapse seen in GANs and produces sharper images than VAEs because it does not rely on sampling. Moreover, the SCGN successfully learned the manifold of the dataset in extensive experiments on CelebA.
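
To make the alternating training scheme described in the abstract concrete, the following PyTorch-style sketch shows one plausible reading of it. This is not the authors' code: the generator architecture, latent dimensionality, optimizers, KL weight, choice of MSE as the pixel-wise loss, and the batch-statistics form of the KL term are all assumptions made for illustration. One learnable latent vector is paired with each training image; the latent vectors are first updated with a pixel-wise loss plus a KL penalty toward a standard normal, and the generator weights are then updated with the pixel-wise loss alone.

    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        """Small MLP generator: latent vector -> flattened image (toy sizes)."""
        def __init__(self, latent_dim=64, img_dim=28 * 28):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(latent_dim, 256), nn.ReLU(),
                nn.Linear(256, img_dim), nn.Sigmoid(),
            )

        def forward(self, z):
            return self.net(z)

    def kl_to_standard_normal(z):
        # KL(N(mu, sigma^2) || N(0, I)) computed from the batch statistics of z,
        # one plausible form of the paper's "loss derived from the KL divergence".
        mu = z.mean(dim=0)
        var = z.var(dim=0, unbiased=False) + 1e-8
        return 0.5 * torch.sum(var + mu ** 2 - 1.0 - torch.log(var))

    n_train, latent_dim, img_dim = 1000, 64, 28 * 28
    images = torch.rand(n_train, img_dim)                     # stand-in for real data
    latents = nn.Parameter(torch.randn(n_train, latent_dim))  # one trainable z per image

    gen = Generator(latent_dim, img_dim)
    opt_z = torch.optim.Adam([latents], lr=1e-3)
    opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
    pixel_loss = nn.MSELoss()

    for step in range(1000):
        idx = torch.randperm(n_train)[:128]   # mini-batch of (image, latent) pairs
        x = images[idx]

        # Step 1: update the latent vectors (pixel-wise loss + KL toward N(0, I)).
        opt_z.zero_grad()
        z = latents[idx]
        loss_z = pixel_loss(gen(z), x) + 1e-3 * kl_to_standard_normal(z)
        loss_z.backward()
        opt_z.step()

        # Step 2: update the generator weights (pixel-wise loss only).
        opt_g.zero_grad()
        loss_g = pixel_loss(gen(latents[idx].detach()), x)
        loss_g.backward()
        opt_g.step()

Because each training image keeps its own latent vector, no encoder or stochastic sampling step is needed at training time, which is consistent with the abstract's claim that the sharper images come from avoiding sampling.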

Keywords: convergence of latent space; generative model; novel generative model; latent vectors

Journal Title: Electronics Letters
Year Published: 2020
