
Improving GANs with a Feature Cycling Generator

(2210.09638)
Published Oct 18, 2022 in cs.CV and cs.AI

Abstract

Generative adversarial networks (GANs), built with a generator and a discriminator, have significantly advanced image generation. Typically, existing papers build their generators by stacking multiple residual blocks, since this eases generator training. However, some papers have noted the limitations of the residual block and proposed new architectural units that improve GAN performance. Following this trend, this paper presents a novel unit, called the feature cycling block (FCB), which achieves impressive results in the image generation task. Specifically, the FCB has two branches: one is a memory branch and the other is an image branch. The memory branch keeps meaningful information at each stage of the generator, whereas the image branch takes useful features from the memory branch to produce a high-quality image. To show the capability of the proposed method, we conducted extensive experiments on various datasets, including CIFAR-10, CIFAR-100, FFHQ, AFHQ, and subsets of LSUN. Experimental results demonstrate the substantial superiority of our approach over the baseline without introducing any additional objective functions or training tricks. For instance, the proposed method improves the Fréchet inception distance (FID) of StyleGAN2 from 4.89 to 3.72 on the FFHQ dataset and from 6.64 to 5.57 on the LSUN Bed dataset. We believe that the pioneering attempt presented in this paper could inspire the community to design better generator architectures and to develop training objectives or techniques compatible with the proposed method.
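The abstract describes the FCB only at a high level (a memory branch that carries information across generator stages and an image branch that draws on it); the exact internals are not given here. The sketch below is a rough, hypothetical PyTorch illustration of how such a two-branch block could be wired; all module names, layer choices, and the concatenation-based exchange are assumptions, not the authors' design.

```python
# Hypothetical sketch of a two-branch "feature cycling" style block.
# This is NOT the paper's FCB; wiring and layer choices are illustrative assumptions.
import torch
import torch.nn as nn


class FeatureCyclingBlockSketch(nn.Module):
    """Toy block with a memory branch (carried across stages) and an
    image branch that pulls features from the memory branch."""

    def __init__(self, channels: int):
        super().__init__()
        # Memory branch: refreshes the features carried from earlier stages.
        self.memory_update = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.LeakyReLU(0.2),
        )
        # Image branch: mixes image features with the (updated) memory features.
        self.image_update = nn.Sequential(
            nn.Conv2d(channels * 2, channels, kernel_size=3, padding=1),
            nn.LeakyReLU(0.2),
        )

    def forward(self, image_feat: torch.Tensor, memory_feat: torch.Tensor):
        # Update the memory branch at this stage.
        memory_feat = self.memory_update(memory_feat)
        # Fold memory features into the image branch via channel concatenation.
        image_feat = self.image_update(torch.cat([image_feat, memory_feat], dim=1))
        return image_feat, memory_feat


# Usage sketch: chain several blocks, passing both branches stage to stage.
if __name__ == "__main__":
    blocks = nn.ModuleList([FeatureCyclingBlockSketch(64) for _ in range(3)])
    img = torch.randn(2, 64, 16, 16)
    mem = torch.randn(2, 64, 16, 16)
    for block in blocks:
        img, mem = block(img, mem)
    print(img.shape, mem.shape)  # torch.Size([2, 64, 16, 16]) for both
```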
