Triple Generative Adversarial Nets

Authors

  • Chongxuan Li
  • Kun Xu
  • Jun Zhu
  • Bo Zhang
Abstract

Generative Adversarial Nets (GANs) have shown promise in image generation and semi-supervised learning (SSL). However, existing GANs in SSL have two problems: (1) the generator and the discriminator (i.e. the classifier) may not be optimal at the same time; and (2) the generator cannot control the semantics of the generated samples. The problems essentially arise from the two-player formulation, where a single discriminator shares incompatible roles of identifying fake samples and predicting labels and it only estimates the data without considering the labels. To address the problems, we present triple generative adversarial net (Triple-GAN), which consists of three players—a generator, a discriminator and a classifier. The generator and the classifier characterize the conditional distributions between images and labels, and the discriminator solely focuses on identifying fake image-label pairs. We design compatible utilities to ensure that the distributions characterized by the classifier and the generator both converge to the data distribution. Our results on various datasets demonstrate that Triple-GAN as a unified model can simultaneously (1) achieve the state-of-the-art classification results among deep generative models, and (2) disentangle the classes and styles of the input and transfer smoothly in the data space via interpolation in the latent space class-conditionally.
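The three-player formulation in the abstract can be sketched numerically. The following is a minimal toy illustration, not the authors' implementation: the discriminator D scores (image, label) pairs, and its utility rewards real pairs while penalizing pairs produced by the classifier C (with weight alpha) and by the generator G (with weight 1 - alpha). The `discriminator` function, the weight vector `w`, and `alpha=0.5` are all hypothetical placeholders.

```python
# Toy sketch of a Triple-GAN-style three-player utility (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def discriminator(x, y, w):
    # Hypothetical linear discriminator over an (image, label) pair.
    return sigmoid(x @ w[:-1] + y * w[-1])

def triple_gan_utility(real_pairs, cls_pairs, gen_pairs, w, alpha=0.5):
    """D's utility: log-likelihood of real pairs, plus weighted
    log(1 - D) terms for classifier-made and generator-made pairs."""
    d_real = discriminator(*real_pairs, w)
    d_cls = discriminator(*cls_pairs, w)
    d_gen = discriminator(*gen_pairs, w)
    return (np.mean(np.log(d_real))
            + alpha * np.mean(np.log(1.0 - d_cls))
            + (1.0 - alpha) * np.mean(np.log(1.0 - d_gen)))

# Toy data: 4-dimensional "images" with binary labels.
x = rng.normal(size=(8, 4))
y = rng.integers(0, 2, size=8).astype(float)
w = rng.normal(size=5)
v = triple_gan_utility((x, y), (x, y), (x, y), w)
```

In the paper's game, D maximizes this utility while C and G minimize their respective terms; since every term is the log of a probability in (0, 1), the utility is always negative.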


Related articles

Conditional Generative Adversarial Nets

Generative Adversarial Nets [8] were recently introduced as a novel way to train generative models. In this work we introduce the conditional version of generative adversarial nets, which can be constructed by simply feeding the data, y, we wish to condition on to both the generator and discriminator. We show that this model can generate MNIST digits conditioned on class labels. We also illustr...


Bitewing Radiography Semantic Segmentation Base on Conditional Generative Adversarial Nets

Jiang Yun; Tan Ning; Zhang Hai; Peng Tingting. Abstract: Currently, segmentation of bitewing radiography images is a very challenging task. The focus of the study is to segment it into caries, enamel, dentin, pulp, crowns, restoration and root canal treatments. The main method of semantic segmentation of bitewing...


Temporal Generative Adversarial Nets

In this paper, we propose a generative model, Temporal Generative Adversarial Nets (TGAN), which can learn a semantic representation of unlabeled videos, and is capable of generating videos. Unlike existing Generative Adversarial Nets (GAN)-based methods that generate videos with a single generator consisting of 3D deconvolutional layers, our model exploits two different types of generators: a ...


Generative Adversarial Nets with Labeled Data by Activation Maximization

In this paper, we study the impact and role of multi-class labels on adversarial training for generative adversarial nets (GANs). Our derivation of the gradient shows that the current GAN model with labeled data still results in undesirable properties due to the overlay of the gradients from multiple classes. We thus argue that a better gradient should follow the intensity and direction that ma...


Conditional generative adversarial nets for convolutional face generation

We apply an extension of generative adversarial networks (GANs) [8] to a conditional setting. In the GAN framework, a “generator” network is tasked with fooling a “discriminator” network into believing that its own samples are real data. We add the capability for each network to condition on some arbitrary external data which describes the image being generated or discriminated. By varying the ...



Publication date: 2017