Generative Adversarial Networks.

By virtue of being here, it is assumed that you have already gone through the Quick Start.

Todo

The code is done, but the text still needs to be written. This tutorial will also explain how the network class is set up, because implementing a GAN requires inheriting from the network class and overriding some of its methods.
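As a rough sketch of what that inheritance looks like (the class name and the choice of methods overridden below are illustrative, not the toolbox's actual GAN implementation):

    from yann.network import network

    class gan(network):
        """Illustrative subclass: a GAN has two competing objectives, so
        the single-cost plumbing of the base network class is overridden."""

        def cook(self, *args, **kwargs):
            # Override cook() to build two cost graphs instead of one: a
            # discriminator cost on real vs. generated samples and a
            # generator cost, each with its own optimizer over its own
            # subset of parameters.
            raise NotImplementedError("sketch only")

        def train(self, *args, **kwargs):
            # Override train() to alternate updates: a few discriminator
            # steps per generator step, as in Goodfellow et al. (2014).
            raise NotImplementedError("sketch only")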

The full code for this tutorial, with additional commentary, can be found in the file pantry.tutorials.gan.py. If you have the toolbox cloned or downloaded, or just the tutorials downloaded, run the code as follows:
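The snippet below assumes the cook_mnist helper from the Quick Start; data.dataset_location() returns the cooked dataset's identifier. Substitute your own dataset id if your setup differs.

    from yann.special.datasets import cook_mnist
    from pantry.tutorials.gan import shallow_gan_mnist

    data = cook_mnist()   # cook the dataset, as in the Quick Start
    shallow_gan_mnist(dataset=data.dataset_location(), verbose=2)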

Referenced from

Goodfellow, Ian, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. “Generative adversarial nets.” In Advances in Neural Information Processing Systems, pp. 2672-2680. 2014.

Notes

This file contains several GAN implementations:

  1. Shallow GAN setup for MNIST
  2. Shallow Wasserstein GAN setup for MNIST (see the WGAN todo below)
  3. Deep GAN (Ian Goodfellow’s original implementation) setup for MNIST
  4. DCGAN (Chintala et al.) setup for CIFAR 10
  5. LS - DCGAN setup for CIFAR 10

Todos:

  • Convert the DCGANs for CELEBA.
  • WGAN doesn’t work properly because of clipping.
  • Check that the DCGANs' strides are properly set up.
pantry.tutorials.gan.deep_deconvolutional_gan(dataset, regularize=True, batch_norm=True, dropout_rate=0.5, verbose=1)[source]

This function is a demo of a generative adversarial network. This is example code: study it rather than merely running it. This method uses a few deconvolutional layers and is set up to produce images of size 32×32.

Parameters:
  • dataset – Supply a dataset.
  • regularize – True (default); supplied to layer arguments.
  • batch_norm – True (default); supplied to layer arguments.
  • dropout_rate – 0.5 (default); supplied to layer arguments.
  • verbose – Similar to the rest of the toolbox.
Returns: A Network object.

Return type: net

Notes

This method is set up for CIFAR 10.
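A hypothetical invocation (cook_cifar10 is assumed to exist in yann.special.datasets alongside cook_mnist; check your version of the toolbox for the exact helper name):

    from yann.special.datasets import cook_cifar10   # assumed helper
    from pantry.tutorials.gan import deep_deconvolutional_gan

    data = cook_cifar10()
    net = deep_deconvolutional_gan(dataset=data.dataset_location(),
                                   regularize=True,
                                   batch_norm=True,
                                   dropout_rate=0.5,
                                   verbose=2)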

pantry.tutorials.gan.deep_deconvolutional_lsgan(dataset, regularize=True, batch_norm=True, dropout_rate=0.5, verbose=1)[source]

This function is a demo of a generative adversarial network. This is example code: study it rather than merely running it. This method uses a few deconvolutional layers, as in the DCGAN paper, and is set up to produce images of size 32×32.

Parameters:
  • dataset – Supply a dataset.
  • regularize – True (default); supplied to layer arguments.
  • batch_norm – True (default); supplied to layer arguments.
  • dropout_rate – 0.5 (default); supplied to layer arguments.
  • verbose – Similar to the rest of the toolbox.
Returns: A Network object.

Return type: net

Notes

This method is set up for SVHN / CIFAR 10. This is an implementation of the least squares GAN with a = 0, b = 1 and c = 1 (equation 9 of [1]).

[1] Mao, Xudong, Qing Li, Haoran Xie, Raymond Y.K. Lau, and Zhen Wang. “Least Squares Generative Adversarial Networks.”
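With those constants, the two objectives of equation 9 reduce to the following. This is only a sketch in Theano tensor notation (the toolbox's backend); D_real and D_fake are hypothetical symbolic variables, not names from the tutorial code:

    import theano.tensor as T

    # Hypothetical symbolic discriminator outputs, one score per image:
    D_real = T.vector('D_real')   # discriminator on a batch of real images
    D_fake = T.vector('D_fake')   # discriminator on a batch of fakes

    # Least squares GAN losses with a = 0, b = 1 and c = 1 (equation 9 of [1]):
    discriminator_loss = (0.5 * T.mean(T.sqr(D_real - 1)) +
                          0.5 * T.mean(T.sqr(D_fake)))
    generator_loss = 0.5 * T.mean(T.sqr(D_fake - 1))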

pantry.tutorials.gan.deep_gan_mnist(dataset, verbose=1)[source]

This function is a demo of a generative adversarial network. This is example code: study it rather than merely running it.

Parameters:
  • dataset – Supply a dataset.
  • verbose – Similar to the rest of the toolbox.
Returns: A Network object.

Return type: net

Notes

This network mimics Ian Goodfellow's original code and implementation for MNIST, adapted from his source code: https://github.com/goodfeli/adversarial/blob/master/mnist.yaml. It might not be a perfect replication, but I tried as best as I could.

This method is set up for MNIST.

pantry.tutorials.gan.shallow_gan_mnist(dataset=None, verbose=1)[source]

This function is a demo of a generative adversarial network. This is example code: study it rather than merely running it.

Parameters:
  • dataset – Supply a dataset.
  • verbose – Similar to the rest of the toolbox.

Notes

This method is set up for MNIST.

pantry.tutorials.gan.shallow_wgan_mnist(dataset=None, verbose=1)[source]

This function is a demo of a Wasserstein generative adversarial network. This is example code: study it rather than merely running it.

Parameters:
  • dataset – Supply a dataset.
  • verbose – Similar to the rest of the toolbox.

Notes

This method is set up for MNIST. Everything in this code is the same as the shallow GAN above except for the loss functions.

Todo

This is not verified. There is some trouble with the weight clipping.
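For reference, a sketch (again in Theano notation, with hypothetical names, not identifiers from the tutorial code) of the Wasserstein losses and the clipping step in question:

    import numpy
    import theano
    import theano.tensor as T

    # Hypothetical symbolic critic outputs:
    D_real = T.vector('D_real')   # critic scores on real images
    D_fake = T.vector('D_fake')   # critic scores on generated images

    # WGAN replaces the vanilla GAN's log losses with linear ones:
    critic_loss = -(T.mean(D_real) - T.mean(D_fake))
    generator_loss = -T.mean(D_fake)

    # After each critic update, WGAN clips the critic's weights to a small
    # box to (crudely) enforce the Lipschitz constraint. This is the step
    # flagged above as troublesome. 'w' stands in for one critic weight.
    w = theano.shared(numpy.zeros((10, 10), dtype='float32'))
    clip_updates = [(w, T.clip(w, -0.01, 0.01))]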