gan - provides an inherited network class for a GAN network.

The file yann.special.gan.py contains the definition for a GAN-style network. Any GAN network can be built using this class. It is essentially a network inherited from the yann.network module.

Provides support for the implementation described in:

Goodfellow, Ian, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. “Generative adversarial nets.” In Advances in Neural Information Processing Systems, pp. 2672-2680. 2014.
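
For reference, the minimax game from that paper, which a GAN optimizes by alternating discriminator and generator updates (the toolbox exposes the two terms through the game_layers D(G(z)) and D(x) described below):

    \min_G \max_D \; V(D, G) =
        \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\left[\log D(x)\right] +
        \mathbb{E}_{z \sim p_z(z)}\left[\log\left(1 - D(G(z))\right)\right]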

Todo

There seems to be something wrong with the fine-tuning update: the code crashes after a call to _new_era. This needs debugging and fixing.

class yann.special.gan.gan(verbose=2, **kwargs)[source]

This class inherits from the network class and overrides several methods to support GAN networks.

Todo

Soumith Chintala suggests that it is better to separate the generator and the dataset when training the discriminator. Do that.

In __init__, kwargs = kwargs is not a good option. Check that it works.

Parameters: Same as the network class.
cook(objective_layers, discriminator_layers, generator_layers, game_layers, softmax_layer=None, classifier_layers=None, optimizer_params=None, verbose=2, **kwargs)[source]

This function builds the backprop network and creates the trainer, tester and validator Theano functions. The trainer is built for a particular objective layer and optimizer. A usage sketch follows the parameter list below.

Parameters:
  • optimizer_params – Supply optimizer_params.
  • datastream – Supply which datastream to use. Default is the last datastream created.
  • visualizer – Supply a visualizer to cook with.
  • objective_layers – Supply a tuple of layer ids of layers that have the objective functions (classification, discriminator, generator).
  • classifier – Supply the classifier layer of the discriminator.
  • discriminator – Supply the discriminator layer of the data stream.
  • generator – Supply the last generator layer.
  • generator_layers – List or tuple of all generator layers.
  • discriminator_layers – List or tuple of all discriminator layers.
  • classifier_layers – List or tuple of all classifier layers.
  • game_layers – List or tuple of two layers: the first is D(G(z)) and the second is D(x).
  • verbose – Similar to the rest of the toolbox.
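
A minimal sketch of a cook call, assuming the generator, discriminator, classifier and objective layers have already been added to the network; every layer id below is hypothetical and only illustrates the expected argument shapes:

    from yann.special.gan import gan

    net = gan(verbose=2)
    # ... add generator, discriminator, classifier and objective layers here ...
    optimizer_params = None   # or an optimizer dictionary; see the sketch further below
    net.cook(objective_layers=('nll', 'real', 'fake'),   # (classification, discriminator, generator)
             classifier_layers=('c1', 'c2', 'softmax'),
             discriminator_layers=('d1', 'd2'),
             generator_layers=('g1', 'g2', 'G(z)'),
             softmax_layer='softmax',
             game_layers=('D(G(z))', 'D(x)'),
             optimizer_params=optimizer_params,
             verbose=2)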
cook_discriminator(optimizer_params, verbose=2)[source]

This method cooks the real optimizer, i.e., the optimizer used for the discriminator's updates.

Parameters:
  • optimizer_params – Supply optimizer_params.
  • verbose – As always.
cook_generator(optimizer_params, verbose=2)[source]

This method cooks the fake optimizer, i.e., the optimizer used for the generator's updates.

Parameters:
  • optimizer_params – Supply optimizer_params.
  • verbose – As always.
cook_softmax_optimizer(optimizer_params, verbose=2)[source]

This method cooks the softmax optimizer.

Parameters:
  • optimizer_params – Supply optimizer_params.
  • verbose – As always.
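
The three cook_* methods above each set up one optimizer. Below is a sketch of the kind of optimizer_params dictionary they (and cook) take; the key names follow the toolbox's usual optimizer dictionary format, which is an assumption here, and the values are placeholders:

    # Illustrative only: assumed optimizer dictionary format, placeholder values.
    optimizer_params = {
        "momentum_type"  : 'polyak',
        "momentum_params": (0.65, 0.9, 30),
        "optimizer_type" : 'rmsprop',
        "id"             : 'main',
    }
    # Presumably the same dictionary passed to cook(optimizer_params=...) is used
    # when cooking the softmax, discriminator (real) and generator (fake) optimizers.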
initialize_train(verbose=2)[source]

Internal function that creates the train methods for the GAN network.

Parameters: verbose – As always.
print_status(epoch, verbose=2)[source]

This function prints the costs of the current epoch, along with the network's current learning rate and momentum.

Todo

This needs to go to the visualizer.

Parameters:
  • verbose – Just as always.
  • epoch – Which epoch are we at?
train(verbose, **kwargs)[source]

Training function of the network. Calling this will begin training. A usage sketch follows the parameter list below.

Parameters:
  • epochs – (num_epochs for each learning rate ...) to train. Default is (20, 20).
  • validate_after_epochs – Default is 1; after how many epochs do you want to validate?
  • show_progress – Default is True; will display a clean progress bar. If verbose is 3 or more, this is set to False.
  • early_terminate – True will allow early termination.
  • k – How many discriminator updates for every generator update.
  • learning_rates – (annealing_rate, learning_rates ...); length must be one more than epochs. Default is (0.05, 0.01, 0.001).
  • save_after_epochs – Default is 1; save the network after that many epochs of training.
  • pre_train_discriminator – Set this if you want to pre-train the discriminator so that it stays ahead of the generator when making predictions. This will only train the softmax layer loss and not the fake or real losses.
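
A hedged sketch of a typical training call using the parameters documented above; all values are illustrative and the network is assumed to have been cooked already:

    # Illustrative values only; assumes net.cook(...) has already been called.
    net.train(verbose=2,
              epochs=(20, 20),
              k=1,                      # discriminator updates per generator update
              learning_rates=(0.05, 0.01, 0.001),
              validate_after_epochs=1,
              save_after_epochs=1,
              show_progress=True,
              early_terminate=True)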
validate(epoch=0, training_accuracy=False, show_progress=False, verbose=2)[source]

This method is used to run validation. It will also load the validation dataset.

Parameters:
  • verbose – Just as always
  • show_progress – Display progress bar?
  • training_accuracy – Do you want to print the accuracy on the training set as well?
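
For instance, a standalone validation pass might look like this (the epoch number is illustrative):

    net.validate(epoch=5, training_accuracy=True, show_progress=True, verbose=2)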