fully_connected - fully connected layer classes

The file yann/layers/fully_connected.py contains the definitions of the fully connected (fc) layer classes.

class yann.layers.fully_connected.dot_product_layer(input, num_neurons, input_shape, id, rng=None, input_params=None, borrow=True, activation='relu', batch_norm=True, verbose=2)[source]

This class implements the typical fully connected hidden layer, optionally with batch normalization. It is called by the add_layer method of the network class.

Parameters:
  • input – An input theano.tensor variable. Even a theano.shared will work, as long as it is of the shape (mini_batch_size, height, width, channels).
  • verbose – similar to the rest of the toolbox.
  • num_neurons – number of neurons in the layer
  • input_shape – tuple of the form (mini_batch_size, input_size).
  • batch_norm – If True, batch normalization is applied; default is True.
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • activation – String; takes the options listed in the activations module. Needed for layers that use activations. Some activations also take support parameters: for instance, maxout takes a maxout type and size, and softmax takes an optional temperature. Refer to the activations module to know more.
  • input_params – Supply params or initializations from a pre-trained system.

Notes

Use dot_product_layer.output and dot_product_layer.output_shape from this class. L1 and L2 are also public and can be used for regularization. The weights w, bias b and alpha are public attributes as well, and are collected in the list params, another property of this class.
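As an illustration of what the layer computes (a minimal numpy sketch, not the yann/theano API; the shapes and He-style initialization here are assumptions for the example), the output is the activation applied to the dot product of the input with the weights, plus the bias:

```python
import numpy as np

def dot_product_forward(x, w, b):
    """Forward pass of a fully connected layer: relu(x . w + b).

    x: (mini_batch_size, input_size), w: (input_size, num_neurons),
    b: (num_neurons,). Illustration only; not the yann API.
    """
    z = x.dot(w) + b
    return np.maximum(z, 0.0)  # 'relu', the layer's default activation

rng = np.random.RandomState(0)
x = rng.randn(16, 784)                        # mini_batch_size=16, input_size=784
w = rng.randn(784, 800) * np.sqrt(2.0 / 784)  # He-style initialization (assumed)
b = np.zeros(800)
out = dot_product_forward(x, w, b)
print(out.shape)  # (16, 800): (mini_batch_size, num_neurons), the layer's output_shape
```

The corresponding output_shape is (mini_batch_size, num_neurons), which is what downstream layers read from this class.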

L2 = None[source]

References

Ioffe, Sergey, and Christian Szegedy. "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift." arXiv preprint arXiv:1502.03167 (2015).
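Following the reference above, batch normalization standardizes each neuron's pre-activation over the mini-batch and then applies a learned scale and shift. A minimal numpy sketch of that forward computation (the names gamma and beta follow the paper; this is not the yann implementation):

```python
import numpy as np

def batch_norm_forward(z, gamma, beta, eps=1e-5):
    # Normalize each neuron over the mini-batch, then scale and shift
    mu = z.mean(axis=0)
    var = z.var(axis=0)
    z_hat = (z - mu) / np.sqrt(var + eps)
    return gamma * z_hat + beta

rng = np.random.RandomState(1)
z = rng.randn(32, 10) * 3.0 + 5.0  # pre-activations with shifted statistics
out = batch_norm_forward(z, gamma=np.ones(10), beta=np.zeros(10))
print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # True: zero mean per neuron
```

With gamma = 1 and beta = 0 the output has approximately zero mean and unit variance per neuron; during training these two parameters are learned alongside the weights.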

class yann.layers.fully_connected.dropout_dot_product_layer(input, num_neurons, input_shape, id, dropout_rate=0.5, rng=None, input_params=None, borrow=True, activation='relu', batch_norm=True, verbose=2)[source]

This class implements the typical fully connected hidden layer with dropout, optionally with batch normalization. It is called by the add_layer method of the network class.

Parameters:
  • input – An input theano.tensor variable. Even a theano.shared will work, as long as it is of the shape (mini_batch_size, height, width, channels).
  • verbose – similar to the rest of the toolbox.
  • num_neurons – number of neurons in the layer
  • input_shape – tuple of the form (mini_batch_size, input_size).
  • batch_norm – If True, batch normalization is applied; default is True.
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • activation – String; takes the options listed in the activations module. Needed for layers that use activations. Some activations also take support parameters: for instance, maxout takes a maxout type and size, and softmax takes an optional temperature. Refer to the activations module to know more.
  • input_params – Supply params or initializations from a pre-trained system.
  • dropout_rate – 0.5 is the default.

Notes

Use dropout_dot_product_layer.output and dropout_dot_product_layer.output_shape from this class. L1 and L2 are also public and can be used for regularization. The weights w, bias b and alpha are public attributes as well, and are collected in the list params, another property of this class.
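To illustrate the dropout mechanism itself (a numpy sketch of the train-time mask only; yann draws the mask with theano's random streams, so this is not the library's code), each unit is zeroed independently with probability dropout_rate:

```python
import numpy as np

def dropout(activations, dropout_rate=0.5, rng=None):
    """Zero each unit with probability dropout_rate (train-time mask).

    Sketch of the mechanism only; not the yann/theano implementation.
    """
    rng = rng or np.random.RandomState(0)
    # Each unit is kept with probability 1 - dropout_rate
    mask = rng.binomial(n=1, p=1.0 - dropout_rate, size=activations.shape)
    return activations * mask

a = np.ones((4, 8))
dropped = dropout(a, dropout_rate=0.5)
print(dropped.shape)  # (4, 8): shape is unchanged, surviving units keep their value
```

At inference time dropout is disabled; the standard compensation (rescaling activations or weights so expected magnitudes match training) is omitted from this sketch.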