batch_norm - Batch normalization layer classes

The file yann.layers.batch_norm.py contains the definitions of the batch norm layers. By default, batch norm can be applied to convolutional and fully connected layers by supplying the argument batch_norm = True in the layer arguments. However, this built-in method applies batch norm prior to the layer activation. Some architectures, including ResNet, involve batch norms after the activations of the layer. Therefore there is a need for an independent batch norm layer that simply applies batch norm to some outputs. The layers in this module do exactly that.
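To make the pre- versus post-activation distinction concrete, here is a minimal numpy sketch (illustrative only, not the yann implementation) of applying batch norm after a ReLU activation, as ResNet-style architectures do:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize x over the mini-batch axis, then scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def relu(x):
    return np.maximum(x, 0.0)

x = np.random.randn(8, 16)        # (mini_batch_size, features)
out = batch_norm(relu(x))         # post-activation batch norm
```

With the built-in batch_norm = True argument the order would instead be activation(batch_norm(x)); the standalone layers in this module let you choose the post-activation ordering shown above.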

There are four classes in this file: two for one dimension and two for two dimensions.

Todo

  • Need to add the deconvolutional-unpooling layer.
  • Something is still not good about the convolutional batch norm layer.
class yann.layers.batch_norm.batch_norm_layer_1d(input, input_shape, id, rng=None, borrow=True, input_params=None, verbose=2)[source]

This class is the typical 1D batchnorm layer. It is called by the add_layer method in the network class.

Parameters:
  • input – An input theano.tensor variable. Even theano.shared will work, as long as it is in the following shape: mini_batch_size, height, width, channels
  • verbose – similar to the rest of the toolbox.
  • input_shape – (mini_batch_size, channels, height, width)
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • input_params – Supply params or initializations from a pre-trained system.
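As a sketch of how input_params could be used, a pre-trained batch norm layer carries a scale (gamma), a shift (beta), and stored running statistics that replace the batch statistics at inference time. The names and contents below are assumptions for illustration, not yann's actual parameter layout:

```python
import numpy as np

def bn_inference(x, gamma, beta, running_mean, running_var, eps=1e-5):
    """Apply batch norm with stored (pre-trained) statistics."""
    return gamma * (x - running_mean) / np.sqrt(running_var + eps) + beta

x = np.random.randn(8, 10)                # (mini_batch_size, features)
gamma, beta = np.ones(10), np.zeros(10)
mean, var = np.zeros(10), np.ones(10)     # stand-ins for trained statistics
y = bn_inference(x, gamma, beta, mean, var)
```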
class yann.layers.batch_norm.batch_norm_layer_2d(input, input_shape, id, rng=None, borrow=True, input_params=None, verbose=2)[source]

This class is the typical 2D batchnorm layer. It is called by the add_layer method in the network class.

Parameters:
  • input – An input theano.tensor variable. Even theano.shared will work, as long as it is in the following shape: mini_batch_size, height, width, channels
  • verbose – similar to the rest of the toolbox.
  • input_shape – (mini_batch_size, channels, height, width)
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • input_params – Supply params or initializations from a pre-trained system.
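For the 2D case, batch norm statistics are computed per channel, over the batch and spatial axes. A numpy sketch of that computation (illustrative, not the actual Theano implementation; gamma and beta are assumed to be per-channel parameter vectors):

```python
import numpy as np

def batch_norm_2d(x, gamma, beta, eps=1e-5):
    """x: (mini_batch_size, channels, height, width); stats per channel."""
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    g = gamma.reshape(1, -1, 1, 1)
    b = beta.reshape(1, -1, 1, 1)
    return g * (x - mean) / np.sqrt(var + eps) + b

x = np.random.randn(8, 3, 5, 5)           # a small feature-map batch
y = batch_norm_2d(x, np.ones(3), np.zeros(3))
```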
class yann.layers.batch_norm.dropout_batch_norm_layer_1d(input, input_shape, id, rng=None, borrow=True, input_params=None, dropout_rate=0, verbose=2)[source]

This class is the typical 1D batchnorm layer with dropout. It is called by the add_layer method in the network class.

Parameters:
  • input – An input theano.tensor variable. Even theano.shared will work, as long as it is in the following shape: mini_batch_size, height, width, channels
  • verbose – similar to the rest of the toolbox.
  • input_shape – (mini_batch_size, channels, height, width)
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • dropout_rate – Bernoulli probability to drop out by.
  • input_params – Supply params or initializations from a pre-trained system.
class yann.layers.batch_norm.dropout_batch_norm_layer_2d(input, input_shape, id, rng=None, borrow=True, input_params=None, dropout_rate=0, verbose=2)[source]

This class is the typical 2D batchnorm layer with dropout. It is called by the add_layer method in the network class.

Parameters:
  • input – An input theano.tensor variable. Even theano.shared will work, as long as it is in the following shape: mini_batch_size, height, width, channels
  • verbose – similar to the rest of the toolbox.
  • input_shape – (mini_batch_size, channels, height, width)
  • rng – typically numpy.random.
  • borrow – theano borrow, typically True.
  • dropout_rate – Bernoulli probability to drop out by.
  • input_params – Supply params or initializations from a pre-trained system.
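Conceptually, the dropout variants normalize first and then drop units with the given Bernoulli probability. The numpy sketch below is illustrative, not yann's code; in particular, the inverted-dropout rescaling of kept units is a common convention and an assumption about the implementation:

```python
import numpy as np

def dropout_batch_norm(x, dropout_rate, rng, eps=1e-5):
    """Batch norm over the mini-batch axis, followed by dropout."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    y = (x - mean) / np.sqrt(var + eps)          # normalize
    if dropout_rate > 0:
        keep = 1.0 - dropout_rate
        mask = rng.binomial(1, keep, size=y.shape)
        y = y * mask / keep                      # inverted-dropout rescaling
    return y

rng = np.random.default_rng(0)
out = dropout_batch_norm(rng.standard_normal((32, 16)), 0.5, rng)
```

With dropout_rate = 0 the function reduces to plain batch norm, matching the default argument of the classes above.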