activations
Definitions for all activation functions.
The file yann.core.activations.py contains the definitions of all the activation
functions available. You can import these functions and supply them as arguments
to any function that takes an activation variable as input. Refer to the mnist
example in the modelzoo for how to do this. The available activations are
defined below:
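To make the pattern of passing an activation function as an argument concrete, here is a rough NumPy sketch (the library itself operates on theano tensors; abs_activation and dense_forward below are illustrative stand-ins, not part of yann):

```python
import numpy as np

# NumPy stand-in for one of the activations (Abs). dense_forward is a
# hypothetical helper layer, shown only to illustrate passing the
# activation function itself as an argument.
def abs_activation(x):
    return np.abs(np.asarray(x, dtype=float))

def dense_forward(x, w, activation):
    # The layer receives the activation function as an argument and
    # applies it to the affine output.
    return activation(x @ w)

x = np.array([[1.0, -2.0]])
w = np.array([[0.5], [0.5]])
out = dense_forward(x, w, abs_activation)  # |1*0.5 + (-2)*0.5| = 0.5
```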

yann.core.activations.Abs(x)[source]
Absolute Value Units.
Applies pointwise absolute value to the input supplied.
Parameters: x – could be a theano.tensor or a theano.shared or numpy arrays or python lists.
Returns: returns an absolute-value output of the same shape as the input.
Return type: same as input

yann.core.activations.Elu(x, alpha=1)[source]
Exponential Linear Units.
Applies pointwise elu to the input supplied. alpha defaults to 1 and scales the negative part of the output.
Notes
Reference: Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).
Parameters:
  x – could be a theano.tensor or a theano.shared or numpy arrays or python lists.
  alpha – should be a float. Default is 1.
Returns: returns a pointwise rectified output.
Return type: same as input
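As a sketch of what this computes (a NumPy reimplementation of the standard ELU formula from Clevert et al., not the library's theano code):

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0.
    # The negative branch saturates to -alpha for large negative inputs.
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```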

yann.core.activations.Maxout(x, maxout_size, input_size, type='maxout', dimension=1)[source]
Performs the maxout activation.
Parameters:
  x – could be a theano.tensor or a theano.shared or numpy arrays or python lists. The size of the argument must be strictly windowed, runnable through stride. The second dimension must be the channels to maxout from.
  maxout_size – the size of the window to stride through.
  input_size – the number of nodes in the input.
  dimension – If 1, perform MLP-layer maxout; input must be two-dimensional. If 2, perform CNN-layer maxout; input must be four-dimensional.
  type – If maxout, perform maxout [2]. If meanout or mixedout, perform meanout or mixedout respectively from [1].
Notes
[1] Yu, Dingjun, et al. "Mixed Pooling for Convolutional Neural Networks." Rough Sets and Knowledge Technology. Springer International Publishing, 2014. 364-375.
[2] Ian Goodfellow et al. "Maxout Networks." arXiv (JMLR).
Returns:
  theano.tensor4 – output that could be provided as input to the next layer or to other convolutional layer options. The size of the output depends on border mode and subsample operations performed.
  tuple – number of feature maps after maxout is applied.
Return type: theano.tensor4
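A NumPy sketch of the dimension=1 (MLP-layer), type='maxout' case described above, assuming the input width divides evenly into maxout windows (this approximates the windowed maximum; the library's theano version also supports the CNN and meanout/mixedout variants):

```python
import numpy as np

def maxout_mlp(x, maxout_size):
    # x: (batch, input_size); input_size must be divisible by maxout_size.
    # Reshape into non-overlapping windows and take the max over each window,
    # reducing input_size to input_size // maxout_size output units.
    batch, input_size = x.shape
    assert input_size % maxout_size == 0
    return x.reshape(batch, input_size // maxout_size, maxout_size).max(axis=2)

x = np.array([[1.0, 3.0, -2.0, 0.5]])
out = maxout_mlp(x, maxout_size=2)  # -> [[3.0, 0.5]]
```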

yann.core.activations.ReLU(x, alpha=0)[source]
Rectified Linear Units.
Applies pointwise rectification to the input supplied. alpha defaults to 0. Supplying a value to alpha would make this a leaky ReLU.
Notes
Reference: Nair, Vinod, and Geoffrey E. Hinton. "Rectified linear units improve restricted Boltzmann machines." Proceedings of the 27th International Conference on Machine Learning (ICML-10). 2010.
Parameters:
  x – could be a theano.tensor or a theano.shared or numpy arrays or python lists.
  alpha – should be a float.
Returns: returns a pointwise rectified output.
Return type: same as input
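A NumPy sketch of the rectification described above, including the leaky variant when alpha is nonzero (illustrative reimplementation, not the library's theano code):

```python
import numpy as np

def relu(x, alpha=0.0):
    # Standard ReLU when alpha == 0; leaky ReLU when alpha > 0,
    # which lets a small gradient through on the negative side.
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * x)
```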

yann.core.activations.Sigmoid(x)[source]
Sigmoid Units.
Applies pointwise sigmoid to the input supplied.
Parameters: x – could be a theano.tensor or a theano.shared or numpy arrays or python lists.
Returns: returns a pointwise sigmoid output of the same shape as the input.
Return type: same as input
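The pointwise sigmoid can be sketched in NumPy as follows (illustrative, not the library's theano code):

```python
import numpy as np

def sigmoid(x):
    # Pointwise logistic sigmoid: 1 / (1 + exp(-x)), squashing
    # each element into (0, 1).
    x = np.asarray(x, dtype=float)
    return 1.0 / (1.0 + np.exp(-x))
```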

yann.core.activations.Softmax(x, temp=1)[source]
Softmax Units.
Applies row-wise softmax to the input supplied.
Parameters:
  x – could be a theano.tensor or a theano.shared or numpy arrays or python lists.
  temp – temperature of type float. Mainly used during distillation; normal softmax prefers T=1.
Notes
Refer to [3] for details.
[3] Hinton, Geoffrey, Oriol Vinyals, and Jeff Dean. "Distilling the knowledge in a neural network." arXiv preprint arXiv:1503.02531 (2015).
Returns: returns a row-wise softmax output of the same shape as the input.
Return type: same as input
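A NumPy sketch of temperature softmax as used in distillation (illustrative; the library's version operates on theano tensors):

```python
import numpy as np

def softmax(x, temp=1.0):
    # Row-wise softmax with temperature; subtracting the row max
    # before exponentiating keeps the computation numerically stable.
    x = np.asarray(x, dtype=float) / temp
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([[1.0, 2.0, 3.0]])
probs = softmax(logits)            # each row sums to 1
soft = softmax(logits, temp=5.0)   # higher temperature -> softer distribution
```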