fully_connected - fully connected layer classes

The file yann.layers.fully_connected.py contains the definitions for the fully connected layers.
class yann.layers.fully_connected.dot_product_layer(input, num_neurons, input_shape, id, rng=None, input_params=None, borrow=True, activation='relu', batch_norm=True, verbose=2)[source]

This class is the typical neural hidden layer with an optional batch normalization layer. It is called by the add_layer method in the network class.

Parameters:
- input – An input theano.tensor variable. Even a theano.shared variable will work, as long as it has the shape (mini_batch_size, height, width, channels).
- num_neurons – number of neurons in the layer.
- input_shape – (mini_batch_size, input_size).
- rng – typically numpy.random.
- input_params – Supply params or initializations from a pre-trained system.
- borrow – theano borrow, typically True.
- activation – String; takes options that are listed in the activations module. Needed for layers that use activations. Some activations also take support parameters: for instance, maxout takes a maxout type and size, and softmax takes an optional temperature. Refer to the activations module to know more.
- batch_norm – If provided, batch normalization will be used; default is False.
- verbose – similar to the rest of the toolbox.

Notes

Use dot_product_layer.output and dot_product_layer.output_shape from this class. L1 and L2 are also public and can be used for regularization. The class also exposes w, b and alpha publicly; these are also collected in the list params, another property of this class.
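To make the forward computation concrete, here is a minimal NumPy sketch of what a dot product layer does: activation(x @ w + b), where w and b correspond to the layer's public w and b attributes. The function name and the restriction to relu are illustrative assumptions, not part of the yann API.

```python
import numpy as np

def dot_product_layer_sketch(x, w, b, activation="relu"):
    """Forward pass of a fully connected layer: activation(x @ w + b).

    x: (mini_batch_size, input_size)
    w: (input_size, num_neurons)
    b: (num_neurons,)
    """
    z = x @ w + b
    if activation == "relu":
        return np.maximum(z, 0)  # rectified linear unit
    raise ValueError("only relu is sketched here")

rng = np.random.RandomState(0)
x = rng.randn(4, 3)            # mini_batch_size=4, input_size=3
w = rng.randn(3, 5) * 0.01     # num_neurons=5
b = np.zeros(5)
out = dot_product_layer_sketch(x, w, b)
# out.shape is (4, 5), i.e. (mini_batch_size, num_neurons),
# matching the layer's output_shape convention.
```

The output shape (mini_batch_size, num_neurons) is what output_shape reports for the real layer.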
class yann.layers.fully_connected.dropout_dot_product_layer(input, num_neurons, input_shape, id, dropout_rate=0.5, rng=None, input_params=None, borrow=True, activation='relu', batch_norm=True, verbose=2)[source]

This class is the typical dropout neural hidden layer with an optional batch normalization layer. It is called by the add_layer method in the network class.

Parameters:
- input – An input theano.tensor variable. Even a theano.shared variable will work, as long as it has the shape (mini_batch_size, height, width, channels).
- num_neurons – number of neurons in the layer.
- input_shape – (mini_batch_size, input_size).
- dropout_rate – 0.5 is the default.
- rng – typically numpy.random.
- input_params – Supply params or initializations from a pre-trained system.
- borrow – theano borrow, typically True.
- activation – String; takes options that are listed in the activations module. Needed for layers that use activations. Some activations also take support parameters: for instance, maxout takes a maxout type and size, and softmax takes an optional temperature. Refer to the activations module to know more.
- batch_norm – If provided, batch normalization will be used; default is False.
- verbose – similar to the rest of the toolbox.

Notes

Use dropout_dot_product_layer.output and dropout_dot_product_layer.output_shape from this class. L1 and L2 are also public and can be used for regularization. The class also exposes w, b and alpha publicly; these are also collected in the list params, another property of this class.
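The dropout behaviour can be sketched as a Bernoulli mask applied to the layer's output. This uses the common "inverted dropout" convention (rescaling by 1/(1 - dropout_rate) at train time); yann's exact masking and rescaling scheme may differ, so treat this purely as an illustration of the dropout_rate parameter.

```python
import numpy as np

def dropout_mask(shape, dropout_rate, rng):
    # Keep each unit with probability 1 - dropout_rate; rescale the
    # surviving units so the expected activation is unchanged.
    keep = 1.0 - dropout_rate
    return (rng.uniform(size=shape) < keep) / keep

rng = np.random.RandomState(0)
h = np.ones((4, 5))                      # stand-in for a layer's output
mask = dropout_mask(h.shape, 0.5, rng)   # dropout_rate=0.5, the default
dropped = h * mask
# Each entry of `dropped` is either 0.0 (dropped) or 2.0 (kept, rescaled).
```

At test time dropout is disabled and no mask is applied; the rescaling above is what keeps train-time and test-time expected activations comparable.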