theanets.layers.recurrent.RNN

class theanets.layers.recurrent.RNN(**kwargs)

Standard recurrent network layer.

There are many different styles of recurrent network layers, but the one implemented here is known as an Elman layer or an SRN (Simple Recurrent Network) – the output from the layer at the previous time step is incorporated into the input of the layer at the current time step.
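In equations, the hidden state at time step t is h_t = g(x_t W_xh + h_{t-1} W_hh + b), where g is the layer's activation function. The following self-contained NumPy sketch (illustrative only; the names W_xh, W_hh, and b are not taken from theanets) shows the same recurrence:

    import numpy as np

    def elman_step(x_t, h_prev, W_xh, W_hh, b, activation=np.tanh):
        # "pre" output: unit activity before the activation function.
        pre = x_t.dot(W_xh) + h_prev.dot(W_hh) + b
        # "out" output: post-activation value, fed back at the next step.
        return pre, activation(pre)

    def elman_unroll(xs, h0, W_xh, W_hh, b):
        # Each step's output is incorporated into the next step's input.
        h, outputs = h0, []
        for x_t in xs:
            _, h = elman_step(x_t, h, W_xh, W_hh, b)
            outputs.append(h)
        return np.stack(outputs)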

__init__(**kwargs)

Methods

__init__(**kwargs)
add_weights(name, nin, nout[, mean, std, ...]) Helper method to create a new weight matrix.
initial_state(name, batch_size) Return an array suitable for representing the layer's initial state.
setup() Set up the parameters and initial values for this layer.
transform(inputs) Transform the inputs for this layer into outputs for the layer.

Attributes

input_size For networks with one input, get the input size.
num_params Total number of learnable parameters in this layer.
params A list of all parameters in this layer.

setup()

Set up the parameters and initial values for this layer.
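For an Elman layer the learnable parameters are conventionally an input-to-hidden weight matrix, a hidden-to-hidden (recurrent) weight matrix, and a bias vector. A rough sketch of the shapes involved (the names xh, hh, and b are illustrative; theanets stores its parameters as Theano shared variables, not NumPy arrays):

    import numpy as np

    def elman_parameters(nin, nout, std=0.1, rng=np.random):
        # Small random weights and a zero bias, as a generic starting point.
        return {
            'xh': rng.randn(nin, nout) * std,   # input-to-hidden weights
            'hh': rng.randn(nout, nout) * std,  # hidden-to-hidden weights
            'b': np.zeros(nout),                # bias
        }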

transform(inputs)

Transform the inputs for this layer into outputs for the layer.

Parameters:

inputs : dict of Theano expressions

Symbolic inputs to this layer, given as a dictionary mapping string names to Theano expressions. See base.Layer.connect().

Returns:

outputs : dict of Theano expressions

A map from string output names to Theano expressions for the outputs from this layer. This layer type generates a “pre” output that gives the unit activity before applying the layer’s activation function, and an “out” output that gives the post-activation output.

updates : list of update pairs

A sequence of (variable, expression) update pairs to apply inside a Theano function.
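In practice this layer is usually created indirectly, by naming its form in a network's layer specification. A usage sketch (assuming the theanets model-building API; the layer name 'hid' and the sizes here are arbitrary):

    import theanets

    # A recurrent regressor with one Elman ('rnn') hidden layer. Naming the
    # layer lets its outputs be referenced as 'hid:pre' and 'hid:out'.
    net = theanets.recurrent.Regressor(
        layers=(3, dict(form='rnn', size=10, name='hid'), 2))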