theanets.layers.recurrent.Bidirectional

class theanets.layers.recurrent.Bidirectional(worker='rnn', **kwargs)

A bidirectional recurrent layer runs two copies of a worker layer over the input sequence: one forward in time and one backward.

The outputs of the forward and backward passes are combined using an affine transformation into the overall output for the layer.

For an example specification of a bidirectional recurrent network, see A. Graves, N. Jaitly, and A. Mohamed, “Hybrid Speech Recognition with Deep Bidirectional LSTM,” 2013. http://www.cs.toronto.edu/~graves/asru_2013.pdf
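The idea can be sketched in plain NumPy (this is an illustration of the mechanics only, not the theanets implementation; the function and weight names below are hypothetical, and plain concatenation stands in for the layer's combination step):

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_pass(x, W_in, W_hid, b):
    """Run a vanilla RNN over a (time, features) input; return (time, hidden)."""
    h = np.zeros(W_hid.shape[0])
    outputs = []
    for x_t in x:
        h = np.tanh(x_t @ W_in + h @ W_hid + b)
        outputs.append(h)
    return np.stack(outputs)

T, n_in, n_hid = 5, 3, 4
x = rng.normal(size=(T, n_in))

# Separate weights for the forward and backward workers.
Wf_in, Wf_hid, bf = rng.normal(size=(n_in, n_hid)), rng.normal(size=(n_hid, n_hid)), np.zeros(n_hid)
Wb_in, Wb_hid, bb = rng.normal(size=(n_in, n_hid)), rng.normal(size=(n_hid, n_hid)), np.zeros(n_hid)

fw = rnn_pass(x, Wf_in, Wf_hid, bf)               # forward in time
bw = rnn_pass(x[::-1], Wb_in, Wb_hid, bb)[::-1]   # backward, re-reversed to align in time
out = np.concatenate([fw, bw], axis=-1)           # combined output, shape (T, 2 * n_hid)
```

Note that the backward pass is run over the reversed sequence and then reversed again, so that each time step of the combined output sees context from both directions.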

Parameters:

worker : str, optional

The type of worker layer to use for the forward and backward passes. Defaults to ‘rnn’ (a vanilla recurrent layer), but any string naming a recurrent layer type (e.g., ‘lstm’) can be given.
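As a sketch of how this parameter might be used, theanets layers are commonly specified as dictionaries; a bidirectional layer with LSTM workers could be configured along these lines (the exact keys should be checked against your theanets version):

```python
# Hypothetical layer specification for a bidirectional layer whose forward
# and backward workers are LSTMs. The 'worker' key corresponds to the
# constructor parameter documented above.
layer = dict(form='bidirectional', worker='lstm', size=100)
```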

__init__(worker='rnn', **kwargs)

Methods

__init__([worker])
to_spec() Create a specification dictionary for this layer.
transform(inputs) Transform the inputs for this layer into an output for the layer.

Attributes

input_size For networks with one input, get the input size.
num_params Total number of learnable parameters in this layer.
params A list of all learnable parameters in this layer.
num_params

Total number of learnable parameters in this layer.

params

A list of all learnable parameters in this layer.

to_spec()

Create a specification dictionary for this layer.

Returns:

spec : dict

A dictionary specifying the configuration of this layer.

transform(inputs)

Transform the inputs for this layer into an output for the layer.

Parameters:

inputs : dict of theano expressions

Symbolic inputs to this layer, given as a dictionary mapping string names to Theano expressions. See base.Layer.connect().

Returns:

outputs : dict of theano expressions

Theano expressions representing the output from the layer. This layer type produces an “out” output that concatenates the outputs from its underlying workers. If present, it also concatenates the “pre” and “cell” outputs from the underlying workers. Finally, it passes along the individual outputs from its workers using “fw” and “bw” prefixes for forward and backward directions.

updates : list of update pairs

A list of state updates to apply inside a theano function.