theanets.regularizers.Regularizer

class theanets.regularizers.Regularizer(pattern=None, weight=0.0)

    A regularizer for a neural network model.

    Subclasses of this base usually provide an implementation of either the
    modify_graph() method or the loss() method (but almost never both).

    Parameters:
        pattern : str
            A shell-style glob pattern describing the parameters or outputs
            that this regularizer ought to apply to.
        weight : float
            A scalar weight that indicates the "strength" of this regularizer
            in the overall loss for a model.

    Attributes:
        pattern : str
            A shell-style glob pattern describing the parameters or outputs
            that this regularizer ought to apply to.
        weight : float
            A scalar weight that indicates the "strength" of this regularizer
            in the overall loss for a model.
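The pattern attribute uses shell-style globbing, as in Python's standard fnmatch module, to select which parameter or output names the regularizer applies to. A minimal sketch of that matching behavior, using illustrative output names (the names here are assumptions, not actual theanets identifiers):

```python
import fnmatch

# Hypothetical fully-scoped output names, of the kind a computation
# graph might contain (illustrative only).
names = ['in:out', 'hid1:out', 'hid1:pre', 'hid2:out', 'out:out']

# A pattern like 'hid*:out' selects the post-activation outputs of
# every hidden layer, but not the input or output layers.
matched = [n for n in names if fnmatch.fnmatch(n, 'hid*:out')]
print(matched)  # ['hid1:out', 'hid2:out']
```

Passing pattern='*' (or a similarly broad glob) would make a regularizer apply to every matching name in the graph.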
__init__(pattern=None, weight=0.0)

Methods

    __init__([pattern, weight])
    log()                     Log some diagnostic info about this regularizer.
    loss(layers, outputs)     Compute a scalar term to add to the loss function for a model.
    modify_graph(outputs)     Modify the outputs of a particular layer in the computation graph.
log()

    Log some diagnostic info about this regularizer.

loss(layers, outputs)

    Compute a scalar term to add to the loss function for a model.

    Parameters:
        layers : list of theanets.layers.Layer
            A list of the layers in the model being regularized.
        outputs : dict of Theano expressions
            A dictionary mapping string expression names to their
            corresponding Theano expressions in the computation graph. This
            dictionary contains the fully-scoped name of every layer output
            in the graph.
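The shape of a loss() implementation can be sketched with a simplified stand-in, where plain Python floats take the place of Theano expressions (the real method would build and return a symbolic scalar). The class and name matching below are illustrative assumptions, not theanets code:

```python
import fnmatch

class ToyWeightNorm:
    """Toy analogue of a Regularizer subclass implementing loss().

    Plain floats stand in for Theano expressions; a real subclass
    would return a symbolic scalar built from matching expressions.
    """

    def __init__(self, pattern='*', weight=0.01):
        self.pattern = pattern
        self.weight = weight

    def loss(self, layers, outputs):
        # Sum squared values over every output whose fully-scoped name
        # matches our glob pattern, scaled by the regularizer weight.
        total = 0.0
        for name, values in outputs.items():
            if fnmatch.fnmatch(name, self.pattern):
                total += sum(v * v for v in values)
        return self.weight * total

reg = ToyWeightNorm(pattern='hid*:out', weight=0.1)
outs = {'hid1:out': [1.0, 2.0], 'out:out': [3.0]}
print(reg.loss([], outs))  # 0.1 * (1 + 4) = 0.5
```

The returned scalar, multiplied by the regularizer's weight, is what gets added to the model's overall loss.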

modify_graph(outputs)

    Modify the outputs of a particular layer in the computation graph.

    Parameters:
        outputs : dict of Theano expressions
            A map from string output names to the corresponding Theano
            expressions. This dictionary contains the fully-scoped names of
            all outputs from a single layer in the computation graph.

            This map is mutable, so any changes that the regularizer makes
            will be retained when the caller regains control.

    Notes

    This method is applied at graph-construction time to change the behavior
    of one or more layer outputs. For example, the BernoulliDropout class
    replaces matching outputs with an expression containing "masked" outputs,
    where some elements are randomly set to zero each time the expression is
    evaluated.

    Any regularizer that needs to modify the structure of the computation
    graph should implement this method.
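The in-place mutation contract can be illustrated with another toy stand-in, again using plain Python values rather than Theano expressions. The masking logic below is only a rough analogue of dropout-style masking, not the actual BernoulliDropout implementation:

```python
import fnmatch
import random

class ToyDropout:
    """Toy analogue of a modify_graph() implementation.

    A real dropout regularizer would replace matching outputs with
    symbolic masked expressions; here we just zero out elements of a
    plain list to show the in-place mutation of the outputs dict.
    """

    def __init__(self, pattern='*:out', weight=0.5):
        self.pattern = pattern
        self.weight = weight  # interpreted here as the drop probability

    def modify_graph(self, outputs):
        # Replace each matching output in place; because the dict is
        # mutable, the caller sees the changes when control returns.
        for name in list(outputs):
            if fnmatch.fnmatch(name, self.pattern):
                outputs[name] = [
                    0.0 if random.random() < self.weight else v
                    for v in outputs[name]
                ]

reg = ToyDropout(pattern='hid*:out', weight=1.0)  # drop everything
outs = {'hid1:out': [1.0, 2.0], 'in:out': [3.0]}
reg.modify_graph(outs)
print(outs)  # {'hid1:out': [0.0, 0.0], 'in:out': [3.0]}
```

Note that modify_graph() returns nothing; its effect is entirely through mutating the outputs dictionary it is handed.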
