theanets.activations.Prelu

class theanets.activations.Prelu(*args, **kwargs)

Parametric rectified linear activation with learnable leak rate.

This activation is characterized by two linear pieces joined at the origin. For negative inputs, the unit response is a linear function of the input with slope \(r\) (the “leak rate”). For positive inputs, the unit response is the identity function:

\[f(x) = \begin{cases} r x & \text{if } x < 0 \\ x & \text{otherwise} \end{cases}\]

This activation allocates a separate leak rate for each unit in its layer.
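For illustration, here is a minimal NumPy sketch of the function above, with a hypothetical ``leak`` vector holding one learnable rate per unit (the names and shapes are assumptions for the example, not part of the theanets API):

```python
import numpy as np

def prelu(x, leak):
    """Parametric ReLU: identity for x >= 0, leak * x for x < 0.

    ``leak`` holds one rate per unit and is broadcast across the batch
    dimension of ``x``.
    """
    return np.where(x < 0, leak * x, x)

# Hypothetical layer with 4 units: one leak rate per unit.
leak = np.full(4, 0.25)                    # often initialized to a small positive value
x = np.array([[-2.0, -0.5, 0.0, 3.0]])     # one batch row of pre-activations
print(prelu(x, leak))                      # [[-0.5 -0.125 0. 3.]]
```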

References

K. He, X. Zhang, S. Ren, and J. Sun (2015), “Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.” http://arxiv.org/abs/1502.01852

Methods

__init__(*args, **kwargs)
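As a hedged usage sketch, assuming the dict-style layer specification and that the activation name 'prelu' resolves to this class, a PReLU hidden layer might be requested like this:

```python
import theanets

# Assumed layer specification: a dict with an 'activation' key selects the
# activation by name for that layer; the leak rates are then learned during
# training along with the other layer parameters.
net = theanets.Regressor(layers=[8, dict(size=16, activation='prelu'), 2])
```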