# theanets.losses.GaussianLogLikelihood

class theanets.losses.GaussianLogLikelihood(mean_name='mean', covar_name='covar', covar_eps=0.001, **kwargs)

Gaussian Log Likelihood (GLL) loss function.

Parameters:

- **mean_name** (str): Name of the network graph output to use for the mean of the Gaussian distribution.
- **covar_name** (str): Name of the network graph output to use for the diagonal covariance of the Gaussian distribution.

Notes

This loss computes the negative log-likelihood of the observed target data $$y$$ under a Gaussian distribution, where the neural network computes the mean $$\mu$$ and the diagonal of the covariance $$\Sigma$$ as a function of its input $$x$$. The loss is given by:

$\mathcal{L}(x, y) = -\log p(y) = -\log p\left(y|\mu(x),\Sigma(x)\right)$

where

$p(y) = p(y|\mu,\Sigma) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}} \exp\left\{-\frac{1}{2}(y-\mu)^\top\Sigma^{-1}(y-\mu) \right\}$

is the Gaussian density function.
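To make the density concrete, here is a small standalone NumPy sketch (not part of theanets) that evaluates $$p(y)$$ for a diagonal covariance and checks it against the product of the corresponding one-dimensional Gaussian densities, which it must equal when $$\Sigma$$ is diagonal:

```python
import numpy as np

def gaussian_density(y, mu, sigma_diag):
    """Evaluate p(y | mu, Sigma) when Sigma = diag(sigma_diag)."""
    n = len(mu)
    det = np.prod(sigma_diag)                # |Sigma| for a diagonal matrix
    diff = y - mu
    quad = np.sum(diff ** 2 / sigma_diag)    # (y - mu)^T Sigma^{-1} (y - mu)
    norm_const = (2 * np.pi) ** (n / 2) * np.sqrt(det)
    return np.exp(-0.5 * quad) / norm_const

y = np.array([0.5, -1.0, 2.0])
mu = np.array([0.0, -0.5, 1.5])
sigma_diag = np.array([1.0, 0.5, 2.0])       # diagonal of Sigma (variances)

p = gaussian_density(y, mu, sigma_diag)

# A diagonal multivariate Gaussian factorizes into univariate densities.
p_ref = np.prod(np.exp(-(y - mu) ** 2 / (2 * sigma_diag))
                / np.sqrt(2 * np.pi * sigma_diag))
```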

The log density $$\log p(y)$$ can be parameterized more conveniently [Gu08] as:

$\log p(y|\eta,\Lambda) = a + \eta^\top y - \frac{1}{2} y^\top \Lambda y$

where $$\Lambda = \Sigma^{-1}$$ is the precision, $$\eta = \Lambda\mu$$ is the covariance-skewed mean, and $$a=-\frac{1}{2}\left(n\log 2\pi-\log|\Lambda|+\eta^\top\Lambda^{-1}\eta\right)$$ contains all constant terms (note that $$\eta^\top\Lambda^{-1}\eta = \mu^\top\Lambda\mu$$). These terms are all computed as a function of the input, $$x$$.
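A standalone NumPy sketch can verify that this natural parameterization agrees with the standard log density; here the constant is written as $$a = -\frac{1}{2}(n\log 2\pi - \log|\Lambda| + \eta^\top\Lambda^{-1}\eta)$$, using the identity $$\mu^\top\Lambda\mu = \eta^\top\Lambda^{-1}\eta$$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
mu = rng.normal(size=n)
sigma_diag = rng.uniform(0.5, 2.0, size=n)   # diagonal of Sigma
y = rng.normal(size=n)

# Standard form:
# log p(y) = -n/2 log 2pi - 1/2 log|Sigma| - 1/2 (y-mu)^T Sigma^{-1} (y-mu)
log_p_standard = (-0.5 * n * np.log(2 * np.pi)
                  - 0.5 * np.sum(np.log(sigma_diag))
                  - 0.5 * np.sum((y - mu) ** 2 / sigma_diag))

# Natural form: log p(y) = a + eta^T y - 1/2 y^T Lambda y
lam = 1.0 / sigma_diag                        # diagonal of the precision Lambda
eta = lam * mu                                # eta = Lambda mu
log_det_lam = np.sum(np.log(lam))             # log|Lambda|
a = -0.5 * (n * np.log(2 * np.pi) - log_det_lam + np.sum(eta ** 2 / lam))
log_p_natural = a + eta @ y - 0.5 * np.sum(lam * y ** 2)
```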

This implementation of the Gaussian log-likelihood loss approximates $$\Sigma$$ using only its diagonal. This makes the precision easy to compute because

$\Sigma^{-1} = \Lambda = \mbox{diag}(\frac{1}{\sigma_1}, \dots, \frac{1}{\sigma_n})$

is just the matrix containing the multiplicative inverse of the diagonal covariance values. Similarly, the log-determinant of the precision is just the sum of the logs of the diagonal terms:

$\log|\Lambda|=\sum_{i=1}^n\log\lambda_i=-\sum_{i=1}^n\log\sigma_i.$

The log-likelihood is computed separately for each input-output pair in a batch, and the overall loss is the mean of these per-pair negative log-likelihoods.
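Putting the pieces together, the quantity this loss computes can be sketched in NumPy as follows. This is an illustrative reimplementation, not theanets code; in particular, the role of `covar_eps` as a small constant added to the predicted covariance for numerical stability is an assumption:

```python
import numpy as np

def diagonal_gaussian_nll(targets, mean, covar, covar_eps=1e-3):
    """Mean negative log-likelihood of `targets` under diagonal Gaussians.

    targets, mean, covar: arrays of shape (batch, n). `covar` holds the
    predicted diagonal of Sigma. `covar_eps` is assumed to keep the
    covariance strictly positive.
    """
    covar = covar + covar_eps
    n = targets.shape[1]
    log_det = np.sum(np.log(covar), axis=1)               # log|Sigma| per sample
    quad = np.sum((targets - mean) ** 2 / covar, axis=1)  # Mahalanobis term
    nll = 0.5 * (n * np.log(2 * np.pi) + log_det + quad)  # -log p(y) per sample
    return nll.mean()                                     # average over the batch

rng = np.random.default_rng(1)
targets = rng.normal(size=(8, 3))
mean = rng.normal(size=(8, 3))
covar = rng.uniform(0.1, 1.0, size=(8, 3))
loss = diagonal_gaussian_nll(targets, mean, covar)
```

For a fixed covariance, the loss is minimized when the predicted mean equals the targets, since the quadratic term then vanishes.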

Weighted targets are not currently supported by this loss.

References

 [Gu08] Multivariate Gaussian Distribution. https://www.cs.cmu.edu/~epxing/Class/10701-08s/recitation/gaussian.pdf
__init__(mean_name='mean', covar_name='covar', covar_eps=0.001, **kwargs)

Methods

- `__init__([mean_name, covar_name, covar_eps])`
- `log()`: Log some diagnostic info about this loss.

Attributes

- `variables`: A list of Theano variables used in this loss.

log()

Log some diagnostic info about this loss.