LossFunction
reduction specifies the reduction to apply to the output:

none: do nothing and return the vector of losses.
sum: return the sum of the losses.
mean: return the mean of the losses.

These functions require both the prediction y and the target t to be vectors.
LearningHorse.LossFunction.mse — Function

mse(y, t; reduction="mean")
Mean Square Error. This is the expression:
\[MSE(y, t) = \frac{\sum_{i=1}^{n} (t_{i}-y_{i})^{2}}{n}\]
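As a sanity check of the formula, here is a minimal plain-Julia sketch of the mean-reduced MSE; it is not the package's implementation, and the mse_sketch name is made up for this example:

```julia
# Direct transcription of the MSE formula with the "mean" reduction.
mse_sketch(y, t) = sum((t .- y) .^ 2) / length(y)

mse_sketch([1.0, 2.0, 3.0], [1.5, 2.0, 2.0])  # (0.25 + 0.0 + 1.0) / 3 ≈ 0.4167
```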
LearningHorse.LossFunction.cee — Function

cee(y, t; reduction="mean")
Cross Entropy Error. This is the expression:
\[CEE(y, t) = -\frac{\sum_{i=1}^{n} t_{i}\ln y_{i}}{n}\]
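A plain-Julia sketch of the formula above, written in the standard form with the leading minus sign (not the package's implementation; cee_sketch is a name made up for this example). Here y holds predicted probabilities and t a one-hot target:

```julia
# Direct transcription of the cross entropy error with the "mean" reduction.
cee_sketch(y, t) = -sum(t .* log.(y)) / length(y)

cee_sketch([0.7, 0.2, 0.1], [1.0, 0.0, 0.0])  # -log(0.7) / 3 ≈ 0.119
```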
LearningHorse.LossFunction.mae — Function

mae(y, t)
Mean Absolute Error. This is the expression:
\[MAE(y, t) = \frac{\sum_{i=1}^{n} |t_{i}-y_{i}|}{n}\]
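A plain-Julia sketch of the formula above (not the package's implementation; the mae_sketch name is only for this example):

```julia
# Direct transcription of the MAE formula.
mae_sketch(y, t) = sum(abs.(t .- y)) / length(y)

mae_sketch([1.0, 2.0, 3.0], [1.5, 2.0, 2.0])  # (0.5 + 0.0 + 1.0) / 3 = 0.5
```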
LearningHorse.LossFunction.huber — Function

huber(y, t; δ=1, reduction="mean")
Huber loss. When δ is large it behaves like mse, and when δ is small it behaves like mae. This is the expression:
\[a = |t_{i}-y_{i}| \\ Huber(y, t) = \frac{1}{n} \sum_{i=1}^{n} \left\{ \begin{array}{ll} \frac{1}{2}a^{2} & (a \leq \delta) \\ \delta(a-\frac{1}{2}\delta) & (a \gt \delta) \end{array} \right.\]
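A plain-Julia sketch of the two-branch formula above with the "mean" reduction (not the package's implementation; huber_sketch is a name made up for this example):

```julia
# Quadratic branch for small residuals, linear branch beyond δ.
function huber_sketch(y, t; δ=1.0)
    a = abs.(t .- y)
    losses = ifelse.(a .<= δ, 0.5 .* a .^ 2, δ .* (a .- 0.5 .* δ))
    return sum(losses) / length(losses)
end

huber_sketch([0.0, 0.0], [0.5, 3.0])  # (0.125 + 2.5) / 2 = 1.3125
```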
LearningHorse.LossFunction.logcosh_loss — Function

logcosh_loss(y, t; reduction="mean")
Log-cosh loss. It behaves like mae overall, but approaches mse when the error is small. This is the expression:
\[Logcosh(y, t) = \frac{\sum_{i=1}^{n} \log(\cosh(t_{i}-y_{i}))}{n}\]
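A plain-Julia sketch of the formula above with the "mean" reduction (not the package's implementation; logcosh_sketch is a name made up for this example):

```julia
# log(cosh(x)) ≈ x²/2 for small x and ≈ |x| - log(2) for large x.
logcosh_sketch(y, t) = sum(log.(cosh.(t .- y))) / length(y)

logcosh_sketch([0.0, 0.0], [0.1, 2.0])  # ≈ 0.665
```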
LearningHorse.LossFunction.poisson — Function

poisson(y, t; reduction="mean")
Poisson loss, the loss between the predicted values and the targets under a Poisson distribution. This is the expression:
\[Poisson(y, t) = \frac{\sum_{i=1}^{n} (y_{i}-t_{i} \ln y_{i})}{n}\]
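A plain-Julia sketch of the formula above with the "mean" reduction (not the package's implementation; poisson_sketch is a name made up for this example):

```julia
# Direct transcription of the Poisson loss; y must be strictly positive.
poisson_sketch(y, t) = sum(y .- t .* log.(y)) / length(y)

poisson_sketch([1.0, 2.0], [1.0, 1.0])  # (1 + 2 - log(2)) / 2 ≈ 1.153
```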
LearningHorse.LossFunction.hinge — Function

hinge(y, t; reduction="mean")
Hinge loss, used for SVMs. This is the expression:
\[Hinge(y, t) = \frac{\sum_{i=1}^{n} \max(1-y_{i}t_{i}, 0)}{n}\]
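A plain-Julia sketch of the formula above with the "mean" reduction (not the package's implementation; hinge_sketch is a name made up for this example). The targets t are expected to be labels in {-1, +1}:

```julia
# Direct transcription of the hinge loss formula.
hinge_sketch(y, t) = sum(max.(1 .- y .* t, 0)) / length(y)

hinge_sketch([0.8, -0.5], [1.0, 1.0])  # (0.2 + 1.5) / 2 = 0.85
```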
LearningHorse.LossFunction.smooth_hinge — Function

smooth_hinge(y, t; reduction="mean")
Smoothed hinge loss. This is the expression:
\[smoothHinge(y, t) = \frac{1}{n} \sum_{i=1}^{n} \left\{ \begin{array}{ll} 0 & (t_{i}y_{i} \geq 1) \\ \frac{1}{2}(1-t_{i}y_{i})^{2} & (0 \lt t_{i}y_{i} \lt 1) \\ \frac{1}{2} - t_{i}y_{i} & (t_{i}y_{i} \leq 0) \end{array} \right.\]
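A plain-Julia sketch of the three-branch formula above with the "mean" reduction (not the package's implementation; smooth_hinge_sketch is a name made up for this example):

```julia
# Zero branch for confident correct predictions, quadratic branch in the
# margin, linear branch for misclassified points.
function smooth_hinge_sketch(y, t)
    z = t .* y
    losses = ifelse.(z .>= 1, 0.0,
              ifelse.(z .> 0, 0.5 .* (1 .- z) .^ 2, 0.5 .- z))
    return sum(losses) / length(losses)
end

smooth_hinge_sketch([1.2, 0.5, -0.5], [1.0, 1.0, 1.0])  # (0 + 0.125 + 1.0) / 3 = 0.375
```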