Solved – cross entropy loss max value

cross-entropy, extreme-value, loss-functions

The cross entropy loss function for multiclass can be computed as:
$$-\sum\limits_{i=1}^N y_i \log \hat{y}_i$$
where $y_i$ is the one-hot indicator for class $i$ ($1$ for the true class, $0$ otherwise) and $\hat{y}_i$ is the estimated probability of class $i$. The minimum value is $0$ (when the estimated probability is $1$ for the correct class).
Does this function have a maximum value? I think it would occur when the estimated probability is $0$ for the correct class, but what probabilities should the other classes have?
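To make the formula concrete, here is a minimal sketch of the loss for a single example with three classes (the array values are made up for illustration):

```python
import numpy as np

# One-hot true labels for 3 classes; the correct class is index 1.
y = np.array([0.0, 1.0, 0.0])

# Estimated probabilities (they sum to 1).
y_hat = np.array([0.1, 0.7, 0.2])

# Cross entropy: -sum_i y_i * log(y_hat_i)
loss = -np.sum(y * np.log(y_hat))
print(loss)  # equals -log(0.7) ≈ 0.357
```

Only the term for the true class survives the sum, so the loss reduces to $-\log \hat{y}_{\text{true}}$.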

Best Answer

It doesn't have a maximum value. When $y_i = 1$ and $\hat{y}_i = 0$, the loss is infinite; more precisely, as the predicted probability for the true class goes to $0$, the loss tends to infinity. Since the range of $H(y,\hat{y})$ is $\mathbb{R}_{\geq 0}$ and $\infty \notin \mathbb{R}_{\geq 0}$, the function has no maximum (a maximum value would have to lie in the range). The probabilities of the other classes don't matter, because the corresponding $y_j$ are $0$.
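This unbounded growth is easy to verify numerically: shrink the probability assigned to the true class and watch the loss increase without limit (a small sketch, with probabilities chosen only for illustration):

```python
import numpy as np

y = np.array([0.0, 1.0, 0.0])  # true class is index 1

for p in [1e-1, 1e-3, 1e-6, 1e-12]:
    # Put probability p on the true class; split the rest over the others.
    y_hat = np.array([(1 - p) / 2, p, (1 - p) / 2])
    loss = -np.sum(y * np.log(y_hat))
    print(f"p = {p:.0e}  loss = {loss:.3f}")  # loss equals -log(p)
```

Each time $p$ shrinks by a factor, the loss grows by $\log$ of that factor, so no finite bound is ever reached.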
