Solved – How does the logarithmic scoring rule work given that it’s undefined for zero

logarithm, scoring-rules

[I'm not a mathematician, so please forgive any misuse of terminology]

One way of understanding scoring rules is that they measure the 'distance' between the truth value of a statement and the probability we assign to that statement. So, if 'it's raining' is true, that statement gets assigned 1. We find the difference between 1 and the probability we've assigned it. Let's suppose the difference is 0.3. We then plug that value into the scoring rule. If it's the Brier score, we'd square 0.3 to get 0.09. This is all well and good, but what happens if (a) we'd assigned probability 1 to the true statement and (b) we were using the logarithmic scoring rule? Since 1 − 1 = 0, and the natural logarithm is undefined at 0 … well, what does happen?
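
To make the arithmetic concrete, here is a rough sketch of what I have in mind (the outcome is coded as 1 for a true statement; the variable names are just mine):

```python
import math

outcome = 1.0    # 'it's raining' turned out to be true
forecast = 0.7   # the probability I assigned to it

# Brier score: square the difference between the truth value and the forecast
print((outcome - forecast) ** 2)     # 0.09, as in the example above

# Plugging that same difference into a logarithm works here...
print(math.log(outcome - forecast))  # log(0.3) is about -1.20

# ...but with a forecast of exactly 1 the difference is 0, and
# math.log(0.0) raises "ValueError: math domain error"
```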

Relatedly, are there other scoring rules which have this feature? The only other scoring rules I'm familiar with are the Brier and spherical rules, neither of which does. But presumably there are other scoring rules besides these (i.e., ones which aren't just linear transformations of them)?

Best Answer

The logarithmic score is taken of the probability you assigned to the outcome that actually occurred, not of the difference between the truth value and your forecast. Assigning probability 1 to a true statement therefore gives log(1) = 0, the best possible score. The problematic case is the reverse one: the logarithmic rule is the most sensitive of the common scoring rules, and to achieve that it has the property that a 'wrong' prediction made with certainty receives an infinite penalty, log(0) = −∞. To avoid that, don't allow your predictions to be as extreme as probabilities of 0 or 1. Many people choose the Brier score, which keeps the penalty finite, to avoid this problem.
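
To see the asymmetry concretely, here is a minimal sketch of both rules for a single yes/no statement (outcome coded 1/0; the function names are just placeholders):

```python
import math

def log_score(outcome, forecast):
    """Log of the probability assigned to the outcome that actually occurred."""
    p = forecast if outcome == 1 else 1.0 - forecast
    return math.log(p) if p > 0 else float("-inf")

def brier_score(outcome, forecast):
    """Squared difference between the outcome (0 or 1) and the forecast."""
    return (outcome - forecast) ** 2

print(log_score(1, 1.0))    # 0.0   -- certainty in a true statement: best possible score
print(log_score(1, 0.7))    # -0.36 -- log(0.7)
print(log_score(1, 0.0))    # -inf  -- certainty in a false statement: infinite penalty
print(brier_score(1, 0.0))  # 1.0   -- the Brier penalty stays finite
```

In practice, capping forecasts away from the extremes (say at 0.001 and 0.999) keeps the log score finite, at the cost of slightly distorting very confident predictions.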
