Loss Functions – Are Loss Functions Only Used to Evaluate Estimators?

estimators, loss-functions

Given the likelihood $p(x;\theta)$, we want to construct an estimator $\hat\theta(X)$ that takes in the observation $x$ and returns an estimate of $\theta$. There are many different ways to evaluate our estimator $\hat\theta(X)$. For example, the bias of $\hat\theta(X)$ is
$$
\mathbb{E}_{p(x;\theta)}[\hat\theta(X) - \theta]
$$

and the mean squared error (MSE) of $\hat\theta(X)$ is
$$
\mathbb{E}_{p(x;\theta)}[(\hat\theta(X) - \theta)^2]
$$
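To make these definitions concrete, here is a minimal sketch in Python (a hypothetical toy setup, assuming a normal model with known variance and the sample mean as $\hat\theta$) that approximates both quantities by Monte Carlo: simulate many data sets from $p(x;\theta)$, apply the estimator to each, and average.

```python
# Minimal sketch: Monte Carlo approximation of bias and MSE for the
# sample-mean estimator under an assumed normal model (toy example).
import numpy as np

rng = np.random.default_rng(0)

theta_true = 2.0   # parameter value we pretend not to know
sigma = 1.0        # known noise scale (an assumption of this toy example)
n = 20             # sample size per simulated data set
n_sims = 100_000   # number of Monte Carlo replications

# Draw many data sets X ~ p(x; theta) and apply the estimator to each.
X = rng.normal(loc=theta_true, scale=sigma, size=(n_sims, n))
theta_hat = X.mean(axis=1)  # the estimator \hat\theta(X): the sample mean

bias = np.mean(theta_hat - theta_true)        # E[\hat\theta(X) - theta]
mse = np.mean((theta_hat - theta_true) ** 2)  # E[(\hat\theta(X) - theta)^2]

print(f"bias ≈ {bias:.4f}")  # close to 0: the sample mean is unbiased
print(f"MSE  ≈ {mse:.4f}")   # close to sigma^2 / n = 0.05
```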

Both the bias and the MSE are functions of $\hat\theta$ and $\theta$, so at first glance they look like examples of loss functions, since a loss function is also a function of $\hat\theta$ and $\theta$.

More generally, in the context of statistics, does this mean that loss functions are only used to evaluate estimators?

Best Answer

The way you defined them uses the concept of an estimator, so it would be hard to consider them separately. However, terms such as utility and loss come from decision theory, which is much broader than statistics or machine learning because it concerns decision making in general (choosing between estimators is itself a decision).
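To illustrate that a loss function can drive the decision itself rather than merely evaluate it, here is a small sketch (a toy example with hypothetical posterior samples; none of these numbers come from the question) in which the estimate is constructed by minimizing posterior expected loss. Under squared loss the minimizer is the posterior mean, while under absolute loss it is the posterior median, so changing the loss changes the decision.

```python
# Minimal sketch: a loss function used to *construct* an estimate.
# The Bayes decision is whatever value minimizes posterior expected loss.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior samples for theta (here drawn directly from a
# normal posterior; in practice they might come from MCMC).
posterior_samples = rng.normal(loc=1.5, scale=0.4, size=20_000)

def posterior_expected_loss(decision, samples, loss):
    """Approximate E[loss(decision, theta) | data] using posterior samples."""
    return np.mean(loss(decision, samples))

squared_loss = lambda d, t: (d - t) ** 2
absolute_loss = lambda d, t: np.abs(d - t)

# Minimize posterior expected loss over a grid of candidate decisions.
grid = np.linspace(0.0, 3.0, 601)
best_sq = grid[np.argmin(
    [posterior_expected_loss(d, posterior_samples, squared_loss) for d in grid])]
best_abs = grid[np.argmin(
    [posterior_expected_loss(d, posterior_samples, absolute_loss) for d in grid])]

print(f"squared loss  -> decision ≈ {best_sq:.3f} "
      f"(posterior mean   ≈ {posterior_samples.mean():.3f})")
print(f"absolute loss -> decision ≈ {best_abs:.3f} "
      f"(posterior median ≈ {np.median(posterior_samples):.3f})")
```

In this sense the loss function is not only a yardstick for an estimator after the fact; it is part of the machinery that produces the estimator in the first place.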
