Solved – Mean squared error definition

mse, nonparametric-regression, smoothing

I'm currently working through (part of) a textbook on non-parametric regression techniques. Regarding the choice of smoothing parameter, the book starts out by explaining the MSE, which it defines as:

$MSE(\hat f(x)) = E[(\hat f(x)-f(x))^{2}]$

Would this really be the MSE? The book also mentions the mean squared error of prediction (separately) but doesn't give a definition other than to explain that MSEP is for new values $(x^{*},y^{*})$.

I'm a bit confused as I've seen the term MSEP applied to the definition of MSE given by the book.

Thanks!

Best Answer

I might be telling you something you already know, but keep in mind that really

$\hat{f}(x)=\hat{f}(x,\{X_k\})$,

where $\{X_k\}$ is the set of sample points from which you build your estimate. For most non-parametric estimators the $X_k$ are assumed independent and the estimator is additive in them, so you can just look at the $MSE$ of a single-point contribution $\hat{f}(x,X_k)$ and then take an average.

Then your formula is interpreted as

$MSE(\hat f(x)) = E\big[(\hat f(x)-f(x))^{2}\big]=\int_\Omega\big(\hat{f}(x,z)-f(x)\big)^{2}\,p(z)\,dz$,

where $p$ is the density of the sample points (written $p$ rather than $f$ to avoid a clash with the regression function $f$). So yes, this is the MSE at the point $x$.
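To make "expectation over the sample" concrete, here is a minimal Monte Carlo sketch of the pointwise MSE. It is purely my own illustration (a Nadaraya–Watson smoother with a Gaussian kernel; the true function, bandwidth, noise level and design density are arbitrary choices, not anything from your book):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # "true" regression function, chosen just for illustration
    return np.sin(2 * np.pi * x)

def nw_estimate(x0, X, Y, h):
    # Nadaraya-Watson estimate of f(x0) with a Gaussian kernel and bandwidth h
    w = np.exp(-0.5 * ((x0 - X) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

x0, h, n, sigma, reps = 0.3, 0.05, 200, 0.3, 2000

sq_errors = []
for _ in range(reps):
    X = rng.uniform(0, 1, n)             # sample points X_k drawn from their density
    Y = f(X) + rng.normal(0, sigma, n)   # noisy responses
    fhat = nw_estimate(x0, X, Y, h)
    sq_errors.append((fhat - f(x0)) ** 2)

# Monte Carlo estimate of E[(fhat(x0) - f(x0))^2], i.e. the MSE at x0
print(np.mean(sq_errors))
```

The average over the simulated samples plays the role of the expectation (the integral against $p$) in the formula above.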

As for the $MSEP$, I'm not entirely sure what your question is, but there are a couple of things it could mean. If you want the expected error when predicting a new observation $y^{*}=f(x^{*})+\varepsilon$ at a fixed $x^{*}$, then the MSEP is essentially the MSE at $x^{*}$ plus the irreducible noise variance, so the two are closely related. However, you might instead want the prediction error for a random $X^{*}$, drawn from the same density $p$ as the sample points, in which case you are averaging the MSE over that density, which gives you (a weighted version of) the $MISE$.
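Just to illustrate that numerically, here is the same toy setup again (again purely my own sketch, assuming homoscedastic noise $y = f(x) + \varepsilon$), showing that the MSEP at a fixed $x^{*}$ comes out close to $\sigma^{2}$ plus the MSE there:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # same illustrative regression function as before
    return np.sin(2 * np.pi * x)

def nw_estimate(x0, X, Y, h):
    # Nadaraya-Watson estimate at x0 with a Gaussian kernel and bandwidth h
    w = np.exp(-0.5 * ((x0 - X) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

x_star, h, n, sigma, reps = 0.3, 0.05, 200, 0.3, 2000

sq_err, sq_pred_err = [], []
for _ in range(reps):
    X = rng.uniform(0, 1, n)
    Y = f(X) + rng.normal(0, sigma, n)
    fhat = nw_estimate(x_star, X, Y, h)
    y_star = f(x_star) + rng.normal(0, sigma)   # new independent observation at x*
    sq_err.append((fhat - f(x_star)) ** 2)      # squared estimation error
    sq_pred_err.append((y_star - fhat) ** 2)    # squared prediction error

mse = np.mean(sq_err)        # MSE of fhat at x*
msep = np.mean(sq_pred_err)  # MSEP for the new pair (x*, y*)
print(msep, sigma**2 + mse)  # the two numbers should be close
```

Averaging that MSEP over random $x^{*}$ drawn from the design density would then give you, up to the $\sigma^{2}$ term, the weighted MISE mentioned above.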

Hope that helps clarify something.
