[Math] Least squares estimator of mu

regression, statistics

The question is:

Assuming that $y_i = \mu + \epsilon_i$, $i = 1,\ldots,n$, with independent and identically distributed errors $\epsilon_i$ such that $E[\epsilon_i] = 0$ and $Var[\epsilon_i] = \sigma^2$, find the least squares estimator of $\mu$. Find its variance.

I'm not sure how to go about doing this.

I know that the least squares bit means that I minimize the sum of the squared errors, so I would have to use the formula:

$$\sum_i (y_i - \mu)^2$$

and then differentiate (with respect to $\mu$?) and set it equal to $0$.

Is that correct?

Once I've done this, how would I calculate its expectation $E[\hat{\mu}]$, since I don't have any definition for $\mu$? Or is $\mu = \beta_0 + \beta_1 \cdot x_i$? If it is, then isn't the estimator the same?

Best Answer

Some hints, but not quite the full answer:

There is a difference between a parameter $\mu$ and an estimator of that parameter. So if we call the estimator $\hat{\mu}$ then you want to minimise $$\sum_i (y_i - \hat{\mu})^2$$ which is $$\sum_i y_i^2 - \sum_i 2 y_i \hat{\mu} + \sum_i \hat{\mu}^2$$ and (as you suggest) the minimum is attained where its derivative with respect to $\hat{\mu}$ is zero. Strictly speaking you should check this is a minimum, but since the derivative is monotone increasing that is obvious.
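To carry that hint one step further, here is a sketch of the differentiation (writing $\bar{y}$ for the sample mean, which is notation not used above):

$$\frac{d}{d\hat{\mu}} \sum_i (y_i - \hat{\mu})^2 = -2 \sum_i (y_i - \hat{\mu}) = -2\left(\sum_i y_i - n\hat{\mu}\right) = 0 \quad\Longrightarrow\quad \hat{\mu} = \frac{1}{n}\sum_i y_i = \bar{y},$$

so the least squares estimator is the sample mean.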

Since $y_i = \mu + \epsilon_i$, you know $E[y_i] = E[\mu] + E[\epsilon_i]$, so it will be easy to find $E[\hat{\mu}]$.
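Spelling that out a little (a sketch, using the $\hat{\mu} = \bar{y}$ found above): since $\mu$ is a constant and $E[\epsilon_i] = 0$, each $E[y_i] = \mu$, and by linearity of expectation

$$E[\hat{\mu}] = E\left[\frac{1}{n}\sum_i y_i\right] = \frac{1}{n}\sum_i E[y_i] = \mu,$$

so the estimator is unbiased.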

As for $Var(\hat{\mu})$, you again have to multiply out a square, looking at $$E\left[\left(\hat{\mu}-E[\hat{\mu}]\right)^2\right].$$ You might want to use the fact that $y_i^2 = \mu^2 + 2 \mu \epsilon_i +\epsilon_i^2$ implies $E[y_i^2] = \mu^2 + \sigma^2$.
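If it helps, a slightly more direct route than multiplying out the square (a sketch, relying on the independence of the $\epsilon_i$ and hence of the $y_i$): since $Var(y_i) = Var(\epsilon_i) = \sigma^2$,

$$Var(\hat{\mu}) = Var\left(\frac{1}{n}\sum_i y_i\right) = \frac{1}{n^2}\sum_i Var(y_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.$$

And a quick numerical sanity check of the $\sigma^2/n$ result (a minimal sketch using NumPy; the normal errors and the particular values of $\mu$, $\sigma$ and $n$ are arbitrary choices, since the problem only assumes zero mean and variance $\sigma^2$):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 50   # arbitrary true parameter, error sd, sample size
n_reps = 100_000              # number of simulated datasets

# Generate y_i = mu + eps_i for each replicate and compute the least squares
# estimator, which is the sample mean of that replicate.
y = mu + sigma * rng.standard_normal((n_reps, n))
mu_hat = y.mean(axis=1)

print("empirical variance of mu_hat:", mu_hat.var())
print("theoretical sigma^2 / n:     ", sigma**2 / n)
```

The empirical variance should come out close to $\sigma^2/n = 0.08$ here.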