The expected value of the estimator

statistics

I would like to clear up some doubts about estimators in statistics. By definition, an estimator $\hat{\theta}$ is a function of the sample that 'estimates' the value of the parameter $\theta$, and is therefore itself a random variable. An estimator is unbiased if its expected value is $\theta$.

What I'm confused about is, what exactly does it mean, the expected value of the estimator? Is it the usual

$$\sum_i \hat{\theta}_i \cdot P\left(\hat{\theta}(\mathbf{X}) = \hat{\theta}_i\right) \quad\text{or}\quad \int_{-\infty}^{\infty} t \cdot f_{\hat{\theta}}(t)\,dt$$
?

Or is it simply an expectation taken with respect to the parent distribution? As in,

$\mathbb{E}(\hat{\theta}) = \int \hat{\theta}(x)\cdot f(x)\,dx$?

Best Answer

Any estimator is a function of the data (and only of the data), so typically

$$\hat{\theta}=t(\mathbf{x})$$

and thus, by definition,

$$\mathbb{E}[\hat{\theta}]=\int_T t\,f_T(t)\,dt,$$

where $f_T$ is the sampling density of $T=t(\mathbf{X})$, i.e. the distribution of the estimator over repeated samples.
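To make "expectation over the sampling distribution" concrete, here is a minimal Monte Carlo sketch (not from the answer; the Normal/sample-mean setup and all numbers are illustrative assumptions): draw many independent samples, apply the estimator $t(\mathbf{x})$ to each, and average the results, which approximates $\mathbb{E}[\hat{\theta}]$.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_value_of_estimator(t, sample, n_rep=50_000):
    """Monte Carlo approximation of E[t(X)]: average the statistic t
    over many independently drawn samples (its sampling distribution)."""
    return np.mean([t(sample()) for _ in range(n_rep)])

# Illustrative assumption: estimate the mean mu of a Normal(mu, sigma)
# with the sample mean, using samples of size n = 10.
mu, sigma, n = 2.0, 1.0, 10
sample = lambda: rng.normal(mu, sigma, size=n)
t = np.mean  # the estimator: a function of the data only

print(expected_value_of_estimator(t, sample))  # close to 2.0: the sample mean is unbiased for mu
```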


Example...

Given a simple random sample $X_1,\dots,X_n$ from a Uniform $U(0,\theta)$, a natural estimator of $\theta$ (in fact the maximum likelihood estimator) is $t(\mathbf{x})=\max(x_1,\dots,x_n)$.

Its sampling distribution is easy to derive: $F_T(t)=P(\max_i X_i\le t)=\prod_i P(X_i\le t)=(t/\theta)^n$ for $0\le t\le\theta$, so differentiating gives

$$f_T(t)=\frac{n t^{n-1}}{\theta^n},\qquad 0\le t\le\theta.$$

thus

$$\mathbb{E}[T]=\int_0^{\theta}t\cdot\frac{n t^{n-1}}{\theta^n}\,dt=\int_0^{\theta}\frac{n t^{n}}{\theta^n}\,dt=\frac{n}{n+1}\theta,$$

so $T=\max(\mathbf{X})$ is biased (its expectation is not $\theta$), while the rescaled estimator $\frac{n+1}{n}T$ is unbiased.
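A quick numerical check of this result (a sketch; $\theta$, $n$ and the number of replications are arbitrary choices, not from the answer): simulate many samples from $U(0,\theta)$, take the maximum of each, and compare the average to $\frac{n}{n+1}\theta$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed values for the check.
theta, n, n_rep = 5.0, 8, 200_000

# Draw n_rep samples of size n from U(0, theta) and apply T = max row-wise.
samples = rng.uniform(0.0, theta, size=(n_rep, n))
T = samples.max(axis=1)

print(T.mean())              # Monte Carlo estimate of E[T]
print(n / (n + 1) * theta)   # theoretical value n/(n+1)*theta ≈ 4.44
```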
