Find an unbiased estimator based on the maximum likelihood estimator for this problem

expected value, exponential distribution, parameter estimation, statistics

Let $X_1, \ldots, X_n$ ($n \ge 2$) be a random sample from a distribution
having p.d.f.

$$f(x\mid \underline{\theta}) = \frac{1}{\sigma}e^{-(x-\mu)/\sigma}\;\;\; \text{if }
x>\mu \text{ and } 0 \text{ otherwise} \tag{1}$$

where $\underline{\theta} = (\mu, \sigma) \in \mathbb{R} \times \mathbb{R}^+$.

Let $g(\underline{\theta}) = \mu$. Find an unbiased estimator of $g(\underline{\theta})$ which is based on the Maximum Likelihood Estimator (MLE).

The likelihood function is

$$L(\theta) = \frac{1}{\sigma^n}e^{-(1/\sigma)\sum_1^n(x_i-\mu)}\;\;\; \text{if } x_{(1)} > \mu \text{ and } 0 \text{ otherwise}$$

where $x_{(1)} = \min\{x_1, x_2, \ldots, x_n\}$.

For fixed $\sigma$, $L(\theta)$ is increasing in $\mu$ on $\mu < x_{(1)}$ and zero for $\mu \ge x_{(1)}$, so it is maximized at $\mu = x_{(1)}$.

$l(\theta) = \ln(L(\theta)) = -n\ln(\sigma) - \frac{1}{\sigma}\sum_1^n(x_i-\mu)$

$\Rightarrow \frac{\partial l}{\partial \sigma} = -\frac{n}{\sigma} + \frac{1}{\sigma^2}\sum_1^n(x_i-\mu)$

Setting $\frac{\partial l}{\partial \sigma} = 0$ gives $\sigma = \frac1n\sum_1^n(x_i-\mu)$, and substituting $\mu = x_{(1)}$ gives $\hat\sigma = \frac1n\sum_1^n(x_i-x_{(1)})$

Hence, the MLE is: $$\delta_M = \left(X_{(1)}, \frac1n\sum_1^n(X_i-X_{(1)} )\right)$$
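As a quick illustration (not part of the derivation), here is a small Python sketch that computes $\delta_M$ from one simulated sample; the values $\mu = 2$, $\sigma = 3$ and $n = 50$ are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch only: compute the MLE
# delta_M = (X_(1), (1/n) * sum(X_i - X_(1))) from one simulated sample.
# mu = 2.0, sigma = 3.0 and n = 50 are arbitrary values chosen for the example.
rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 3.0, 50

x = mu + rng.exponential(scale=sigma, size=n)  # X_i = mu + Exponential(sigma)
mu_hat = x.min()                               # MLE of mu: the sample minimum
sigma_hat = np.mean(x - mu_hat)                # MLE of sigma

print(mu_hat, sigma_hat)
```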

Let $F(x)$ be the distribution function corresponding to $f(x)$.

Let $Y = X_{(1)}$ and $T = \sum_1^n(X_i-X_{(1)} )$

I want to calculate $f_Y$ and $f_T$.

I know that $f_Y(y) = n[1-F(y)]^{n-1}f(y)$, where $F$ is the distribution function corresponding to the density $f$ given in $(1)$.

$$\implies f_Y(y) = n\left[1-\left(1-e^{-(y-\mu)/\sigma}\right)\right]^{n-1}\frac{1}{\sigma}e^{-(y-\mu)/\sigma}$$

$$\implies f_Y(y) = \frac{n}{\sigma} e^{-n(y-\mu)/\sigma}\;\;\; \text{if } y>\mu \text{ and } 0 \text{ otherwise}$$

$$\implies E(Y) = \mu + \frac{\sigma}n$$
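In detail, substituting $u = y - \mu$ shows this is just the mean of an Exponential distribution with rate $n/\sigma$, shifted by $\mu$:

$$E(Y)=\int_{\mu}^{\infty}y\,\frac{n}{\sigma}e^{-n(y-\mu)/\sigma}\,dy=\mu+\int_{0}^{\infty}u\,\frac{n}{\sigma}e^{-nu/\sigma}\,du=\mu+\frac{\sigma}{n}.$$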

Now, to proceed further, I need to find $f_T$.

This is where I am getting stuck. I don't know how to find $f_T $ where $T = \sum_1^n(X_i-X_{(1)} )$.

Please help me… Related information/links will be much appreciated

Best Answer

Since the $X_i-\mu$ are i.i.d. Exponential with mean $\sigma$, $X_{(1)}-\mu$ is Exponential with mean $\frac{\sigma}{n}$.

Therefore, $$E\left[\sum_{i=1}^n(X_i-X_{(1)})\right]=\sum_{i=1}^n E\left[X_i\right]-nE\left[X_{(1)}\right]=(n-1)\sigma$$
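Spelling out the arithmetic: $E[X_i]=\mu+\sigma$ and $E[X_{(1)}]=\mu+\frac{\sigma}{n}$, so

$$\sum_{i=1}^n E[X_i]-nE[X_{(1)}]=n(\mu+\sigma)-n\left(\mu+\frac{\sigma}{n}\right)=(n-1)\sigma.$$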

Now from $E\left[X_{(1)}-\frac{\sigma}{n}\right]=\mu$ you get an unbiased estimator of $\mu$ based on the MLE by replacing $\sigma$ with its unbiased estimator $\frac{1}{n-1}\sum_{i=1}^n(X_i-X_{(1)})$:

$$E_{\mu,\sigma}\left[X_{(1)}-\frac1{n(n-1)}\sum_{i=1}^n (X_i-X_{(1)})\right]=\mu\quad,\,\forall\,\mu,\sigma$$
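A quick Monte Carlo check (a sketch only, with arbitrary values $\mu = 2$, $\sigma = 3$, $n = 5$) suggests the bias correction behaves as claimed:

```python
import numpy as np

# Monte Carlo sketch (not part of the answer above): empirically compare the
# biased MLE X_(1) with the bias-corrected estimator derived here.
# mu = 2.0, sigma = 3.0, n = 5 are arbitrary values chosen for the check.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 2.0, 3.0, 5, 200_000

x = mu + rng.exponential(scale=sigma, size=(reps, n))  # each row is one sample
x1 = x.min(axis=1)                                     # X_(1) for each sample
t = x.sum(axis=1) - n * x1                             # sum of (X_i - X_(1))

mle = x1                                 # E[X_(1)] = mu + sigma/n  (biased)
unbiased = x1 - t / (n * (n - 1))        # estimator above, E[.] = mu

print(mle.mean())       # close to mu + sigma/n = 2.6
print(unbiased.mean())  # close to mu = 2.0
```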