Probability – Understanding Convergence in Probability

probability, statistics

Maximum of a Sample from a Uniform Distribution:

Suppose $X_1, … , X_n$ is a random sample from a $\mathrm{uniform}(0,\theta)$ distribution, where $\theta$ is unknown. An intuitive estimate of $\theta$ is the maximum of the sample. Let $Y_n = \max\{X_1, … , X_n\}$. Exercise 5.1.4 shows that the cdf of $Y_n$ is

$$F_{Y_n}(t) = \begin{cases} 0, & t \leq 0, \\ \dfrac{t^n}{\theta^n}, & 0 < t \leq \theta, \\ 1, & t > \theta. \end{cases}$$
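For completeness, this cdf comes directly from the independence of the $X_i$: for $0 < t \leq \theta$,
$$F_{Y_n}(t) = P(\max\{X_1,\dots,X_n\} \leq t) = \prod_{i=1}^n P(X_i \leq t) = \left(\frac{t}{\theta}\right)^n,$$
and the pdf below is obtained by differentiating in $t$.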

Then the pdf of $Y_n$ is $f_{Y_n}(t) = \frac{nt^{n-1}}{\theta^n}$ if $0 < t \leq \theta$, and $f_{Y_n}(t) = 0$ elsewhere.

Based on its pdf, it is easy to show that $E(Y_n) = (n/(n+1))\theta$. Thus, $Y_n$ is a biased estimator of $\theta$… Further, based on the cdf of $Y_n$, it is easily seen that $Y_n$ converges to $\theta$ in probability.
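Explicitly, the expectation is a one-line integral against the pdf above:
$$E(Y_n) = \int_0^\theta t \cdot \frac{n t^{n-1}}{\theta^n}\, dt = \frac{n}{\theta^n} \int_0^\theta t^n \, dt = \frac{n}{\theta^n} \cdot \frac{\theta^{n+1}}{n+1} = \frac{n}{n+1}\,\theta,$$
so $E(Y_n) \to \theta$ even though $E(Y_n) \neq \theta$ for every finite $n$.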

MY QUESTION:

How do we know that $Y_n$ converges to $\theta$ in probability? Is it because $E(Y_n) \rightarrow \theta$?

Thanks in advance.

Best Answer

The definition of convergence in probability is given below:

Let $\{X_n\}$ be a sequence of random variables on a probability space. Then we say that $\{X_n\}$ converges in probability to $\theta$ if, for every $\epsilon > 0$, $$Pr[|X_n-\theta| \geq \epsilon]\rightarrow 0 ~\text{as}~ n\rightarrow \infty,$$ which is equivalent to $$Pr[|X_n-\theta|< \epsilon]\rightarrow 1 ~\text{as}~ n\rightarrow \infty.$$

For your problem, note that $$\begin{align}P[|Y_n-\theta|< \epsilon] &= P[\theta-\epsilon<Y_n<\theta+\epsilon]\\ &= F_{Y_n}(\theta+\epsilon)- F_{Y_n}(\theta-\epsilon)\\ &= \begin{cases}1-\left(\dfrac{\theta-\epsilon}{\theta}\right)^n, & 0< \epsilon <\theta,\\[4pt] 1-0, & \epsilon \geq\theta,\end{cases}\\ &\rightarrow 1 ~\text{as}~ n \to \infty.\end{align}$$ Hence $Y_n$ converges to $\theta$ in probability.
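If it helps to see the convergence numerically, here is a minimal Monte Carlo sketch in Python; the choices of $\theta$, $\epsilon$, the sample sizes, and the number of replications are arbitrary for the demo. It compares the empirical frequency of $\{|Y_n-\theta|\geq\epsilon\}$ with the exact tail probability $\left(\frac{\theta-\epsilon}{\theta}\right)^n$ derived above.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0      # true parameter (arbitrary choice for the demo)
eps = 0.1        # tolerance in the definition of convergence in probability
n_reps = 20_000  # Monte Carlo replications per sample size

for n in [5, 20, 100, 500]:
    # draw n_reps samples of size n from Uniform(0, theta) and take the max of each
    samples = rng.uniform(0.0, theta, size=(n_reps, n))
    y_n = samples.max(axis=1)

    # empirical estimate of P(|Y_n - theta| >= eps)
    p_hat = np.mean(np.abs(y_n - theta) >= eps)

    # exact value: P(Y_n <= theta - eps) = ((theta - eps) / theta)^n
    p_exact = ((theta - eps) / theta) ** n

    print(f"n = {n:4d}   empirical = {p_hat:.4f}   exact = {p_exact:.4f}")
```

Both columns shrink toward $0$ as $n$ grows, which is exactly the statement $P[|Y_n-\theta|\geq\epsilon]\to 0$.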