[Math] Shifted exponential limiting distribution

central-limit-theorem · convergence-divergence · statistics

Assume we have i.i.d. random variables $X_1,\dots,X_n$ with pdf $f(x;\theta)=\exp(-(x-\theta))$ for $x>\theta$ and zero elsewhere. Let $X_{(1)}=\min(X_1,\dots,X_n)$ and $S_n=X_{(1)}$. I want to find the pdf of $S_n$, its expected value, and its variance.

I calculated the pdf as $n\exp(-n(x-\theta))$ for $x>\theta$, the expected value as $E(S_n)=\theta+1/n$, and the variance as $\mathrm{var}(S_n)=1/n^2$. I then want to find sequences $a_n$ and $b_n$ such that
$$\frac{S_n-a_n}{b_n}$$
has a nondegenerate limiting distribution.
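For reference, the pdf stated above follows from the survival function of the minimum: for $x>\theta$,
$$P(S_n>x)=P(X_1>x)^n=\bigl(\mathrm e^{-(x-\theta)}\bigr)^n=\mathrm e^{-n(x-\theta)},$$
so differentiating $1-P(S_n>x)$ gives $f_{S_n}(x)=n\,\mathrm e^{-n(x-\theta)}$ for $x>\theta$. Equivalently, $S_n=\theta+E_n$ with $E_n$ exponential of rate $n$, which yields $E(S_n)=\theta+1/n$ and $\mathrm{var}(S_n)=1/n^2$ directly.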

I tried to argue via the central limit theorem that $\frac{\sqrt{n}\,(S_n-(\theta+1/n))}{1/n}$ converges to $N(0,1)$, but I am not sure the CLT applies to a minimum. How can I find the limiting distribution?

Best Answer

The mean and variance of $S_n$ suggest considering the normalization $$T_n=n\cdot(S_n-\theta).$$ Note that $E[T_n]=n\cdot(E[S_n]-\theta)=1$ and $\mathrm{var}(T_n)=n^2\cdot\mathrm{var}(S_n)=1$, hence $T_n$ might converge in distribution. Let us now check that this convergence holds.

Since $S_n\geqslant\theta$ almost surely, $T_n\geqslant0$ almost surely. For every $t\geqslant0$, $[T_n\geqslant t]=[S_n\geqslant\theta+t/n]$ has probability $P[X_1\geqslant\theta+t/n]^n=\bigl(\mathrm e^{-t/n}\bigr)^n=\mathrm e^{-t}$, hence $T_n$ converges in distribution to a standard exponential distribution. Note that actually the distribution of $T_n$ does not depend on $n$: for every $n$, it is exactly standard exponential.
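A quick Monte Carlo sketch of the claim that $T_n=n(S_n-\theta)$ is standard exponential (mean 1, variance 1). The helper `simulate_Tn` and the choice $\theta=2$ are illustrative, not part of the answer:

```python
import random

def simulate_Tn(n, theta=2.0, trials=100_000, seed=0):
    """Estimate the mean and variance of T_n = n*(S_n - theta),
    where S_n is the minimum of n draws X_i = theta + Exp(1)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        # Shifted exponentials: X_i = theta + standard exponential.
        s_n = min(theta + rng.expovariate(1.0) for _ in range(n))
        samples.append(n * (s_n - theta))
    mean = sum(samples) / trials
    var = sum((t - mean) ** 2 for t in samples) / trials
    return mean, var

mean, var = simulate_Tn(n=10)
# A standard exponential has mean 1 and variance 1, for every n.
```

Repeating with different values of `n` shows the estimates stay near 1, consistent with the distribution of $T_n$ not depending on $n$ at all.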