Convergence in cdf and pdf

Tags: convergence-divergence, cumulative-distribution-functions, probability-distributions, probability-theory

This question will probably be trivial to most. However I failed to find a clear answer on the internet.

When one says that $n$ random variables converge "in distribution", does it mean that these random variables converge to the same cumulative distribution function (cdf)? What does it say about the convergence of their probability density functions (pdf)? Can two random variables have the same cdf but different pdfs, or the other way around?

Best Answer

Given a sequence of random variables $(X_n)_{n \in \mathbb{N}}$, we say the sequence converges in distribution to the random variable $X$ if, writing $F_n$ for the CDF of $X_n$ and $F$ for the CDF of $X$, we have

$\displaystyle \lim_{n \to \infty } F_n(x) = F(x)$ at every point $x$ where $F$ is continuous.
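As a numerical sketch of this definition (the Gaussian sequence below is my own illustrative choice, not part of the answer), one can watch $F_n(x)$ approach $F(x)$ at a fixed point:

```python
# Sketch: X_n ~ Normal(0, 1 + 1/n) converges in distribution to X ~ Normal(0, 1),
# so the CDFs F_n(x) converge pointwise to F(x).  (Illustrative example.)
from scipy.stats import norm

def F_n(x, n):
    # CDF of X_n ~ Normal(mean 0, standard deviation 1 + 1/n)
    return norm.cdf(x, loc=0.0, scale=1.0 + 1.0 / n)

x = 0.5
F_x = norm.cdf(x)  # CDF of the limit X at the same point
gaps = [abs(F_n(x, n) - F_x) for n in (1, 10, 100, 1000)]
print(gaps)  # the gap |F_n(x) - F(x)| shrinks toward 0 as n grows
```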

Note: If we only have finitely many random variables $X_1, X_2, \ldots, X_m$, we can still form an infinite sequence by repeating one of them: fix some $X_i \in \{ X_1, \ldots, X_m \}$ and set $X_{m+j} = X_i$ for every $j \in \mathbb{N}$. Such a sequence trivially converges in distribution to $X_i$.

Now suppose the density $f_n(x)$ of $X_n$ exists for each random variable in our sequence. If the densities converge pointwise (almost everywhere) to a density $f$, then Scheffé's theorem tells us that convergence of the densities implies convergence in distribution: $\int |f_n - f| \, dx \to 0$, which forces $F_n(x) \to F(x)$ for every $x$.
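Scheffé's theorem can also be sketched numerically (again, the Gaussian sequence is my own illustrative choice): the densities of $X_n \sim \mathrm{Normal}(1/n, 1)$ converge pointwise to the $\mathrm{Normal}(0,1)$ density, and the total-variation distance shrinks accordingly.

```python
# Sketch of Scheffé's theorem: the densities f_n of X_n ~ Normal(1/n, 1)
# converge pointwise to the Normal(0, 1) density f, so the total-variation
# distance (1/2) * integral of |f_n - f| tends to 0, which in turn forces
# convergence in distribution.  (Illustrative example.)
import numpy as np
from scipy.stats import norm

grid = np.linspace(-10.0, 10.0, 20001)   # fine grid covering most of the mass
dx = grid[1] - grid[0]
f = norm.pdf(grid)                        # limiting density, Normal(0, 1)

def tv_distance(n):
    # Total-variation distance, approximated by a Riemann sum on the grid
    f_n = norm.pdf(grid, loc=1.0 / n)     # density of X_n ~ Normal(1/n, 1)
    return 0.5 * np.sum(np.abs(f_n - f)) * dx

dists = [tv_distance(n) for n in (1, 10, 100)]
print(dists)  # shrinks toward 0 as n grows
```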

For the last question, suppose we have two continuous random variables, each with a density. Say $X \sim F_X(z)$ and $Y \sim F_Y(z)$; then the densities are, respectively, $F'_X(z) = f_X(z)$ and $F'_Y(z) = f_Y(z)$.

Hence if $F_X(z) = F_Y(z)$ for all $z$, then $f_X(z) = f_Y(z)$ at every point where the common CDF is differentiable; since a density is only determined up to a set of measure zero, the densities agree almost everywhere. Conversely, if $f_X = f_Y$ almost everywhere, then integrating gives $F_X = F_Y$ everywhere.
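Since a density is only determined up to a set of Lebesgue measure zero, two random variables can share a CDF while their chosen density functions differ at isolated points. A standard example (my own addition, not in the original answer): take the two "versions" of the uniform density on $[0,1]$

$$f(x) = \mathbf{1}_{[0,1]}(x), \qquad g(x) = \begin{cases} 2, & x = \tfrac{1}{2}, \\ \mathbf{1}_{[0,1]}(x), & x \ne \tfrac{1}{2}. \end{cases}$$

Both integrate to the same uniform CDF $F(x) = \min(\max(x, 0), 1)$, yet $f(1/2) = 1 \ne 2 = g(1/2)$; they differ only on the null set $\{1/2\}$, so this "same cdf, different pdf" situation is possible only in this trivial sense.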