Beta distribution with parameters $\alpha = \beta \to 0$ is Bernoulli distribution

beta function, gamma function, probability, probability distributions

In the article https://en.wikipedia.org/wiki/Beta_distribution#Symmetric_(α_=_β) it is said that a Beta distribution with parameters $\alpha = \beta \to 0$ approaches a Bernoulli distribution with probability $p = 0.5$ at each of $0$ and $1$.

Formally, does this mean that the sequence $\text{Beta}(1/n, 1/n)$, $n \in \mathbb{N}$, converges to $\text{Bernoulli}(0.5)$, and what kind of convergence is it (in distribution, almost everywhere)?

How to prove this fact?

The Beta density function when $\alpha = \beta$ is:

$$f_X(x)=\frac{x^{\alpha -1} (1-x)^{\beta -1}}{B(\alpha, \beta)} = \frac{\Gamma(2\alpha)\;x^{\alpha -1} (1-x)^{\alpha -1}}{\Gamma(\alpha)^2}$$

then I need to find

$$\lim_{\alpha \to 0^+}\frac{\Gamma(2\alpha)\;x^{\alpha -1} (1-x)^{\alpha -1}}{\Gamma(\alpha)^2}$$

where $\lim_{\alpha \to 0^+}$ denotes the limit from the right, since the parameters of the Beta distribution are positive reals. We know that $\lim_{\alpha \to 0^+} \Gamma(\alpha) = +\infty$, and for every fixed $x \in (0,1)$ the factors $x^{\alpha-1}$ and $(1-x)^{\alpha-1}$ tend to the finite values $x^{-1}$ and $(1-x)^{-1}$. Since $\Gamma(2\alpha) \to +\infty$ as well, we have a limit of the form $\infty/\infty$ and can try L'Hôpital's rule, but I don't know how to apply it with the $\Gamma$ function, nor how to conclude that the limiting function is the density of a Bernoulli distribution.
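Though not a proof, the claimed behavior can be checked numerically; a minimal sketch using NumPy's Beta sampler (the threshold `eps` and the sample size are arbitrary choices of mine):

```python
import numpy as np

# Sample Beta(1/n, 1/n) for increasing n and measure how much probability
# mass lies within eps of the endpoints 0 and 1; as n grows, roughly half
# of the mass should accumulate near each endpoint.
rng = np.random.default_rng(0)
eps = 0.05
for n in (1, 10, 100):
    samples = rng.beta(1 / n, 1 / n, size=100_000)
    near_0 = np.mean(samples < eps)
    near_1 = np.mean(samples > 1 - eps)
    print(f"n={n:3d}  P(X < {eps}) ~ {near_0:.3f}  P(X > {1 - eps}) ~ {near_1:.3f}")
```

For large $n$ both endpoint frequencies approach $0.5$, consistent with a $\text{Bernoulli}(0.5)$ limit.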

Best Answer

I found a theorem that I think answers all the questions.

Theorem: If $X$ is a random variable determined by its moments and $\{ X_n \}_{n \in \mathbb{N}}$ is a sequence of random variables such that $E[X_n^k] \to E[X^k]$ for every $k \in \mathbb{N}$, then $X_n \to X$ in distribution.

The Bernoulli distribution is determined by its moments (this can be verified with Carleman's condition). The moments of a Bernoulli random variable $X$ with parameter $p$ are $E(X^k)=p$ for every $k \geq 1$, and the moments of $Y \sim \text{Beta}(\alpha, \beta)$ are:

$$ E(Y^k) = \frac{\alpha(\alpha+1) \cdots (\alpha+k-1)}{(\alpha+\beta)(\alpha+\beta+1) \cdots (\alpha+\beta+k-1) }$$
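As a sanity check on this moment formula, it can be compared against SciPy's numerically computed moments; a sketch under my own choice of parameters (the helper name `beta_moment` is mine):

```python
from scipy import stats

def beta_moment(k, a, b):
    """k-th raw moment of Beta(a, b): prod_{j=0}^{k-1} (a+j)/(a+b+j)."""
    m = 1.0
    for j in range(k):
        m *= (a + j) / (a + b + j)
    return m

# Compare the closed form with SciPy's generic moment computation.
a, b = 2.0, 3.0
for k in (1, 2, 3):
    print(k, beta_moment(k, a, b), stats.beta(a, b).moment(k))
```

For $\text{Beta}(2,3)$ the closed form gives $E[Y] = 2/5$ and $E[Y^2] = (2 \cdot 3)/(5 \cdot 6) = 1/5$, matching SciPy.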

Then, with $\alpha = \beta = 1/n \to 0$ as $n \to \infty$ and $X_n \sim \text{Beta}(1/n,1/n)$, we have:

$$ \lim_{n \to \infty}E(X_n^k)=\lim_{\alpha \to 0^+}\frac{\alpha(\alpha+1) \cdots (\alpha+k-1)}{(2\alpha)(2\alpha+1) \cdots (2\alpha+k-1) } = \lim_{\alpha \to 0^+}\frac{(\alpha+1) \cdots (\alpha+k-1)}{2\,(2\alpha+1) \cdots (2\alpha+k-1) } $$ $$=\frac{(k-1)!}{2\,(k-1)! }= \frac{1}{2} $$

By the theorem, $\{X_n\}$ converges in distribution to a Bernoulli distribution with parameter $p=1/2$.
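The limit computation above can also be verified numerically; a minimal sketch (the function name and the particular values of $k$ and $n$ are my own choices):

```python
# Each k-th moment of Beta(1/n, 1/n), computed from the closed-form
# product E[Y^k] = prod_{j=0}^{k-1} (a+j)/(2a+j) with a = 1/n,
# approaches 1/2 as n -> infinity.
def symmetric_beta_moment(k, a):
    m = 1.0
    for j in range(k):
        m *= (a + j) / (2 * a + j)
    return m

for k in (1, 2, 5):
    for n in (1, 10, 10_000):
        print(f"k={k}, n={n:6d}: E[X_n^{k}] = {symmetric_beta_moment(k, 1 / n):.6f}")
```

Note that for $k=1$ the moment is exactly $\alpha/(2\alpha) = 1/2$ for every $\alpha$; the higher moments only reach $1/2$ in the limit.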
