Moment Generating Function of Beta (Hard)

integration, moment-generating-functions, self-learning, statistical-inference

Given a random variable $X \sim \mathrm{Beta}(a, b)$, so that $X \in (0,1)$:

Does the MGF $E[e^{tX}]$ exist for every value of $a, b$?

(An MGF must be finite in order to exist.)

That is, is $E[e^{tX}]$ finite?

Update

What if $X \sim \mathrm{Beta}(a = \tfrac{1}{2}, b = 1)$?

The moment generating function is calculated as follows:

$$ M_X(t) = \mathbb{E}[e^{tX}] = \frac{\Gamma(\frac{1}{2}+1)}{\Gamma(\frac{1}{2})\,\Gamma(1)} \int_0^1 e^{tx}\, x^{\frac{1}{2}-1} (1-x)^{1-1}\, dx $$

Then, expanding $e^{tx}$ in its Taylor series and interchanging the summation and the integration:

$$ M_X(t) = \sum_{k=0}^\infty \frac{t^k}{k!\,(2k+1)} $$

where $k$ is a non-negative integer and $t \in \mathbb{R}$.
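As a numerical sanity check (not part of the original derivation), the sketch below compares partial sums of this series against a direct numerical evaluation of the integral for $\mathrm{Beta}(\tfrac12, 1)$. The substitution $x = u^2$ turns $\tfrac12 \int_0^1 e^{tx} x^{-1/2}\, dx$ into $\int_0^1 e^{t u^2}\, du$, which removes the singularity at $0$; the function names are mine, chosen for illustration:

```python
import math

def mgf_series(t, terms=60):
    """Partial sum of the series M_X(t) = sum_{k>=0} t^k / (k! * (2k+1))."""
    return sum(t**k / (math.factorial(k) * (2 * k + 1)) for k in range(terms))

def mgf_integral(t, n=100_000):
    """Numerical MGF of Beta(1/2, 1).

    With x = u^2, (1/2) * integral of e^{t x} x^{-1/2} over (0,1)
    becomes the integral of e^{t u^2} over (0,1): no singularity.
    Evaluated with the midpoint rule on n subintervals.
    """
    h = 1.0 / n
    return sum(math.exp(t * ((i + 0.5) * h) ** 2) for i in range(n)) * h

for t in (-3.0, 0.0, 1.0, 5.0):
    print(f"t={t:5.1f}  series={mgf_series(t):.6f}  integral={mgf_integral(t):.6f}")
```

The two columns should agree to several decimal places for moderate $t$, which supports the interchange of sum and integral in the derivation above.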

How can I prove that this last sum is finite or infinite? Is there a theorem?
(My math background is limited.)

Best Answer

For any random variable $X$ with $|X| \leq C$ with probability $1$, we have $Ee^{tX} \leq Ee^{|t||X|} \leq e^{C|t|} < \infty$ for every real number $t$. In particular, Beta random variables are bounded, so their MGFs exist.
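The boundedness argument can be illustrated with a Monte Carlo sketch (my own illustration, not from the answer): since $X \in (0,1)$, every sample satisfies $e^{tX} \leq e^{|t|}$, so the sample mean is automatically below the bound $e^{C|t|}$ with $C = 1$:

```python
import math
import random

random.seed(0)

def mc_mgf(a, b, t, n=200_000):
    """Monte Carlo estimate of E[e^{tX}] for X ~ Beta(a, b)."""
    return sum(math.exp(t * random.betavariate(a, b)) for _ in range(n)) / n

# X lies in (0,1), so C = 1 and E[e^{tX}] <= e^{|t|} for every real t.
for t in (-4.0, -1.0, 1.0, 4.0):
    est = mc_mgf(0.5, 1.0, t)
    print(f"t={t:5.1f}  E[e^(tX)] ~ {est:.4f}  bound e^|t| = {math.exp(abs(t)):.4f}")
```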

About the series $\sum_k \frac{t^k}{k!\,(2k+1)}$: just observe that $\left|\frac{t^k}{k!\,(2k+1)}\right| \leq \frac{|t|^k}{k!}$, so by the comparison test the series converges (absolutely) for all $t$, since $\sum_k \frac{|t|^k}{k!} = e^{|t|}$.
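The comparison-test argument can be checked term by term in a few lines (a sketch of my own; `partial_sums` is a hypothetical helper name): each term of the series is dominated in absolute value by the corresponding term of $\sum_k |t|^k/k!$, so the partial sums of the dominating series stay below $e^{|t|}$:

```python
import math

def partial_sums(t, terms=30):
    """Partial sums of the series and of its dominating series sum |t|^k / k!."""
    s_series = s_bound = 0.0
    for k in range(terms):
        term = t**k / (math.factorial(k) * (2 * k + 1))
        bound = abs(t) ** k / math.factorial(k)
        assert abs(term) <= bound  # the comparison test, term by term
        s_series += term
        s_bound += bound
    return s_series, s_bound

s, b = partial_sums(2.5)
print(f"series -> {s:.6f}, dominating series -> {b:.6f}, e^|t| = {math.exp(2.5):.6f}")
```

With 30 terms at $t = 2.5$ the dominating partial sum already agrees with $e^{2.5}$ to many decimal places, because the factorial in the denominator makes the tail negligible.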