Mathematical Statistics – Proof That Moment Generating Functions Uniquely Determine Probability Distributions

mathematical-statistics · moment-generating-function · moments · proof · references

Wackerly et al.'s text states the following theorem: "Let $m_x(t)$ and $m_y(t)$ denote the moment-generating functions of random variables $X$ and $Y$, respectively. If both moment-generating functions exist and $m_x(t) = m_y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution." No proof is given; the authors say it's beyond the scope of the text. Scheaffer and Young also state the same theorem without a proof. I don't have a copy of Casella, but a Google Books search didn't seem to find the theorem in it.

Gut's text seems to have an outline of a proof, but it appeals to "well-known results" without references, and it also requires another result whose proof is not provided.

Does anyone know who originally proved this, and whether the proof is available online anywhere? Otherwise, how would one fill in the details of this proof?

In case anyone asks: no, this is not a homework question, though I could imagine it being someone's homework. I took a course sequence based on the Wackerly text and have been left wondering about this proof for some time, so I figured it was just time to ask.

Best Answer

The general proof of this can be found in Feller (An Introduction to Probability Theory and Its Applications, Vol. 2). It is an inversion problem involving Laplace transform theory. Did you notice that the mgf bears a striking resemblance to the Laplace transform? For the use of the Laplace transform, you can see Widder (Calculus, Vol. I).
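To make that resemblance concrete in the continuous case (writing $\mathcal{L}$ for the two-sided Laplace transform of the density $f_X$, i.e. $\mathcal{L}\{f_X\}(s)=\int_{-\infty}^{\infty}e^{-sx}f_X(x)\,dx$):

$$m_X(t)=E\left[e^{tX}\right]=\int_{-\infty}^{\infty}e^{tx}f_X(x)\,dx=\mathcal{L}\{f_X\}(-t),$$

so the mgf is just the Laplace transform of the density evaluated at $-t$, and uniqueness of the distribution follows from the uniqueness (inversion) theorem for Laplace transforms.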

Proof of a special case:

Suppose that $X$ and $Y$ are random variables, both taking values only in $\{0, 1, 2,\dots, n\}$. Further, suppose that $X$ and $Y$ have the same mgf for all $t$: $$\sum_{x=0}^ne^{tx}f_X(x)=\sum_{y=0}^ne^{ty}f_Y(y)$$ For simplicity, we will let $s = e^t$ and we will define $c_i = f_X(i) - f_Y(i)$ for $i = 0, 1,\dots,n$.

Now $$\sum_{x=0}^ne^{tx}f_X(x)-\sum_{y=0}^ne^{ty}f_Y(y)=0$$ $$\Rightarrow \sum_{x=0}^ns^xf_X(x)-\sum_{y=0}^ns^yf_Y(y)=0$$ $$\Rightarrow \sum_{x=0}^ns^xf_X(x)-\sum_{x=0}^ns^xf_Y(x)=0$$ $$\Rightarrow\sum_{x=0}^ns^x[f_X(x)-f_Y(x)]=0$$ $$\Rightarrow \sum_{x=0}^ns^xc_x=0\quad\forall\, s>0$$ The left-hand side is simply a polynomial in $s$ with coefficients $c_0, c_1,\dots,c_n$. A nonzero polynomial of degree at most $n$ has at most $n$ roots, so the only way it can vanish for all $s>0$ is if $c_0=c_1=\cdots=c_n=0$. So, we have that $0=c_i=f_X(i)-f_Y(i)$ for $i=0, 1,\dots,n$.

Therefore, $f_X(i)=f_Y(i)$ for $i=0,1,\dots,n$.

In other words, the density functions of $X$ and $Y$ are exactly the same. That is, $X$ and $Y$ have the same distribution.
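As an illustrative numerical check (not part of the proof): with $s=e^t$, the mgf of a distribution on $\{0,\dots,n\}$ is a polynomial in $s$, so evaluating it at $n+1$ distinct values of $s$ gives a Vandermonde linear system whose unique solution recovers the pmf. The particular pmf and evaluation points below are arbitrary choices for the sketch.

```python
import numpy as np

# A pmf on {0, 1, ..., n} is pinned down by its mgf: with s = e^t,
# the mgf is the polynomial sum_x s**x * f(x). Evaluating it at
# n+1 distinct s-values yields an invertible Vandermonde system.

n = 4
rng = np.random.default_rng(0)
f = rng.random(n + 1)
f /= f.sum()                        # an arbitrary pmf on {0, ..., n}

s = np.linspace(0.5, 2.5, n + 1)    # n+1 distinct values of s = e^t
V = np.vander(s, n + 1, increasing=True)  # V[j, x] = s[j] ** x
mgf_vals = V @ f                    # mgf evaluated at each s[j]

# Solve the Vandermonde system; the unique solution is the pmf itself.
f_recovered = np.linalg.solve(V, mgf_vals)
assert np.allclose(f_recovered, f)
```

The assertion passing reflects the argument above: since distinct $s$-values make the Vandermonde matrix invertible, equal mgfs force equal coefficients, i.e. equal pmfs.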