Question about proof of MGF of sample mean (trivial double-check)

moment-generating-functions, random-variables, sampling, statistical-inference, statistics

There is a part of the proof that I am unsure about for the following theorem:

Say you take a random sample $X_1,\dots,X_n$ of a random variable $X$ with probability distribution $\mathbb{P}$, i.e. $X\sim\mathbb{P}$. Now suppose that this random variable $X$ has a Moment Generating Function $M_X$ on an interval $I$.

What this theorem says is that $M_{\bar{X}}(t)$, the Moment Generating Function of the sample mean evaluated at $t$, equals $(M_X(t/n))^n$: the Moment Generating Function of $X$ itself, evaluated at $t/n$, raised to the power $n$, for any $t$ such that $t/n\in I$.

Here is the proof: fix an arbitrary $t$ such that $t/n\in I$. Then $M_{\bar{X}}(t)=\mathrm{E}(e^{t\bar{X}})=\prod_{i=1}^n\mathrm{E}(e^{(t/n) X_i})=(M_X(t/n))^n$.
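As a quick sanity check of the identity (just an illustration, not part of the proof): if $X\sim N(\mu,\sigma^2)$, then $M_X(t)=\exp(\mu t+\sigma^2 t^2/2)$ for all $t$, so

$$(M_X(t/n))^n = \exp\!\left(n\left(\mu\frac{t}{n} + \frac{\sigma^2 t^2}{2n^2}\right)\right) = \exp\!\left(\mu t + \frac{\sigma^2}{2n}t^2\right),$$

which is exactly the MGF of $N(\mu,\sigma^2/n)$, the known distribution of $\bar{X}$ in the normal case.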

Straightforward enough, but could somebody double-check my logic behind the last equal sign? Is it just the definition of a random sample being a collection of i.i.d. random variables?

Also, is the requirement that $t/n\in I$ related to the fact that the MGF must exist in a neighborhood of $0$?

Best Answer

\begin{align}
M_{\bar{X}}(t) &= E[e^{t\bar{X}}] & \text{defn. of $M_{\bar{X}}$} \\
&= E\left[\prod_{i=1}^n e^{\frac{t}{n} X_i}\right] & \text{$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i$} \\
&= \prod_{i=1}^n E\left[e^{\frac{t}{n} X_i}\right] & \text{$X_i$ are independent} \\
&= \prod_{i=1}^n M_{X_i}(t/n) & \text{defn. of $M_{X_i}$} \\
&= (M_{X}(t/n))^n & \text{$X_i$ are identically distributed, so $M_{X_i} = M_X$}
\end{align}
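If a numerical check helps convince you, here is a minimal Monte Carlo sketch of the identity, assuming $X\sim\text{Exponential}(1)$ so that $M_X(t)=1/(1-t)$ on $I=(-\infty,1)$; the distribution, sample size, and evaluation point are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: X ~ Exponential(1), whose MGF is
# M_X(t) = 1 / (1 - t) for t < 1, so we need t/n < 1.
n = 5            # sample size
t = 2.0          # evaluation point; t/n = 0.4 lies inside I
reps = 1_000_000 # number of simulated samples

def M_X(t):
    # Closed-form MGF of Exponential(1), valid for t < 1
    return 1.0 / (1.0 - t)

# Monte Carlo estimate of M_{Xbar}(t) = E[exp(t * Xbar)]
samples = rng.exponential(scale=1.0, size=(reps, n))
xbar = samples.mean(axis=1)
mc_estimate = np.exp(t * xbar).mean()

print("Monte Carlo E[exp(t * Xbar)]:", mc_estimate)
print("(M_X(t/n))^n:               ", M_X(t / n) ** n)
```

The two printed numbers should agree up to simulation noise (both are near $(1-0.4)^{-5}\approx 12.86$ with these choices).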
