Independence of two random variables (by checking joint generating function)

generating-functions, independence, probability-theory, random-variables, solution-verification

In my textbook it says that two random variables $X$ and $Y$ are independent if and only if
$G_{X,Y}(s,t) = G_X(s) \cdot G_Y(t)$ (where $G_X$ and $G_Y$ are the probability generating functions of $X$ and $Y$, and $G_{X,Y}$ is their joint probability generating function).

I'm trying to prove this statement; this is what I have so far.

$\Rightarrow$ If $X$ and $Y$ are independent, then $G_{X,Y}(s,t) = \mathbb{E}[s^X t^Y]= \mathbb{E}[s^X]\mathbb{E}[t^Y] = G_X(s) G_Y(t)$.

$\Leftarrow$ If $G_{X,Y}(s,t) = G_X(s) G_Y(t),$ then $\sum_{k=0}^\infty \sum_{j=0}^\infty s^k t^j p_{X,Y}(k,j) = \sum_{k=0}^\infty s^k \sum_{j=0}^\infty t^j p_{X,Y}(k,j)= \sum_{k=0}^\infty s^k p_X(k) \sum_{j=0}^\infty t^j p_Y(j).$

Am I able to directly conclude that $p_{X,Y}(k,j) = p_X(k)p_Y(j)$ from the equation above? I know that I have to somehow equate the coefficients and deduce that the joint distribution is the product of the marginals. I'm just not quite sure how to proceed.
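One way that last step can be made rigorous (a sketch, assuming $|s|,|t|\le 1$ so that every series converges absolutely and can be rearranged freely): the hypothesis says that the two power series

$$\sum_{k=0}^\infty\sum_{j=0}^\infty p_{X,Y}(k,j)\,s^k t^j \;=\; \sum_{k=0}^\infty\sum_{j=0}^\infty p_X(k)\,p_Y(j)\,s^k t^j$$

agree for all $|s|,|t|\le 1$, and two power series in $(s,t)$ that agree on a neighborhood of the origin have identical coefficients, which would give $p_{X,Y}(k,j)=p_X(k)\,p_Y(j)$ for every $k,j$. The answer below extracts those coefficients explicitly by differentiating at the origin.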

Best Answer

$\def\eq{\,{=}\,}$The probability mass function of a discrete integer-valued random variable is recovered by taking derivatives of its probability generating function, viz.:$$\begin{align}\mathsf P(X\eq k) &=\dfrac{\mathsf G_X^{(k)}(0)}{k!}&&\big[k\in\Bbb N\big]\\[1ex]\mathsf P(Y\eq j)&=\dfrac{\mathsf G_Y^{(j)}(0)}{j!}&&\big[j\in\Bbb N\big]\\[1ex]\mathsf P(X\eq k,Y\eq j)&=\dfrac{\mathsf G_{X,Y}^{(k,j)}(0,0) }{k!~j!}&&\big[k\in\Bbb N,j\in\Bbb N\big]\end{align}$$
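For completeness, a sketch of where the first of these formulas comes from (the other two follow the same way); term-by-term differentiation is justified inside the radius of convergence, which is at least $1$ for a PGF:

$$\mathsf G_X(s)=\sum_{n=0}^\infty \mathsf P(X= n)\,s^n \;\Longrightarrow\; \mathsf G_X^{(k)}(s)=\sum_{n=k}^\infty \frac{n!}{(n-k)!}\,\mathsf P(X= n)\,s^{n-k},$$

and setting $s=0$ kills every term except $n=k$, leaving $\mathsf G_X^{(k)}(0)=k!\,\mathsf P(X= k)$.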


So your task is to show that: $$\mathsf G_{X,Y}(s,t)=\mathsf G_X(s)\cdot\mathsf G_Y(t)\implies \mathsf G_{X,Y}^{(k,j)}(0,0)=\mathsf G_X^{(k)}(0)\cdot\mathsf G_Y^{(j)}(0)$$
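A sketch of how that goes through (assuming, as is standard for PGFs, that both sides are defined and equal for all $|s|,|t|\le 1$, so the mixed partial derivatives at the origin exist): the right-hand side is a product of a function of $s$ alone and a function of $t$ alone, so

$$\frac{\partial^{\,k+j}}{\partial s^k\,\partial t^j}\Big[\mathsf G_X(s)\,\mathsf G_Y(t)\Big]=\mathsf G_X^{(k)}(s)\,\mathsf G_Y^{(j)}(t),$$

and evaluating at $(s,t)=(0,0)$ gives $\mathsf G_{X,Y}^{(k,j)}(0,0)=\mathsf G_X^{(k)}(0)\,\mathsf G_Y^{(j)}(0)$. Dividing by $k!\,j!$ then yields $\mathsf P(X=k,\,Y=j)=\mathsf P(X=k)\,\mathsf P(Y=j)$ for all $k,j$, which is independence.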



$\small\mathsf G_Z^{(i)}(0)\mathop{:=}\left.\dfrac{\mathrm d^i\mathsf G_Z(r)}{\mathrm d r^i}\right\vert_{r=0}\qquad \mathsf G_{X,Y}^{(k,j)}(0,0)\mathop{:=}\left.\dfrac{\partial^{\,k+j}\mathsf G_{X,Y}(s,t)}{\partial s^k\,\partial t^j}\right\vert_{(s,t)=(0,0)}$
