[Math] Intuition for using the derivative of a generating function to calculate the expected value

generating-functions, intuition, random-variables, statistics

I'm currently studying discrete random variables, and I've reached the topic of using generating functions to calculate the expected value.

Given that a generating function is a polynomial of the form $g_x(s)=\sum_{k=0}^n P(X=k)s^k$, where the coefficient in front of the $k$th power of $s$ is the probability that the random variable $X$ equals $k$, we can calculate the expected value of the discrete random variable from the derivative of the generating function at $1$:

$$EX = g_x'(1)$$
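For a concrete check, here is a minimal sketch in Python with sympy (the fair-die example is my own illustration, not part of the question): differentiating brings a factor of $k$ down from each $s^k$, so evaluating the derivative at $s=1$ leaves exactly $\sum_k k\,P(X=k)$.

```python
import sympy as sp

s = sp.symbols('s')

# Fair six-sided die: P(X = k) = 1/6 for k = 1, ..., 6
g = sum(sp.Rational(1, 6) * s**k for k in range(1, 7))

# Differentiating s^k gives k*s^(k-1); at s = 1 each term
# contributes k * P(X = k), so g'(1) = E[X]
EX = sp.diff(g, s).subs(s, 1)
print(EX)  # 7/2
```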

I understand the proof for this, but I can't seem to get the intuition behind it.

My questions are:

  • What is the intuitive understanding of this? How was this result achieved?
  • Is there a visual way to interpret the statement? The generating function is a polynomial, so how come its first derivative at $1$ is $EX$, while the variance is $g''_x(1)+g'_x(1)-g'_x(1)^2$? (A numeric check of the variance formula follows this list.)
  • If getting intuition for this statement requires more work, what prerequisite theory would you advise me to study in order to understand it?
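Here is a quick numeric check of the variance formula, again with sympy and the same hypothetical die example (my own addition, assuming the standard identity $\operatorname{Var}(X)=E[X^2]-(E[X])^2$):

```python
import sympy as sp

s = sp.symbols('s')

# Same fair six-sided die as above
g = sum(sp.Rational(1, 6) * s**k for k in range(1, 7))

g1 = sp.diff(g, s).subs(s, 1)     # g'(1)  = E[X]
g2 = sp.diff(g, s, 2).subs(s, 1)  # g''(1) = E[X(X-1)]

print(g2 + g1 - g1**2)  # 35/12, from the formula in the question

# Direct computation of Var(X) = E[X^2] - (E[X])^2 for comparison
EX2 = sum(sp.Rational(k**2, 6) for k in range(1, 7))
print(EX2 - g1**2)      # 35/12
```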

Thanks in advance!

Best Answer

Another approach that brings some extra insight: the definition of the moment generating function is essentially the same as that of the Laplace transform (for continuous rvs) and the Z-transform (for discrete rvs).

This leads to the connection between derivatives of the moment generating function (the transform domain) and moments of the rv (the original domain), mirroring the standard transform property that differentiation in the transform domain corresponds to multiplication by the variable in the original domain; it also means MGFs can often be looked up directly in transform tables.
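As a concrete sketch of that connection (my own example with sympy; the exponential rv with rate 1 is an assumption, not from the answer): the MGF $M(t)=E[e^{tX}]$ is the Laplace transform of the density evaluated at $-t$, and its derivatives at $t=0$ give the moments.

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Exponential rv with rate 1: density f(x) = exp(-x) for x >= 0
f = sp.exp(-x)

# MGF M(t) = E[e^{tX}] = integral of e^{tx} f(x) dx over [0, oo);
# this is the (one-sided) Laplace transform of f evaluated at -t
M = sp.simplify(sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none'))
# M == 1/(1 - t), valid for t < 1

# Derivatives at t = 0 give the moments: E[X] = 1, E[X^2] = 2
print(sp.diff(M, t).subs(t, 0))     # 1
print(sp.diff(M, t, 2).subs(t, 0))  # 2
```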
