[Math] Recovering a random variable from its moments

probability

The problem is: can you recover the distribution of a random variable if you know all of its moments?

My first guess was to use the moment-generating function (MGF). It is known that if two random variables have the same MGF, then they have the same distribution. So if $M_X(t) = \mathbb{E}\, e^{tX} = 1 + \sum_{n = 1}^\infty \mathbb{E}[X^n] \frac{t^n}{n!}$ converges, then it uniquely corresponds to some distribution; we can apply the inverse Laplace transform and get the PDF of $X$, which solves the problem (a sketch of this route in a case where it works is given after the list below).
But there are some issues:

  1. The MGF may not exist.

  2. Even if the MGF exists, the approach via the inverse Laplace transform works only for continuous random variables.
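
For concreteness, here is a small sketch (my own illustration, not part of the original question) of the MGF-plus-inverse-Laplace route in a case where it does work. It assumes the known moments are $\mathbb{E}[X^n] = n!$, i.e. those of an $\mathrm{Exp}(1)$ variable, so the moment series sums to $M_X(t) = 1/(1-t)$ for $t < 1$:

```python
import sympy as sp

t, s = sp.symbols('t s')
x = sp.symbols('x', positive=True)

# Suppose the known moments are E[X^n] = n! (those of an Exp(1) variable).
# Then 1 + sum_n E[X^n] t^n / n! = sum_n t^n sums to 1/(1 - t) for t < 1.
mgf = 1 / (1 - t)

# For a nonnegative X, M_X(t) = E[e^{tX}] is the Laplace transform of the PDF
# evaluated at s = -t, so the PDF is the inverse Laplace transform of M_X(-s).
pdf = sp.inverse_laplace_transform(mgf.subs(t, -s), s, x)
print(pdf)  # exp(-x) (times a Heaviside factor for x > 0): the Exp(1) density
```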

Can you help me with this problem? Thank you!

Best Answer

The characteristic function may be what you need. It is closely related to the MGF, but it always exists, and there is a one-to-one correspondence between characteristic functions and cumulative distribution functions. See, for example, the Wikipedia article on characteristic functions.
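
To make that correspondence concrete, here is a small numerical sketch (my illustration, not part of the original answer) that recovers a CDF from its characteristic function via the Gil-Pelaez inversion formula $F(x) = \frac{1}{2} - \frac{1}{\pi}\int_0^\infty \frac{\operatorname{Im}\!\left(e^{-itx}\varphi(t)\right)}{t}\,dt$, using the standard normal ($\varphi(t) = e^{-t^2/2}$) as a check:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def cf_std_normal(t):
    # Characteristic function of the standard normal distribution.
    return np.exp(-0.5 * t**2)

def cdf_from_cf(x, cf):
    # Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} cf(t)) / t dt.
    # The singularity at t = 0 is removable, and the Gaussian decay of cf makes
    # the improper integral easy for quad.
    integrand = lambda t: np.imag(np.exp(-1j * t * x) * cf(t)) / t
    integral, _ = quad(integrand, 0, np.inf, limit=200)
    return 0.5 - integral / np.pi

for x in (-1.0, 0.0, 1.5):
    print(x, cdf_from_cf(x, cf_std_normal), norm.cdf(x))
# The recovered values agree with the exact normal CDF to several decimal places.
```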
