Solved – Kurtosis/4th central moment in terms of mean and variance

kurtosis, mean, moments, variance

Is it possible to express the kurtosis $\kappa$, or the 4th central moment $\mu_4$, of a random variable $X$ in terms of its mean $\mu = E(X)$ and variance $\sigma^2 = Var(X)$ only, without having to particularize to any distribution?

I mean, an expression like $\kappa = f(\mu, \sigma^2)$ or $\mu_4 = g(\mu, \sigma^2)$, valid for any distribution, where $f(\mu, \sigma^2)$ and $g(\mu, \sigma^2)$ are functions of the mean $\mu$ and variance $\sigma^2$.

P.S.: Some comments on my attempts.

$\kappa$ is related to $\mu_4$, and $\mu_4 = E(X^4) - 4\mu E(X^3) + 6\mu^2 E(X^2) - 3\mu^4$.

The term $E(X^2)$ can be written as $E(X^2) = \mu^2 + \sigma^2$, but I couldn't find a way to express $E(X^3)$ and $E(X^4)$ in terms of $\mu$ and $\sigma^2$.
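The expansion of $\mu_4$ in raw moments can be checked numerically. A minimal sketch (assuming NumPy; the exponential sample is an arbitrary choice — any distribution with finite fourth moment works):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)  # arbitrary test distribution

mu = x.mean()
# direct fourth central moment: E[(X - mu)^4]
mu4_direct = ((x - mu) ** 4).mean()
# raw-moment expansion: E(X^4) - 4 mu E(X^3) + 6 mu^2 E(X^2) - 3 mu^4
mu4_expanded = ((x**4).mean() - 4 * mu * (x**3).mean()
                + 6 * mu**2 * (x**2).mean() - 3 * mu**4)

print(mu4_direct, mu4_expanded)  # agree up to floating-point error
```

The two quantities agree to floating-point precision, confirming the algebraic identity on any sample.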

Best Answer

What you are thinking about here is something like a philosopher's stone of statistics.

The strict answer is:

No, it is impossible to express skewness or kurtosis via the mean and variance.

@Macro gave a counterexample: distributions with the same mean and variance but different skewness and kurtosis. The question of constructing a distribution to match a given set of moments has occupied statisticians since the field's early days, and Pearson's system of frequency curves is one example of how one could come up with a continuous distribution for the numeric values of the first four moments.

You could also look at the moment generating function $m(t)={\rm E}[\exp(Xt)]$, the characteristic function $\phi(t)={\rm E}[\exp(iXt)]$, or the cumulant generating function $\psi(t) = \ln \phi(t)$. With some luck, you can plug your four moments into them and invert these functions to obtain an explicit expression for the density. Finally, you can always find a distribution with discrete support on five points that satisfies the five equations for the moments of order 0 through 4, by solving the corresponding system of nonlinear equations.
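The counterexample is easy to reproduce numerically. A sketch (the specific pair — a standard normal versus a uniform rescaled to unit variance — is my own illustrative choice, not @Macro's exact pair): both distributions below have mean 0 and variance 1, yet their kurtosis differs, so no function $f(\mu, \sigma^2)$ can recover it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

def kurtosis(x):
    """Sample kurtosis mu_4 / sigma^4 (Pearson, non-excess)."""
    m = x.mean()
    return ((x - m) ** 4).mean() / x.var() ** 2

normal = rng.normal(0.0, 1.0, n)                     # mean 0, variance 1
uniform = rng.uniform(-np.sqrt(3), np.sqrt(3), n)    # mean 0, variance (2*sqrt(3))^2/12 = 1

print(kurtosis(normal))   # theoretical value: 3
print(kurtosis(uniform))  # theoretical value: 9/5 = 1.8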

To express the higher-order moments via the lower-order ones, you need to know the shape of the distribution and its parameters. For one-parameter (Poisson, exponential, geometric) or two-parameter (normal, gamma, binomial) distributions, you can express the higher-order moments via the natural parameters of these distributions; e.g., for a Poisson with rate $\lambda$, the skewness is $\lambda^{-1/2}$ and the excess kurtosis is $\lambda^{-1}$ (sanity check: both go to zero as $\lambda \to \infty$, consistent with the normal approximation to the Poisson for large $\lambda$). But these exceptions should not fool you: for more interesting distributions, including anything from the real world, you can forget about doing anything meaningful with the kurtosis.
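The Poisson sanity check can be verified by simulation. A sketch (assuming NumPy; the rates and sample size are arbitrary choices), comparing sample skewness and excess kurtosis against the theoretical values $\lambda^{-1/2}$ and $\lambda^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(2)

for lam in (1.0, 4.0, 25.0):
    x = rng.poisson(lam, size=2_000_000).astype(float)
    m, s = x.mean(), x.std()
    skew = ((x - m) ** 3).mean() / s**3          # theory: lam ** -0.5
    excess_kurt = ((x - m) ** 4).mean() / s**4 - 3.0  # theory: 1 / lam
    print(lam, skew, excess_kurt)
```

Both statistics shrink toward 0 as $\lambda$ grows, matching the normal approximation mentioned above.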