Statistics – Kurtosis of Sum of Independent Random Variables

estimation, probability theory, statistics

Suppose that $X$ and $Y$ are independent random variables with different expected values and variances. Suppose we define kurtosis as

$$\operatorname{Kurt}(X)=\frac{E[(X-\mu)^4]}{E[(X-\mu)^2]^2}$$

My question is: what is $\operatorname{Kurt}(X+Y)$?

Wikipedia claims that
$$\operatorname{Kurt}\left(\sum_{i=1}^n X_i \right) = \sum_{i=1}^n \frac{\sigma_i^{\,4} \cdot \operatorname{Kurt}(X_i)}{\left( \sum_{j=1}^n \sigma_j^{\,2} \right)^2}$$
I have not been able to show this, and it repeatedly fails in simulation: if I simulate two independent random variables and apply Wikipedia's formula, it never holds, whereas well-known results like $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$ always do. This leads me to believe that the Wikipedia result is wrong. Can someone prove what the result should be?
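The simulation check described above can be sketched as follows. This is a minimal Monte Carlo sketch (the choice of distributions, sample size, and seed are mine): it compares the empirical kurtosis of $X+Y$ with the value predicted by the quoted formula, and also confirms that variance additivity does hold on the same samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# Two independent variables with different means and variances (my choice):
# X ~ Exponential(scale=2), so sigma_X^2 = 4, Kurt(X) = 9
# Y ~ Normal(5, 3),         so sigma_Y^2 = 9, Kurt(Y) = 3
X = rng.exponential(scale=2.0, size=n)
Y = rng.normal(loc=5.0, scale=3.0, size=n)

def kurt(Z):
    """Plain (non-excess) kurtosis E[(Z - mu)^4] / E[(Z - mu)^2]^2."""
    d = Z - Z.mean()
    return np.mean(d**4) / np.mean(d**2) ** 2

sx2, sy2 = X.var(), Y.var()

# The formula as quoted from Wikipedia (population value here is 387/169 ~ 2.29)
wiki = (sx2**2 * kurt(X) + sy2**2 * kurt(Y)) / (sx2 + sy2) ** 2

# Direct simulation (population value here is 603/169 ~ 3.57)
simulated = kurt(X + Y)

print(f"simulated Kurt(X+Y) = {simulated:.4f}")
print(f"quoted formula      = {wiki:.4f}")
# Variance additivity, by contrast, does hold:
print(f"Var(X+Y) = {np.var(X + Y):.4f}  vs  Var(X)+Var(Y) = {sx2 + sy2:.4f}")
```

The gap between the two kurtosis values is large and does not shrink with more samples, while the variance identity matches to within sampling noise.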

Best Answer

Suppose that $X$ and $Y$ are independent. Then
$$
\begin{aligned}
\operatorname{Kurt}(X+Y)
&= \frac{E\left[\left(X+Y-(\mu_X+\mu_Y)\right)^4\right]}{E\left[\left(X+Y-(\mu_X+\mu_Y)\right)^2\right]^2}
 = \frac{E\left[\left((X-\mu_X)+(Y-\mu_Y)\right)^4\right]}{\operatorname{Var}(X+Y)^2} \\[4pt]
&= \frac{E\left[(X-\mu_X)^4 + 4(X-\mu_X)^3(Y-\mu_Y) + 6(X-\mu_X)^2(Y-\mu_Y)^2 + 4(X-\mu_X)(Y-\mu_Y)^3 + (Y-\mu_Y)^4\right]}{\left(\operatorname{Var}(X)+\operatorname{Var}(Y)\right)^2} \\[4pt]
&= \frac{E\left[(X-\mu_X)^4\right] + E\left[(Y-\mu_Y)^4\right] + 6\,\sigma_X^2\sigma_Y^2}{\left(\sigma_X^2+\sigma_Y^2\right)^2}
 = \frac{\operatorname{Kurt}(X)\,\sigma_X^4 + \operatorname{Kurt}(Y)\,\sigma_Y^4 + 6\,\sigma_X^2\sigma_Y^2}{\left(\sigma_X^2+\sigma_Y^2\right)^2},
\end{aligned}
$$
where the odd cross terms vanish by independence, e.g. $E\left[(X-\mu_X)^3(Y-\mu_Y)\right] = E\left[(X-\mu_X)^3\right]E\left[Y-\mu_Y\right] = 0$, and $E\left[(X-\mu_X)^2(Y-\mu_Y)^2\right] = \sigma_X^2\sigma_Y^2$,

which I believe shows that the result on Wikipedia is wrong.
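For what it's worth, Wikipedia states an additivity identity of this shape for *excess* kurtosis, $\operatorname{Kurt}-3$, which is additive because fourth cumulants add for independent variables; the derived result above is consistent with that version. A quick symbolic check (a sketch using sympy; the symbol names are mine):

```python
import sympy as sp

# K_x, K_y: plain kurtoses; sigma_x2, sigma_y2: variances of X and Y
Kx, Ky, sx2, sy2 = sp.symbols('K_x K_y sigma_x2 sigma_y2', positive=True)

# The result derived in the answer above
kurt_sum = (Kx * sx2**2 + Ky * sy2**2 + 6 * sx2 * sy2) / (sx2 + sy2) ** 2

# The same identity restated for excess kurtosis (Kurt - 3)
excess_version = ((Kx - 3) * sx2**2 + (Ky - 3) * sy2**2) / (sx2 + sy2) ** 2

# The two agree: Kurt(X+Y) - 3 equals the excess-kurtosis identity
print(sp.simplify(kurt_sum - 3 - excess_version))  # 0
```

So the quoted formula fails for plain kurtosis because it drops both the $-3$ shifts and the $6\sigma_X^2\sigma_Y^2$ cross term, which cancel exactly when everything is written in excess form.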
