Suppose $X$ is a random variable with mean $0$ and variance $\sigma_x^2$.
How can I calculate mean and variance of $X^2$?
I calculated the mean like this:
\begin{equation*}
\operatorname{Var}(X) = \operatorname{E}(X^{2}) - [\operatorname{E}(X)]^{2} \implies \operatorname{E}(X^{2}) = \sigma_{x}^2,
\end{equation*}
but am stuck at the variance.
Best Answer
If all you know about $X$ is its mean $0$ and variance $\sigma_x^2$, there is insufficient information to determine the variance of $X^2$, which is
$$ E[X^4] - E[X^2]^2 = E[X^4] - \sigma_x^4. $$
For a normal R.V. $\sim N(0, \sigma_x^2)$, $E[X] = 0, E[X^2] = \sigma_x^2, E[X^4] = 3 \sigma_x^4$.
For a uniform R.V. $\sim U(-\sqrt{3}\, \sigma_x, \sqrt{3}\, \sigma_x)$, $E[X] = 0, E[X^2] = \sigma_x^2, E[X^4] = \frac{9}{5} \sigma_x^4$.
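A quick Monte Carlo sketch can confirm the two fourth moments above; the seed, sample size, and $\sigma_x = 2$ are illustrative choices, not part of the answer:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma, n = 2.0, 1_000_000  # illustrative sigma_x and sample size

# Normal N(0, sigma^2): E[X^4] = 3*sigma^4, so Var(X^2) = 2*sigma^4
x_norm = rng.normal(0.0, sigma, n)
print(np.var(x_norm**2))   # ~ 2 * sigma**4 = 32

# Uniform U(-sqrt(3)*sigma, sqrt(3)*sigma) has variance sigma^2 and
# E[X^4] = (9/5)*sigma^4, so Var(X^2) = (9/5 - 1)*sigma^4 = 0.8*sigma^4
a = np.sqrt(3) * sigma
x_unif = rng.uniform(-a, a, n)
print(np.var(x_unif**2))   # ~ 0.8 * sigma**4 = 12.8
```

Both empirical variances land near the closed-form values, and the two answers differ even though both distributions share the same mean and variance.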
It's easy to build distributions for which this is infinite: for example, a Student's $t$ distribution with 3 degrees of freedom (rescaled to have variance $\sigma_x^2$) has mean $0$ and finite variance, but $E[X^4] = \infty$, so $\operatorname{Var}(X^2)$ is infinite.