Why is it true that $$\mathbb{E}\left[ X \right]=0,\, \operatorname{var}\left[ X \right]=1 \Leftrightarrow \mathbb{E}\left[ X^{2} \right]=1?$$
Expectation of a random variable squared
Related Solutions
I think the key point I was missing was that your $X_{i}$'s are identically distributed. That means moments like the expectation and variance do not depend on $i$: we have $E X_{1}=E X_{2}=\cdots=E X_{n}$, so we can replace every instance of $E X_{i}$ with $E X_{1}$, and the same is true for the variances. Here $\hat{\mu}_{n}$ is the sample average of $X_{1}, \ldots, X_{n}$, and $E X$ is the expectation. For discrete $X$, this is $\sum_{j} P\left(X=x_{j}\right) x_{j}$; for more general $X$ it is an integral.
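The discrete formula above is just a probability-weighted sum. A minimal sketch in Python; the support values and probabilities below are made up purely for illustration:

```python
# E[X] = sum_j P(X = x_j) * x_j for a discrete random variable.
# Hypothetical support and probabilities, chosen only to illustrate the formula.
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]  # must sum to 1

ex = sum(p * x for p, x in zip(probs, values))
print(ex)  # → 3.0 (up to floating-point rounding)
```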
You seem to be confusing the weighted sum of random variables $$Y = \pi X_1 + (1-\pi)X_2 \tag{1}$$ and a mixture density of random variables $$f_Y(y) = \pi f_{X_1}(y) + (1-\pi) f_{X_2}(y). \tag{2}$$
These are not the same thing. The first one is a weighted sum of the outcomes; the second one is a weighted sum of the probability density functions. Judging from your notation, you seem to be referring to the latter, but your method of solution seems to imply the former.
For the sake of clarity, suppose $X_1$ and $X_2$ are independent random variables, and have means $\mu_1$, $\mu_2$ and variances $\sigma_1^2$, $\sigma_2^2$, respectively. Let $f_{X_1}$ and $f_{X_2}$ be their probability densities. Then by linearity of expectation, the weighted sum in $(1)$ obeys $$\operatorname{E}[Y] = \pi \operatorname{E}[X_1] + (1-\pi) \operatorname{E}[X_2] = \pi \mu_1 + (1-\pi) \mu_2, \tag{3}$$ $$\operatorname{Var}[Y] \overset{\text{ind}}{=} \pi^2 \operatorname{Var}[X_1] + (1-\pi)^2 \operatorname{Var}[X_2] = \pi^2 \sigma_1^2 + (1-\pi)^2 \sigma_2^2. \tag{4}$$ Note that the variance splits this way only because $X_1$ and $X_2$ are independent; otherwise $(4)$ would pick up a cross term $2\pi(1-\pi)\operatorname{Cov}(X_1, X_2)$. From these, the second moment is $$\begin{align} \operatorname{E}[Y^2] &= \operatorname{Var}[Y] + \operatorname{E}[Y]^2 \\ &= \pi^2 \sigma_1^2 + (1-\pi)^2 \sigma_2^2 + (\pi \mu_1 + (1-\pi) \mu_2)^2 \\ &= \pi^2 (\mu_1^2 + \sigma_1^2) + (1-\pi)^2 (\mu_2^2 + \sigma_2^2) + 2\pi(1-\pi)\mu_1 \mu_2. \tag{5} \end{align}$$
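A quick numerical sanity check of $(3)$–$(5)$ for the weighted-sum case. The normal components and the parameter values here are assumptions made purely for illustration; any independent $X_1$, $X_2$ with these moments would do:

```python
import numpy as np

rng = np.random.default_rng(0)
pi, mu1, s1, mu2, s2 = 0.3, 1.0, 2.0, -1.0, 0.5  # hypothetical parameters
n = 1_000_000

# Independent components (taken normal here only for convenience)
x1 = rng.normal(mu1, s1, n)
x2 = rng.normal(mu2, s2, n)
y = pi * x1 + (1 - pi) * x2  # weighted sum of outcomes, eq. (1)

# Theoretical values from (3), (4), and (5)
mean_th = pi * mu1 + (1 - pi) * mu2
var_th = pi**2 * s1**2 + (1 - pi)**2 * s2**2
m2_th = var_th + mean_th**2

print(y.mean(), mean_th)     # sample vs. theoretical mean
print(y.var(), var_th)       # sample vs. theoretical variance
print((y**2).mean(), m2_th)  # sample vs. theoretical second moment
```

Each sample statistic should land within Monte Carlo error of its theoretical counterpart.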
Now let us consider the mixture density case. We have
$$\begin{align} \operatorname{E}[Y] &= \int_{y = -\infty}^\infty y \left(\pi f_{X_1}(y) + (1-\pi) f_{X_2}(y)\right) \, dy \\ &= \pi \int_{y=-\infty}^\infty y f_{X_1}(y) \, dy + (1-\pi) \int_{y=-\infty}^\infty y f_{X_2}(y) \, dy \\ &= \pi \operatorname{E}[X_1] + (1-\pi) \operatorname{E}[X_2] \\ &= \pi \mu_1 + (1-\pi) \mu_2. \tag{6} \end{align}$$
So the mean of the mixture distribution is the same as the weighted sum. But what about the second moment? You should already be able to see how it will turn out:
$$\begin{align} \operatorname{E}[Y^2] &= \int_{y = -\infty}^\infty y^2 \left( \pi f_{X_1}(y) + (1-\pi) f_{X_2}(y) \right) \, dy \\ &= \pi \int_{y=-\infty}^\infty y^2 f_{X_1}(y) \, dy + (1-\pi) \int_{y=-\infty}^\infty y^2 f_{X_2}(y) \, dy \\ &= \pi \operatorname{E}[X_1^2] + (1-\pi) \operatorname{E}[X_2^2] \\ &= \pi (\mu_1^2 + \sigma_1^2) + (1-\pi) (\mu_2^2 + \sigma_2^2). \tag{7} \end{align}$$
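The mixture case can be checked the same way: draw a component label with probability $\pi$, then sample from that component. Again, the normal components and parameter values are illustrative assumptions, not part of the derivation:

```python
import numpy as np

rng = np.random.default_rng(1)
pi, mu1, s1, mu2, s2 = 0.3, 1.0, 2.0, -1.0, 0.5  # hypothetical parameters
n = 1_000_000

# Sample from the mixture density (2): choose a component, then draw from it
from_first = rng.random(n) < pi
y = np.where(from_first, rng.normal(mu1, s1, n), rng.normal(mu2, s2, n))

mean_th = pi * mu1 + (1 - pi) * mu2                          # eq. (6)
m2_th = pi * (mu1**2 + s1**2) + (1 - pi) * (mu2**2 + s2**2)  # eq. (7)

print(y.mean(), mean_th)     # mean matches the weighted-sum mean (6)
print((y**2).mean(), m2_th)  # second moment follows (7), not (5)
```

Comparing this against the weighted-sum simulation makes the difference between $(5)$ and $(7)$ concrete: the means agree, the second moments do not.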
Now, you might object: why would we not have $$\operatorname{E}[Y^2] \overset{?}{=} \int_{y=-\infty}^\infty y^2 \left( \pi f_{X_1}(y) + (1-\pi) f_{X_2}(y)\right)^2 \, dy?$$ Remember, $Y$ has density according to $(2)$, so the moments of $Y$ are given by $$\operatorname{E}[Y^k] = \int_{y=-\infty}^\infty y^k f_Y(y) \, dy.$$ You don't square the density. So now it is plainly obvious that $(1)$ and $(2)$ do not mean the same thing.
As an exercise for the reader, compute the variance of the mixture distribution; i.e., $\operatorname{Var}[Y]$ when $(2)$ is true (and again, independence of $X_1$ and $X_2$ holds).
Best Answer
Use the fact that $\color{blue}{\operatorname{var}(X) = \Bbb{E}\left[X^2\right] - \left(\Bbb{E}[X]\right)^2}$. So if $\Bbb{E}[X]=0$, then $\operatorname{var}(X) = \Bbb{E}\left[X^2\right] $.
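As a quick empirical illustration of this identity (my own addition; the standard normal is used here only as a convenient example of a variable with $\Bbb{E}[X]=0$ and $\operatorname{var}(X)=1$):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)  # E[X] = 0, var(X) = 1

# var(X) = E[X^2] - (E[X])^2, so E[X] = 0 and var(X) = 1 force E[X^2] = 1
print(x.mean())       # ≈ 0
print((x**2).mean())  # ≈ 1
```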