Variance of a weighted sum of Gaussian random variables

normal distribution, probability, probability theory, variance

This problem comes from Section 3.2, page 7, of this paper.
Suppose there are $N$ independent Gaussian random variables $z_1, z_2, \ldots, z_N$, that is,
$$z_i \sim N(\mu _i,\sigma _i^2).$$
Now let
$$z=\frac{1}{S}\sum_{i=1}^N w_iz_i,\qquad S=\sum_{i=1}^N w_i.$$
My question is how to show
\begin{align}
Var(z)&=\frac{1}{S}\sum_{i=1}^N w_i^2(\mu_i^2+\sigma_i^2)-{\bar z}^2, \\
\bar z &= \frac{1}{S}\sum_{i=1}^N w_i\mu_i.
\end{align}

To my understanding, $\operatorname{Var}(z)=\overline{z^2}-{\bar z}^2$, but I am stuck on deriving $\overline{z^2}=\frac{1}{S}\sum_{i=1}^N w_i^2(\mu_i^2+\sigma_i^2)$. Can anyone help out?

Update
I think I abstracted the problem incorrectly. It is in fact an instance of a Gaussian mixture, so the probability density function is as follows:
$$f(z) = \frac{1}{S}\sum_{i=1}^N w_i \cdot N(\mu_i,\sigma_i^2)$$
where $N(\mu_i,\sigma_i^2)=\frac{1}{\sqrt{2\pi}\,\sigma_i}\exp\!\left(-\frac{(z-\mu_i)^2}{2\sigma_i^2}\right)$. Since each component satisfies $\int z^2\,N(\mu_i,\sigma_i^2)\,dz=\mu_i^2+\sigma_i^2$, the definition of the second moment gives
$$\overline{z^2} = \mathbb E_f(z^2)=\int z^2f(z)\,dz= \frac{1}{S}\sum_{i=1}^N w_i(\mu_i^2+\sigma_i^2).$$
This answer formulated a similar problem in a more canonical way.
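
A minimal numerical sanity check of this mixture view (the weights, means, and variances below are made-up illustrative values, not from the paper): sample a component with probability $w_i/S$, draw from that Gaussian, and compare the empirical moments to the formulas above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up illustrative parameters (not from the paper)
w = np.array([1.0, 2.0, 3.0])       # unnormalized weights w_i
mu = np.array([-1.0, 0.5, 2.0])     # component means mu_i
sigma = np.array([0.5, 1.0, 1.5])   # component standard deviations sigma_i
S = w.sum()

# Sample from the mixture f(z) = (1/S) * sum_i w_i * N(mu_i, sigma_i^2):
# pick a component with probability w_i / S, then draw from that Gaussian.
n = 1_000_000
comp = rng.choice(len(w), size=n, p=w / S)
z = rng.normal(mu[comp], sigma[comp])

second_moment = (w * (mu**2 + sigma**2)).sum() / S   # (1/S) sum_i w_i (mu_i^2 + sigma_i^2)
z_bar = (w * mu).sum() / S                           # (1/S) sum_i w_i mu_i

print(np.mean(z**2), second_moment)                  # empirical vs. closed-form second moment
print(np.var(z), second_moment - z_bar**2)           # empirical vs. closed-form variance
```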

Best Answer

$$E(z) = \frac{1}{S}\sum_{i=1}^N w_i E(z_i) = \frac{1}{S}\sum_{i=1}^N w_i \mu_i = \bar z$$
$$\text{Var}(z) = E(z^2) - \bar z^2 \tag{1}$$
where
$$E(z^2) = E\left[\Big(\frac{1}{S}\sum_{i=1}^N w_iz_i \Big)^2\right] = \frac{1}{S^2}\sum_{j=1}^N\sum_{i=1}^N w_iw_jE(z_iz_j).$$
Note that
$$E(z_i^2) = \sigma_i^2 + \mu_i^2,$$
and, under the assumption of independent random variables, for $i \neq j$
$$E(z_iz_j) = E(z_i)E(z_j) = \mu_i\mu_j.$$
This gives us
$$E(z^2) = \frac{1}{S^2}\sum_{i=1}^N w_i^2 (\sigma_i^2 + \mu_i^2) + \frac{1}{S^2}\sum_{i \neq j} w_iw_j \mu_i\mu_j. \tag{2}$$
Substituting $(2)$ into $(1)$ we get
$$\text{Var}(z) = \frac{1}{S^2}\sum_{i=1}^N w_i^2 (\sigma_i^2 + \mu_i^2) + \underbrace{\frac{1}{S^2}\sum_{i \neq j} w_iw_j \mu_i\mu_j}_C - \bar z^2.$$
You need the additional assumption of zero means to say that $C = 0$ and to arrive at your expression.
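
A quick numerical check of $(1)$ and $(2)$ for the weighted average of independent Gaussians (again with made-up illustrative parameters). Note that after substituting $(2)$ into $(1)$, the cross terms cancel against $\bar z^2$, leaving $\text{Var}(z)=\frac{1}{S^2}\sum_i w_i^2\sigma_i^2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up illustrative parameters (not from the paper)
w = np.array([1.0, 2.0, 3.0])
mu = np.array([-1.0, 0.5, 2.0])
sigma = np.array([0.5, 1.0, 1.5])
S = w.sum()
N = len(w)

n = 1_000_000
zi = rng.normal(mu, sigma, size=(n, N))   # independent draws of (z_1, ..., z_N)
z = zi @ w / S                            # z = (1/S) sum_i w_i z_i

# Equation (2): second moment of z
cross = sum(w[i] * w[j] * mu[i] * mu[j] for i in range(N) for j in range(N) if i != j)
Ez2 = ((w**2 * (sigma**2 + mu**2)).sum() + cross) / S**2

# Variance after the cross terms cancel against z_bar^2
var_theory = (w**2 * sigma**2).sum() / S**2

print(np.mean(z**2), Ez2)            # empirical vs. equation (2)
print(np.var(z), var_theory)         # empirical vs. (1/S^2) sum_i w_i^2 sigma_i^2
```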
