Expectation of sum of random variables

expected value, statistics, variance

I'm having trouble proving the following lemma for my statistics course:

Let $X_1,\dots,X_n$ be a random sample from a distribution $P$ on $\mathbb{R}$, let $X \sim P$, and let $g$ be a measurable function such that $\mathbb{E}\,g(X)$ and $\mathrm{var}\,g(X)$ exist. Then

$\mathbb{E}\left(\sum_{i=1}^n g(X_i)\right)=n\cdot\mathbb{E}\,g(X)$

$\mathrm{var}\left(\sum_{i=1}^n g(X_i)\right)=n\cdot\mathrm{var}\,g(X)$

I have a vague idea of the proof and know that it is closely related to the identity $\sum_{i=1}^{n} X_i = n\bar{X}$ and, correspondingly, to the random variables being i.i.d.

Best Answer

The first is simply linearity of expectation applied to the $g(X_i)$; no independence is needed for this step. Since each $X_i$ has the same distribution as $X$, we get $\mathbb{E}[\sum_{i=1}^n g(X_i)] = \sum_{i=1}^n \mathbb{E}[g(X_i)] = n\cdot\mathbb{E}[g(X)]$.
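
As a quick numerical sanity check of this identity, here is a minimal Monte Carlo sketch. The choices $g(x)=x^2$ and $X_i \sim N(0,1)$ (for which $\mathbb{E}[g(X)] = 1$) are my own illustrative assumptions, not part of the question:

```python
import numpy as np

# Monte Carlo check of E[sum_i g(X_i)] = n * E[g(X)].
# Assumed example: g(x) = x**2 and X_i ~ N(0, 1), so E[g(X)] = 1.
rng = np.random.default_rng(0)
n, reps = 5, 200_000

samples = rng.standard_normal((reps, n))   # reps independent samples of size n
lhs = (samples ** 2).sum(axis=1).mean()    # estimate of E[sum_i g(X_i)]
rhs = n * 1.0                              # n * E[g(X)] for this choice of g and P

print(lhs, rhs)   # both should be close to 5
```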

Assuming the $X_i$ are independent, the $g(X_i)$ are independent as well, so $\mathbb{E}[g(X_i)g(X_j)]=\mathbb{E}[g(X_i)]\,\mathbb{E}[g(X_j)]=(\mathbb{E}[g(X)])^2$ whenever $j \neq i$. Hence:

$$\begin{aligned}
\mathrm{var}\Bigl(\sum_{i=1}^n g(X_i)\Bigr)
&= \mathbb{E}\Bigl[\Bigl(\sum_{i=1}^n g(X_i)\Bigr)^2\Bigr] - \Bigl(\mathbb{E}\Bigl[\sum_{i=1}^n g(X_i)\Bigr]\Bigr)^2 \\
&= \sum_{i=1}^n \mathbb{E}[g(X_i)^2] + \sum_{i=1}^n \sum_{j\neq i} \mathbb{E}[g(X_i)g(X_j)] - \bigl(n\,\mathbb{E}[g(X)]\bigr)^2 \\
&= n\,\mathbb{E}[g(X)^2] + n(n-1)\bigl(\mathbb{E}[g(X)]\bigr)^2 - n^2\bigl(\mathbb{E}[g(X)]\bigr)^2 \\
&= n\,\mathbb{E}[g(X)^2] - n\bigl(\mathbb{E}[g(X)]\bigr)^2 \\
&= n\cdot\mathrm{var}(g(X))
\end{aligned}$$
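
The same simulation idea can be used to check the variance identity. Again the choices $g(x)=x^2$ and $X_i \sim N(0,1)$ are illustrative assumptions; for them $\mathrm{var}(g(X)) = \mathbb{E}[X^4] - (\mathbb{E}[X^2])^2 = 3 - 1 = 2$:

```python
import numpy as np

# Monte Carlo check of var(sum_i g(X_i)) = n * var(g(X)).
# Assumed example: g(x) = x**2 and X_i ~ N(0, 1), so var(g(X)) = 2.
rng = np.random.default_rng(1)
n, reps = 5, 500_000

samples = rng.standard_normal((reps, n))
sums = (samples ** 2).sum(axis=1)          # one realisation of sum_i g(X_i) per row

print(sums.var(), n * 2.0)                 # both should be close to 10
```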
