Variance of max of $m$ i.i.d. random variables

Tags: extreme-value-analysis, extreme-value-theorem, probability-theory, variance

I'm trying to verify if my analysis is correct or not.

Suppose we have $m$ random variables $x_i$, $i \in \{1, \dots, m\}$, each with $x_i \sim \mathcal{N}(0,\sigma^2)$.

From the extreme value theorem (Fisher–Tippett–Gnedenko) one can state that $Y = \max\limits_{1 \le i \le m} x_i$, suitably normalized, satisfies $\mathcal{P}(Y \leq \epsilon) \to G(\epsilon)$ as $m\to\infty$, if the $x_i$ are i.i.d. and $G(\epsilon)$ is the standard Gumbel distribution function.

My first question is: can we state that $$\text{Var}[Y]= \text{Var}\left[\max_{1 \le i \le m} x_i \right]= \text{Var}[G] = \frac{\pi^2}{6}\,?$$

My second question is: if we have $n$ such variables $Y_1, \dots, Y_n$, all independent with zero mean, can we state
$$\text{Var}\left[\prod_{i=1}^n Y_i\right] = \left(\frac{\pi^2}{6}\right)^n?$$

Thanks.

Update:
There's a final result for the second point at Distribution of the maximum of a large number of normally distributed random variables, but no complete step-by-step derivation.

Best Answer

$\def\dto{\stackrel{\mathrm{d}}{\to}}\def\peq{\mathrel{\phantom{=}}{}}$The answers to questions 1 and 2 are both negative.

For question 1, since the (normalized) maximum $Y_m$ only satisfies $Y_m \dto G$, all that can be said is $\color{blue}{\lim\limits_{m \to \infty}} D(Y_m) = D(G)$, i.e. the equality holds in the sense of a limiting process, not for any finite $m$.
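The limiting value itself is right: a standard Gumbel variable does have variance $\pi^2/6 \approx 1.6449$. A quick Monte Carlo sketch in Python, sampling the Gumbel via its inverse CDF (the seed and sample size are arbitrary choices):

```python
import math
import random

# Inverse-CDF sampling: if U ~ Uniform(0, 1), then -log(-log(U)) ~ Gumbel(0, 1).
random.seed(0)
n = 200_000
samples = [-math.log(-math.log(random.random())) for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / (n - 1)

print(f"sample variance: {var:.4f}")               # close to pi^2 / 6
print(f"pi^2 / 6:        {math.pi ** 2 / 6:.4f}")  # 1.6449
```

With $2 \times 10^5$ draws the sample variance lands within a few hundredths of $\pi^2/6$; the slow part in practice is the convergence $Y_m \dto G$ itself, which for Gaussian maxima is notoriously slow in $m$.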

For question 2, it is generally not true that $$ D\left( \prod_{m = 1}^n Y_m \right) = \prod_{m = 1}^n D(Y_m) $$ for i.i.d. $Y_1, \cdots, Y_n$, especially when $E(Y_1) \neq 0$ and $D(Y_1) > 0$, by the following proposition:

Proposition: If $X$ and $Y$ are independent random variables on the same probability space, then $$ D(XY) - D(X) D(Y) = D(X) (E(Y))^2 + D(Y) (E(X))^2. $$

Proof: Since $X$ and $Y$ are independent,
\begin{gather*} D(XY) = E(X^2 Y^2) - (E(XY))^2 = E(X^2) E(Y^2) - (E(X) E(Y))^2,\\ D(X) D(Y) = \left( E(X^2) - (E(X))^2 \right) \left( E(Y^2) - (E(Y))^2 \right)\\ = E(X^2) E(Y^2) - E(X^2) (E(Y))^2 - E(Y^2) (E(X))^2 + (E(X))^2 (E(Y))^2, \end{gather*}
and
\begin{align*} &\peq D(XY) - D(X) D(Y) = E(X^2) (E(Y))^2 + E(Y^2) (E(X))^2 - 2 (E(X))^2 (E(Y))^2\\ &= \left( E(X^2) - (E(X))^2 \right) (E(Y))^2 + \left( E(Y^2) - (E(Y))^2 \right) (E(X))^2\\ &= D(X) (E(Y))^2 + D(Y) (E(X))^2. \tag*{$\square$} \end{align*}
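The identity uses only first and second moments, so it can be checked exactly in a few lines; the moment values below are arbitrary illustrative choices, not tied to the Gumbel case:

```python
# Check D(XY) - D(X)D(Y) = D(X)(E Y)^2 + D(Y)(E X)^2 from moments alone.
ex, dx = 2.0, 3.0    # E(X), D(X); illustrative values
ey, dy = -1.0, 4.0   # E(Y), D(Y)

ex2 = dx + ex ** 2   # E(X^2) = D(X) + (E X)^2 = 7
ey2 = dy + ey ** 2   # E(Y^2) = 5

# For independent X, Y: E(XY) = E(X)E(Y) and E(X^2 Y^2) = E(X^2)E(Y^2).
d_xy = ex2 * ey2 - (ex * ey) ** 2      # D(XY) = 35 - 4 = 31

lhs = d_xy - dx * dy                   # 31 - 12 = 19
rhs = dx * ey ** 2 + dy * ex ** 2      # 3 + 16 = 19
print(lhs, rhs)  # 19.0 19.0
```

Note that $D(XY) = 31$ here, well above $D(X)D(Y) = 12$, which is the gap the proposition quantifies.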

Now it can be proved by induction on $n$, using the above proposition, that $$ D\left( \prod_{m = 1}^n Y_m \right) > \prod_{m = 1}^n D(Y_m) > 0 $$ whenever $E(Y_1) \neq 0$ and $D(Y_1) > 0$ — which is the case here, since a standard Gumbel variable has mean $\gamma \neq 0$.
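The induction can be traced numerically for the Gumbel case: taking i.i.d. factors with the standard Gumbel's moments (mean $\gamma \approx 0.5772$, variance $\pi^2/6$) and repeatedly applying the proposition, the product's variance pulls strictly ahead of $(\pi^2/6)^n$. A sketch:

```python
import math

gamma = 0.5772156649015329   # Euler-Mascheroni constant = mean of Gumbel(0, 1)
var_y = math.pi ** 2 / 6     # variance of Gumbel(0, 1)

# Track mean and variance of the running product P = Y_1 * ... * Y_n, using
# the proposition: D(P*Y) = D(P)D(Y) + D(P)(E Y)^2 + D(Y)(E P)^2.
mean_p, var_p = gamma, var_y     # n = 1
for n in range(2, 6):
    var_p = var_p * var_y + var_p * gamma ** 2 + var_y * mean_p ** 2
    mean_p *= gamma              # E(P*Y) = E(P)E(Y) by independence
    print(f"n={n}: D(prod) = {var_p:.4f} > (pi^2/6)^n = {var_y ** n:.4f}")
```

Each step adds the two strictly positive correction terms from the proposition, which is exactly why the product's variance exceeds $(\pi^2/6)^n$ rather than equaling it.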
