Solved – the distribution of the sum of independent normal variables

Tags: normal-distribution, random-variable, self-study

Just need to check the answer for the following question:

Question
Suppose $X$ and $Y$ are two independent standard normal variables:

$X \sim N(0,1)$
$Y \sim N(0,1)$

What is the distribution of $X + Y$?

My Working
$X+Y \sim N(\mu_1 + \mu_2, \sqrt{\sigma_1^2 + \sigma_2^2})$
$X+Y \sim N(0 + 0, \sqrt{1^2 + 1^2})$
$X+Y \sim N(0, \sqrt{2})$

Does this look correct?

Best Answer

To sum up the long series of comments:

Yes, your working is correct. More generally, if $X$ and $Y$ are independent normal random variables with means $\mu_X$, $\mu_Y$ respectively and variances $\sigma_X^2$ and $\sigma_Y^2$ respectively, then $aX+bY$ is a normal random variable with mean $a\mu_X+b\mu_Y$ and variance $a^2\sigma_X^2 + b^2\sigma_Y^2$.
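
As a quick sanity check (my addition, not part of the original thread), here is a minimal Monte Carlo sketch in Python/NumPy of both the specific question and the general rule; the coefficients, means, variances, and sample size below are arbitrary illustrative choices. Note that NumPy's `normal` takes the standard deviation as its second argument, matching the convention used in the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# The question's case: X, Y independent standard normals
x = rng.standard_normal(n)
y = rng.standard_normal(n)
s = x + y
print(s.mean(), s.std())  # close to 0 and sqrt(2) ~= 1.4142

# The general rule: aX + bY for independent normals (illustrative parameters)
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -3.0, 0.5
a, b = 2.0, 4.0
x = rng.normal(mu_x, sigma_x, n)   # second argument is the standard deviation
y = rng.normal(mu_y, sigma_y, n)
z = a * x + b * y
print(z.mean(), a * mu_x + b * mu_y)                   # simulated vs. theoretical mean
print(z.var(), a**2 * sigma_x**2 + b**2 * sigma_y**2)  # simulated vs. theoretical variance
```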

The various comments by whuber, cardinal, and me, and the answer by Tal Galili, are all occasioned by the fact that there are at least three different conventions for interpreting $X \sim N(a,b)$ as a normal random variable. Usually, $a$ is the mean $\mu_X$, but $b$ can have different meanings.

  • $X \sim N(a,b)$ means that the standard deviation of $X$ is $b$.
    (This is the convention you are using).

  • $X \sim N(a,b)$ means that the variance of $X$ is $b$.

  • $X \sim N(a,b)$ means that the variance of $X$ is $\dfrac{1}{b}$.

Fortunately, $X \sim N(0,1)$ (which is what you asked about) means that $X$ is a standard normal random variable in all three of the above conventions!
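
To make the difference between the conventions concrete, here is a small illustrative sketch (my addition, not from the thread) using `scipy.stats.norm`, which is parameterized by the standard deviation through its `scale` argument. Writing the distribution of $X+Y$ under the three conventions as $N(0,\sqrt{2})$, $N(0,2)$, and $N(0,1/2)$ respectively, and converting each second parameter to a standard deviation, gives the same density:

```python
import numpy as np
from scipy.stats import norm

# The distribution of X + Y written under the three conventions:
b_sd, b_var, b_prec = np.sqrt(2), 2.0, 0.5   # sd, variance, precision

grid = np.linspace(-4, 4, 9)
# scipy.stats.norm's `scale` is the standard deviation, so convert each b accordingly.
pdf_sd   = norm.pdf(grid, loc=0, scale=b_sd)
pdf_var  = norm.pdf(grid, loc=0, scale=np.sqrt(b_var))
pdf_prec = norm.pdf(grid, loc=0, scale=np.sqrt(1.0 / b_prec))

print(np.allclose(pdf_sd, pdf_var) and np.allclose(pdf_sd, pdf_prec))  # True
```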