Solved – Marginal normality and joint normality

Tags: mathematical-statistics, multivariate-analysis, normal-distribution

Let $X$ and $Y$ be two independent standard normally distributed
random variables, $N(0,1)$. Define a new random variable $Z$ by
$$Z = \begin{cases}X & \text{if } XY > 0\\ -X & \text{if } XY < 0\end{cases}$$
Prove that the joint distribution of $Z$ and $Y$ is not
bivariate normal, by showing that $Z$ and $Y$ always have the
same sign.

In fact, I could prove that $Z$ and $Y$ have the same sign: I showed that $Z$ is standard normally distributed, $N(0,1)$, and from the definition of $Z$ it follows that $Z > 0$ exactly when $Y > 0$ and $Z < 0$ exactly when $Y < 0$ (see the case analysis below).
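
Writing out the four cases explicitly (this is just the definition of $Z$ applied in each quadrant of the $XY$-plane):

$$Z = \operatorname{sign}(Y)\,|X|, \qquad \text{since} \qquad \begin{cases} X>0,\ Y>0: & XY>0,\ Z = X > 0\\ X<0,\ Y>0: & XY<0,\ Z = -X > 0\\ X>0,\ Y<0: & XY<0,\ Z = -X < 0\\ X<0,\ Y<0: & XY>0,\ Z = X < 0 \end{cases}$$

So $Z$ carries the sign of $Y$, and since $|Z| = |X|$ with $\operatorname{sign}(Y)$ independent of $|X|$, this also gives the marginal $N(0,1)$ distribution of $Z$.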

But I did not understand the idea behind that proof: why is the joint distribution of $Y$ and $Z$ not bivariate normal if $Y$ and $Z$ always have the same sign?
I know that marginal normality does not imply joint normality; I just want to understand the relation between the signs of the random variables and the bivariate normal distribution.

Best Answer

A bivariate normal distribution, centered anywhere in the $YZ$-plane, has positive density over the entire plane, that is, in all four quadrants (I, II, III, and IV), because its support is all of $\mathbb{R}^2$.

But for $(Y, Z)$ to fall in quadrants II and IV, $Y$ and $Z$ must have opposite signs. Since they never do here, the pair puts zero probability on those two quadrants, so its joint distribution cannot be bivariate normal. (The degenerate case of a bivariate normal concentrated on a line $Z = aY + b$ is ruled out as well: $|Z| = |X|$ is independent of $Y$, so $Z$ cannot be an affine function of $Y$.)
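
As a quick sanity check, here is a minimal simulation sketch (assuming Python with NumPy, which the question does not mention) of this argument: $Z$ looks standard normal marginally, the pair $(Z, Y)$ never lands in quadrants II or IV, while a genuine bivariate normal with the same correlation puts clearly positive probability there.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Reproduce the construction from the question.
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = np.where(x * y > 0, x, -x)

# Z is marginally standard normal (mean ~ 0, sd ~ 1).
print("mean(Z), sd(Z):", z.mean(), z.std())

# Fraction of samples with Z and Y of opposite sign (quadrants II/IV): ~ 0.
print("P(ZY < 0):", np.mean(z * y < 0))

# A genuine bivariate normal with the same correlation would put
# clearly positive probability on quadrants II and IV.
rho = np.corrcoef(z, y)[0, 1]
cov = [[1.0, rho], [rho, 1.0]]
w = rng.multivariate_normal([0.0, 0.0], cov, size=n)
print("bivariate normal P(opposite signs):", np.mean(w[:, 0] * w[:, 1] < 0))
```

The correlation between $Z$ and $Y$ comes out around $2/\pi \approx 0.64$, so the comparison bivariate normal is non-degenerate and assigns a visibly nonzero fraction of samples to the opposite-sign quadrants, unlike $(Z, Y)$.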
