[Math] Cauchy CDF derivation from standard normal

probability, probability distributions

I'm studying probability from the book "Introduction to Probability" by Joseph K. Blitzstein and Jessica Hwang. Page 294 talks about the Cauchy CDF; it says:

Let $X$ and $Y$ be i.i.d. $N(0,1)$ (standard Normal) and let $T = \frac{X}{Y}$. The distribution of $T$ is called the Cauchy distribution.

The CDF of $T$ is:

$$F_T(t) = P(T \le t) = P\Big(\frac{X}{Y} \le t\Big) = P\Big(\frac{X}{|Y|} \le t\Big)$$

since the r.v.s $\frac{X}{Y}$ and $\frac{X}{|Y|}$ are identically distributed by the symmetry of the standard Normal distribution.
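
To see the claim numerically, here is a small simulation sketch I tried (NumPy; the comparison against $\frac{1}{2} + \frac{\arctan t}{\pi}$ assumes the closed-form Cauchy CDF the book derives a bit further down the same page):

```python
import numpy as np

# Draw X, Y ~ N(0,1) independently, form T = X / Y, and compare the
# empirical CDF of T with 1/2 + arctan(t)/pi at a few values of t.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
t_samples = x / y  # P(Y = 0) = 0, so division by zero does not occur in practice

for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    empirical = np.mean(t_samples <= t)        # estimate of P(T <= t)
    closed_form = 0.5 + np.arctan(t) / np.pi   # standard Cauchy CDF
    print(f"t = {t:5.1f}   empirical = {empirical:.4f}   arctan formula = {closed_form:.4f}")
```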

I have a few questions:

  1. A random variable has some similarity with a function, which we can manipulate. For example, if I have an ordinary (high-school) function $f(x) = x+2$ and another $g(x) = x+2$, then $\frac{f(x)}{g(x)} = 1$. Now take the r.v.s $X$ and $Y$: if they are identical, shouldn't I also get $\frac{X}{Y} = 1$?

  2. Why is $P(\frac{X}{Y} \le t) = P(\frac{X}{|Y|} \le t)$? I know the symmetry of the standard Normal says that if $Z$ has a standard Normal distribution then $-Z$ and $Z$ have the same distribution, but here it is the absolute value of $Y$; what does that mean? When we manipulate an r.v., e.g. $Y = X-1$, we shift the whole support of $X$ by $1$ to get the distribution of $Y$, but what about dividing one r.v. by another r.v.? Do we think of it as a function that has already crystallised into a number? Or how should we think about it? I really have no clue what this is about.

Please give me a detailed, step-by-step answer. I'm a newbie.

Best Answer

  1. You are right that a real-valued random variable is a function from a set $\Omega$ to $\mathbb{R}$. Therefore, if $X(\omega) = Y(\omega) \neq 0$, then $X(\omega)/Y(\omega) = 1$. Here, however, something else is assumed: $X$ and $Y$ are identically distributed with normal distribution $\mathcal{N}(0,1)$, but we do not have $X = Y$! Being identically distributed can be expressed in terms of probabilities: $$\int_A f_X(x)\, dx = P(X(\omega)\in A) = P(Y(\omega)\in A) = \int_A f_Y(x)\, dx$$ for any given subset $A$ of $\mathbb{R}$, and this does not imply that $X/Y$ is the constant random variable $1$. In other words, if two dice have the same probability distribution, there is no reason for the ratio of their results to always equal one. The ratio of the results is a new random variable with its own probability distribution (see the simulation sketch for this point after the list). In the case of two dice, we have $$\frac{X(\omega)}{Y(\omega)} \in \left\lbrace \frac{i}{j} : i, j \in \lbrace 1, 2, \dots, 6\rbrace \right\rbrace,$$ where $X(\omega) \in \left\lbrace 1, 2,\dots, 6\right\rbrace$ and $Y(\omega) \in \left\lbrace 1, 2,\dots, 6\right\rbrace$ denote the results of the two dice.

  2. Since $P(Y=0)=0$, the probability $P(X/Y\leqslant t)$ may be split according to the sign of $Y$: $$\begin{aligned} P(X/Y\leqslant t) &= P(X/Y\leqslant t,\; Y>0) + P(X/Y\leqslant t,\; Y<0) \\ &= P(X/|Y|\leqslant t,\; Y>0) + P(-X/|Y|\leqslant t,\; Y<0)\\ &= P(X/|Y|\leqslant t,\; Y>0) + P(X/|Y|\leqslant t,\; Y<0)\\ &= P(X/|Y|\leqslant t). \end{aligned}$$ The second equality uses $|Y| = Y$ on $\lbrace Y>0\rbrace$ and $|Y| = -Y$ on $\lbrace Y<0\rbrace$. The third equality uses the symmetry of the standard normal distribution together with independence: $(-X, Y)$ has the same joint distribution as $(X, Y)$, so $P(-X/|Y|\leqslant t, Y<0) = P(X/|Y|\leqslant t, Y<0)$. (A simulation sketch after this list checks the resulting identity numerically.)
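
For point 1, a minimal simulation sketch (NumPy; the dice setup and variable names are just illustrative) makes the distinction concrete: the two dice are identically distributed, yet their ratio is a new random variable that equals $1$ only about one time in six.

```python
import numpy as np

# Two dice with the *same* distribution, drawn independently.
# Their ratio is a new random variable, and it is rarely equal to 1.
rng = np.random.default_rng(1)
n = 100_000
x = rng.integers(1, 7, size=n)   # first die, uniform on {1, ..., 6}
y = rng.integers(1, 7, size=n)   # second die, same distribution
ratio = x / y

print("estimated P(X/Y = 1):", np.mean(ratio == 1.0))   # about 1/6, not 1
print("some values the ratio takes:", np.unique(ratio)[:8])
```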
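
For point 2, a similar sketch (again just an illustrative NumPy check) compares the empirical CDFs of $X/Y$ and $X/|Y|$ at a few values of $t$; they agree up to simulation noise, as the symmetry argument predicts.

```python
import numpy as np

# X, Y independent N(0,1): the ratios X/Y and X/|Y| should have the same CDF.
rng = np.random.default_rng(2)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)

for t in [-1.0, 0.0, 0.3, 1.5]:
    p_ratio = np.mean(x / y <= t)          # estimate of P(X/Y <= t)
    p_abs = np.mean(x / np.abs(y) <= t)    # estimate of P(X/|Y| <= t)
    print(f"t = {t:4.1f}   P(X/Y <= t) ≈ {p_ratio:.4f}   P(X/|Y| <= t) ≈ {p_abs:.4f}")
```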
