[Math] Scaling a uniform distribution – Probability

statistics

I just have a simple question on scaling a uniform distribution.
I know that the uniform distribution has probability density $1/(b-a)$ on the interval $[a,b]$.

My textbook says that we can rescale the distribution so that it lies on $(0,1)$ with constant density $1$, as follows:

Suppose $X$ is a random variable. Let $U=(X-a)/(b-a)$, so $X=a+(b-a)U$. Thus the expected value $E(X)=E(a+(b-a)U)$, which equals $(a+b)/2$.

I don't understand why we subtract $a$ from $X$ and then divide by $b-a$. The intuition just doesn't make sense to me.

- How does this make the distribution uniform on $(0,1)$ with density $1$, instead of the original definition on $(a,b)$ with density $1/(b-a)$?

- Also, what is the mathematically correct way to derive $E(U)$?

Best Answer

We have $X$ which is a random variable of uniform distribution on $[a,b]$. Its expected value is the midpoint of the interval, $\frac{a+b}2$ (you can also verify it by the integral $\int_a^bx\cdot\frac1{b-a}dx$).
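For completeness, that integral works out as
$$E(X)=\int_a^b x\cdot\frac1{b-a}\,dx=\frac1{b-a}\cdot\frac{b^2-a^2}2=\frac{a+b}2\ .$$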

Now, in any experiment $X$ takes a concrete value, and this value lies in $[a,b]$ with probability $1$. How do we move it to $[0,1]$? First shift it to $[0,b-a]$ by subtracting $a$, then divide by $b-a$ so that the right endpoint becomes $1$.
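To spell out why this lands on $[0,1]$ with density $1$: for $0\le u\le1$,
$$P\left(\frac{X-a}{b-a}\le u\right)=P\bigl(X\le a+(b-a)u\bigr)=\frac{a+(b-a)u-a}{b-a}=u\ ,$$
so the CDF of the scaled variable is $u$ on $[0,1]$, and its density is the derivative of that, which is $1$.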

So, this $U:=\frac{X-a}{b-a}$ is another random variable, with uniform distribution on $[0,1]$, and $E(U)=1/2$, which can also be calculated as: $$E\left(\frac{X-a}{b-a}\right)=\frac{E(X)-a}{b-a}\ .$$ (If you are not convinced, write this as the integral $\int\frac{x-a}{b-a}\,f_X(x)\,dx$ and remember that $a,b$ are constants.)
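Plugging in $E(X)=\frac{a+b}2$ gives the value explicitly:
$$E(U)=\frac{E(X)-a}{b-a}=\frac{\frac{a+b}2-a}{b-a}=\frac{\frac{b-a}2}{b-a}=\frac12\ .$$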