[Math] How does a uniform distribution have a mean?

probability distributions, random, random variables, statistics, uniform distribution

I believe a continuous uniform distribution over an interval $[a,b]$ to be the probability distribution describing an equally likely chance for any outcome on $[a,b]$. Other than by mathematical definition, how can I see that the avg/mean/expected value is $\frac{a+b}{2}$? Isn't the mean an indicator of likelihood? Or is it just that 'expected value' is poor terminology for the uniform distribution?

For example: if I had a probability distribution shaped like an upward parabola (over a relatively large interval, with the vertex at the center of the distribution), then the mean would be dead center of the interval but a very inaccurate indicator of what value you could expect to see during an experiment. Is this right?

I'm looking for some intuition here, because it doesn't make sense to me that some process can be completely random yet have an average value, especially when I think of an extremely large $|b-a|$.

Best Answer

Let's take the example of rolling a die. This has a uniform distribution over the discrete set $\{1,2,3,4,5,6\}$. The mean then is the average value (not the most frequent value) you expect to get over many rolls:

$$\frac{1+2+3+4+5+6}{6} = \frac{21}{6} = \frac{7}{2} = 3.5$$

You never roll a $3.5$, but it is the "expected value".
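If it helps to see this numerically, here is a small simulation sketch (using numpy; the seed and number of rolls are arbitrary illustrative choices). The sample average of many rolls settles near $3.5$, even though no individual roll ever equals $3.5$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many fair die rolls: a uniform distribution on {1, ..., 6}.
rolls = rng.integers(1, 7, size=100_000)

# No single roll is 3.5, but the long-run average settles near 3.5.
print(rolls.mean())                          # ~3.5
print(np.bincount(rolls)[1:] / len(rolls))   # each face occurs ~1/6 of the time
```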

Moving to the analogous continuous case over the interval $[1,6]$, the mean is again the midpoint of the interval:

$$\frac{1+6}{2} = \frac{7}{2} = 3.5$$
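The same long-run behaviour shows up in a simulation of the continuous case, and it also speaks to the worry about an extremely large $|b-a|$: individual draws are completely unpredictable, but deviations above and below the midpoint cancel out, so the sample mean still hovers near $\frac{a+b}{2}$. A minimal sketch, again with arbitrary seed, sample size, and endpoints:

```python
import numpy as np

rng = np.random.default_rng(0)

# Continuous uniform on [1, 6]: the sample mean settles near (1 + 6) / 2 = 3.5.
x = rng.uniform(1, 6, size=100_000)
print(x.mean())                # ~3.5

# Even on an extremely wide interval the average is stable: draws far above
# the midpoint are balanced by draws far below it.
a, b = 0.0, 1_000_000.0
y = rng.uniform(a, b, size=100_000)
print(y.mean(), (a + b) / 2)   # sample mean close to 500000.0
```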
