[Math] Maximum Likelihood Estimate of the Uniform Distribution

bayesian · maximum likelihood · probability · uniform distribution

I am trying to find the Maximum Likelihood estimate of the $a$ parameter in the uniform distribution.

For quick background: so far I am familiar with Bayesian models, which multiply the prior probability by the likelihood of the data set and normalize, giving the posterior distribution. In the past, I've seen the following:

$p(\theta)$ is the prior probability distribution

$p(D\mid\theta)$ is the likelihood of the training data set $D$

$p(\theta\mid D)$ is the posterior distribution which is proportional to $[p(\theta) \times p(D\mid\theta)]$

And we use the Beta distribution for the prior, and the Binomial distribution for the likelihood (which is convenient because we use the log likelihood to find the MLE).

But now, we are being asked to find the maximum likelihood of the Uniform distribution.

The distribution formula is: $ \operatorname{Unif}(x\mid a) = \frac{1}{2a}\,I(x \in [-a, a])$

where $I(\text{true}) = 1$ and $I(\text{false}) = 0$

Our data set is $D=\{x_1, \ldots , x_n\}$

My question is this: how do we find the MLE of the parameter $a$?

Thanks in advance

Best Answer

The density for each observation is $\displaystyle f_{X_i}(x) = \begin{cases} 1/(2a) & \text{if } -a\le x\le a, \\ 0 & \text{if } x<-a \text{ or } x>a. \end{cases}$

You didn't say your observations were independent but I will assume that was intended. The joint density is therefore $$ f_{X_1,\ldots,X_n} (x_1,\ldots,x_n) = \begin{cases} 1/(2a)^n & \text{if for every } i\in\{1,\ldots,n\} \text{ we have } -a\le x_i \le a, \\ 0 & \text{otherwise}. \end{cases} $$

The condition that for every $i\in\{1,\ldots,n\}$ we have $-a\le x_i\le a$ is the same as $\min\{x_1,\ldots,x_n\} \ge -a$ and $\max\{x_1,\ldots,x_n\}\le a.$ The condition on the $\min$ is the same as $-\min\{x_1,\ldots,x_n\} \le a.$ So we need $a\ge\max\{x_1,\ldots,x_n\}$ and $a\ge -\min\{x_1,\ldots,x_n\}.$ I leave it as an exercise to show that $$ \Big( a\ge \max\{x_1,\ldots,x_n\} \text{ and } a\ge -\min\{x_1,\ldots,x_n\} \Big) \text{ if and only if } a \ge \max\{|x_1|,\ldots,|x_n|\}. $$

Therefore the likelihood function is $$ L(a) = \begin{cases} 1/(2a)^n & \text{if } a \ge \max\{|x_1|,\ldots,|x_n|\}, \\ 0 & \text{otherwise.} \end{cases} $$ Now notice that $L(a)$ increases as $a$ decreases, until $a$ gets down to $\max\{|x_1|,\ldots,|x_n|\}.$ Therefore the MLE is $$ \hat a = \max\{|x_1|,\ldots,|x_n|\}. $$
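As a numerical sanity check, here is a minimal Python sketch of the argument above (the sample and the names `likelihood`, `a_hat` are illustrative, not from the original post): the likelihood is zero for any $a$ below $\max|x_i|$ and strictly decreasing above it, so the maximizer is exactly $\max|x_i|$.

```python
import random

# Simulated data from Unif(-a, a) with an assumed true a = 3.0.
random.seed(0)
true_a = 3.0
data = [random.uniform(-true_a, true_a) for _ in range(100)]

def likelihood(a, xs):
    """L(a) = 1/(2a)^n if a >= max|x_i|, else 0."""
    n = len(xs)
    return (2.0 * a) ** (-n) if a >= max(abs(x) for x in xs) else 0.0

# The MLE derived above: the smallest feasible value of a.
a_hat = max(abs(x) for x in data)

# Any smaller a makes the data impossible (zero likelihood);
# any larger a shrinks 1/(2a)^n, so a_hat is the maximizer.
assert likelihood(a_hat - 1e-9, data) == 0.0
assert likelihood(a_hat + 0.1, data) < likelihood(a_hat, data)
print(a_hat)
```

Note that $\hat a$ can never exceed the true $a$, since every $|x_i|\le a$; this is the usual downward bias of uniform-distribution MLEs.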