MLE of $\theta$ in $U[0,\theta]$ distribution where the parameter $\theta$ is discrete

maximum likelihood, probability distributions, statistics, uniform distribution

Consider i.i.d. random variables $X_1,X_2,\ldots,X_n$ having the $U[0,\theta]$ distribution: $$f_{\theta}(x)=\frac{\mathbf 1_{[0,\theta]}(x)}{\theta},$$ where the unknown parameter $\theta\in\{1,2,\ldots\}$.

What can I say about the maximum likelihood estimator (MLE) of $\theta$?

Usually the parameter space is $\mathbb R^{+}$, in which case the MLE is known to be $$X_{(n)}=\max\{X_1,X_2,\ldots,X_n\},$$ but here the parameter space is restricted to the natural numbers.

In any case, the likelihood function given the sample $x_1,\ldots,x_n$ is

$$L(\theta)=\frac{\mathbf 1_{[0,\theta]}(x_1,\ldots,x_n)}{\theta^n}=\frac{\mathbf 1_{[x_{(n)},\infty)}(\theta)}{\theta^n},\qquad \theta\in\mathbb N.$$

So I should check the values of $L(\theta)$ for each $\theta\in\{1,2,\ldots\}$; the MLE is the value of $\theta$ at which $L(\theta)$ is maximized. Is this the correct strategy, or can I say that the MLE of $\theta$ is $\hat\theta=\lfloor X_{(n)}\rfloor$? Does the MLE even exist? I am not sure.
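For what it's worth, the "check each $\theta$" strategy is easy to try numerically. Below is a minimal sketch (the true parameter, sample size, and grid cutoff are arbitrary choices of mine, not from the problem):

```python
import math
import random

random.seed(0)

# Hypothetical setup: theta_true and n are arbitrary illustration values.
theta_true = 4
n = 10
x = [random.uniform(0, theta_true) for _ in range(n)]

def likelihood(theta, sample):
    """L(theta) = theta**(-n) if theta >= max(sample), else 0."""
    return theta ** (-len(sample)) if theta >= max(sample) else 0.0

# Check L(theta) over the naturals; a finite grid suffices because L
# vanishes below ceil(max x_i) and is strictly decreasing above it.
mle = max(range(1, 21), key=lambda t: likelihood(t, x))
print(mle, math.ceil(max(x)))
```

In every run the grid maximizer coincides with $\lceil x_{(n)}\rceil$, which hints at the closed form.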

Any hint would be helpful.

Best Answer

Given a sample $x\equiv \{x_i\}_{i=1}^n$, the likelihood is $$ L(\theta\mid x)=\theta^{-n}\,\mathbf 1\{\theta\ge M(x),\; m(x)\ge 0\}, $$ where $M(x):=\lceil\max_{1\le i\le n}x_i\rceil$ and $m(x):=\min_{1\le i\le n}x_i$. (Since $\theta$ is an integer, the condition $\theta\ge\max_i x_i$ is equivalent to $\theta\ge M(x)$.) The indicator forces $\hat{\theta}_n(x)\ge M(x)$, because $L=0$ otherwise. On the other hand, taking $\theta$ larger than $M(x)$ only decreases $L$ through the factor $\theta^{-n}$ (assuming $m(x)\ge 0$, so that $L$ is positive at $M(x)$). Thus $\hat{\theta}_n(x)= M(x)$.
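The claim $\hat\theta_n(x)=M(x)=\lceil\max_i x_i\rceil$ can be sanity-checked by brute force. A small sketch (the true $\theta$, sample size, and grid bound are arbitrary assumptions for illustration):

```python
import math
import random

random.seed(1)

# Hypothetical check: theta_true and n are arbitrary; the estimator
# sees only the sample x, not theta_true.
theta_true = 7
n = 25
x = [random.uniform(0, theta_true) for _ in range(n)]

M = math.ceil(max(x))  # the claimed MLE, M(x) = ceil(max_i x_i)

def L(theta):
    # Discrete likelihood: theta**(-n) on {theta >= max(x)}, else 0.
    return theta ** (-n) if theta >= max(x) else 0.0

# Brute-force maximizer over a finite grid of natural numbers.
brute = max(range(1, 100), key=L)
print(M, brute)
```

Because $L$ is zero below $M(x)$ and strictly decreasing in $\theta$ above it, the brute-force maximizer always agrees with $M(x)$.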