[Math] Finding MLE for a discrete distribution

maximum likelihood

Given i.i.d. discrete random variables $X_1, \ldots, X_n$ with probability mass function:
$$
f(x;\theta) = \begin{cases}
\theta & x= -1 \\
(1-\theta)^2 \theta^x & x = 0, 1, \ldots \\
\end{cases}
$$

Prove that the MLE of $\theta$ is:
$$
\hat{\theta}_n = \frac{2\sum_{i=1}^{n}I_{(X_i=-1)} + \sum_{i=1}^{n}X_i}{2n + \sum_{i=1}^{n}X_i}
$$
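As a quick sanity check (not part of the proof), one can verify numerically that $f$ is a valid pmf, using $\theta + (1-\theta)^2\sum_{x\ge 0}\theta^x = \theta + (1-\theta) = 1$. A minimal Python sketch; the value $\theta = 0.3$ and the truncation at $x = 200$ are arbitrary choices:

```python
# Sanity check that f(x; theta) sums to 1 over x in {-1, 0, 1, ...}.
# theta = 0.3 and the truncation point 200 are arbitrary; the geometric
# tail beyond x = 200 is negligible for theta well below 1.
def pmf(x, theta):
    return theta if x == -1 else (1 - theta) ** 2 * theta ** x

theta = 0.3
total = pmf(-1, theta) + sum(pmf(x, theta) for x in range(200))
assert abs(total - 1.0) < 1e-12
```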

My attempt is as follows:

Let $Y_i = \mathbb{1}_{[X_i=-1]}$ for $i = 1, \ldots, n$. The likelihood function can be set up as:
$$
L(\theta|x) = \prod_{i=1}^{n}f(x_i;\theta)= \theta^{\sum_{i=1}^{n}Y_i}\cdot\theta^{\sum_{i=1}^{n}\left((1-Y_i)X_i\right)}\cdot(1-\theta)^{2\sum_{i=1}^{n}(1-Y_i)}
$$

Then the log-likelihood is:
$$
\begin{align}
l(\theta|x) &= \log\prod_{i=1}^{n}f(x_i;\theta) = \log\left(\theta^{\sum_{i=1}^{n}Y_i}\cdot\theta^{\sum_{i=1}^{n}\left((1-Y_i)X_i\right)}\cdot(1-\theta)^{2\sum_{i=1}^{n}(1-Y_i)}\right)\\
&= \log(\theta) \sum_{i=1}^{n}Y_i + \log(\theta)\sum_{i=1}^{n}\left((1-Y_i)X_i\right) + 2\log(1-\theta)\sum_{i=1}^{n}(1-Y_i)
\end{align}
$$

Maximizing directly by setting $\frac{d}{d\theta}l(\theta|x) = 0$:
$$
\frac{\sum_{i=1}^{n}Y_i}{\theta} + \frac{\sum_{i=1}^{n}(1-Y_i)X_i}{\theta} - \frac{2\sum_{i=1}^{n}(1-Y_i)}{1-\theta} = 0
$$
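The algebra from this point can be checked symbolically (a sketch, assuming SymPy is installed), abbreviating $A = \sum_{i=1}^{n}Y_i$ and $B = \sum_{i=1}^{n}(1-Y_i)X_i$, so that $\sum_{i=1}^{n}(1-Y_i) = n - A$:

```python
# Symbolic check of the score equation (SymPy assumed available).
# A = sum of Y_i, B = sum of (1 - Y_i) X_i, so sum of (1 - Y_i) = n - A.
import sympy as sp

theta, A, B, n = sp.symbols('theta A B n')
score = A / theta + B / theta - 2 * (n - A) / (1 - theta)
sol = sp.solve(sp.Eq(score, 0), theta)

# The equation is linear in theta after clearing denominators, so there is
# a single root; it should match (A + B) / (2n - A + B).
assert len(sol) == 1
assert sp.simplify(sol[0] - (A + B) / (2 * n - A + B)) == 0
```

Re-expanding $A$ and $B$ in terms of the indicators then gives the candidate below.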

Solving for $\theta$, I arrive at the MLE candidate:
$$
\hat{\theta}_n = \frac{\sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]} + \sum_{i=1}^{n}X_i - \sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]}X_i}{2n+\sum_{i=1}^{n}X_i - \sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]} - \sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]}X_i}
$$

This is close to the target solution (I know I still need to verify that it is a global maximum), but I can't tell whether I am making an algebraic mistake or whether the likelihood function is not correctly defined. Any help or hint is appreciated!

Best Answer

I don't think you have made any mistakes, and you are very close to the solution.

Note that $\sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]}X_i=-\sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]}$, since $X_i = -1$ exactly when the indicator equals $1$. Substituting this into your expression turns the numerator into $2\sum_{i=1}^{n}\mathbb{1}_{[X_i=-1]} + \sum_{i=1}^{n}X_i$ and the denominator into $2n + \sum_{i=1}^{n}X_i$, which is the stated solution.
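The closed-form MLE can also be confirmed numerically, by simulating from the pmf and comparing the formula against a brute-force grid maximization of the log-likelihood. A Python sketch; the true value $\theta = 0.4$, the sample size, and the random seed are illustrative choices:

```python
# Numerical sanity check (a sketch, not a proof): simulate data from the pmf,
# then compare the closed-form MLE with a grid search over the log-likelihood.
import math
import random

def sample(theta, rng):
    """Draw one observation: X = -1 w.p. theta; else, conditional on X >= 0,
    P(X = x) = (1 - theta) * theta**x (geometric on {0, 1, ...})."""
    if rng.random() < theta:
        return -1
    x = 0
    while rng.random() < theta:
        x += 1
    return x

rng = random.Random(0)
xs = [sample(0.4, rng) for _ in range(5000)]

n = len(xs)
A = sum(1 for x in xs if x == -1)   # number of observations equal to -1
S = sum(x for x in xs if x >= 0)    # sum over the nonnegative observations

def loglik(t):
    # l(t) = (A + S) log(t) + 2 (n - A) log(1 - t), as in the derivation above
    return (A + S) * math.log(t) + 2 * (n - A) * math.log(1 - t)

closed_form = (2 * A + sum(xs)) / (2 * n + sum(xs))
grid_best = max((i / 10000 for i in range(1, 10000)), key=loglik)
assert abs(closed_form - grid_best) < 1e-3
```

With a fine enough grid, the grid maximizer lands within one grid step of the closed-form estimate, consistent with it being the global maximum on $(0, 1)$.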