Solved – MLE for joint distribution

Tags: distributions, joint distribution, maximum likelihood, probability

I have a joint distribution of $(X,Y)$ where $Y$ is Bernoulli with $P(Y=1)=p=1-P(Y=0)$. The conditional distribution of $X$ given $Y=y$ is Normal with mean $\mu_y$ and variance $\sigma^2_y$: that is, a different mean and variance for each group.

The pair $(X,Y)$ therefore has the joint density (the Bernoulli mass times the conditional Normal density):

$$f(x,y)=p^y(1-p)^{1-y}\frac{1}{\sqrt{2\pi\sigma^2_y}}\exp\left(-\frac{1}{2 \sigma^2_y}(x-\mu_y)^2\right).$$

How do I find the MLEs for this model? The MLE of $p$ should just be the usual Bernoulli MLE, right?

Edit:

I can see now that $y$ is observed, so we can split the sample into the two groups:

$A = \{i \in \{1, \ldots, n\} \mid y_i = 0\}$

$B = \{i \in \{1, \ldots, n\} \mid y_i = 1\}$

Log-likelihood:
$$l(\theta)=l(p,\mu_0,\mu_1,\sigma_0,\sigma_1)=\sum_{i=1}^n\log f(x_i,y_i)=\sum_{i \in A}\log f(x_i,y_i)+\sum_{i \in B}\log f(x_i,y_i)$$
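Expanding each term of the log-likelihood directly from the joint density above makes the differentiation transparent:
$$\log f(x_i,y_i)=y_i\log p+(1-y_i)\log(1-p)-\tfrac{1}{2}\log\!\left(2\pi\sigma^2_{y_i}\right)-\frac{(x_i-\mu_{y_i})^2}{2\sigma^2_{y_i}}.$$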
Differentiate e.g. w.r.t. $\mu_0$:
$$\frac{\partial l(\theta)}{\partial \mu_0}=\frac{1}{\sigma^2_0}\sum_{i \in A}(x_i-\mu_0)\implies \hat{\mu}_0=\frac{1}{n_A}\sum_{i \in A}x_i,$$
where $n_A=|A|$ is the number of observations with $y_i=0$; the estimators $\hat{\mu}_1$, $\hat{\sigma}^2_0$, $\hat{\sigma}^2_1$, and $\hat{p}$ follow in the same way.
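As a numerical sanity check, here is a minimal sketch (parameter values and variable names are made up for illustration) that simulates from this model and computes the closed-form MLEs: the sample proportion for $p$ and the per-group means and (MLE, i.e. divide-by-$n$) variances:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters, chosen only for illustration
p_true, mu0_true, mu1_true, sigma0_true, sigma1_true = 0.3, -1.0, 2.0, 1.0, 0.5
n = 10_000

# Simulate (X, Y): Y ~ Bernoulli(p), X | Y = y ~ N(mu_y, sigma_y^2)
y = rng.binomial(1, p_true, size=n)
x = np.where(y == 1,
             rng.normal(mu1_true, sigma1_true, size=n),
             rng.normal(mu0_true, sigma0_true, size=n))

# Closed-form MLEs: Bernoulli proportion, group means, group variances
# (note: MLE variances divide by the group size, not group size - 1)
p_hat = y.mean()
mu0_hat, mu1_hat = x[y == 0].mean(), x[y == 1].mean()
var0_hat = ((x[y == 0] - mu0_hat) ** 2).mean()
var1_hat = ((x[y == 1] - mu1_hat) ** 2).mean()

print(p_hat, mu0_hat, mu1_hat, var0_hat, var1_hat)
```

With a large sample the printed estimates should sit close to the true values used in the simulation.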

Best Answer

Yes. Everything to the right of $p^y (1 - p)^{1 - y}$ is a constant factor with respect to $p$, so as a function of $p$ the whole expression is proportional to the Bernoulli likelihood.
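Concretely, collecting only the $p$-dependent part of the log-likelihood:
$$l(p)=\sum_{i=1}^n\left[y_i\log p+(1-y_i)\log(1-p)\right]+\text{const},\qquad \frac{\partial l}{\partial p}=\frac{\sum_i y_i}{p}-\frac{n-\sum_i y_i}{1-p}=0\implies \hat{p}=\frac{1}{n}\sum_{i=1}^n y_i,$$
which is exactly the usual Bernoulli MLE, the sample proportion of ones.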
