Solved – Maximum likelihood estimator, exact distribution


$$f(x,y;\theta,\lambda)=\frac{1}{\lambda\sqrt{2\pi x^2}}\,\exp\left(-\frac{(y-\theta x)^2}{2x^2}-\frac{x}{\lambda}\right),\qquad x>0.$$

This is the joint density of $(X,Y)$.

a) I have to find the marginal density of $X$ and the conditional density of $Y\mid X$.
These are $X\sim \text{Exp}(1/\lambda)$ (mean $\lambda$) and $Y\mid X=x\sim N(\theta x,\,x^2)$.
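As a quick numerical sanity check of this factorization (my own sketch, not part of the original question), one can sample hierarchically: draw $X$ from the exponential marginal, then $Y$ from the conditional normal, and compare $E[Y]$ with $\theta E[X]=\theta\lambda$. The parameter values below are arbitrary illustrations.

```python
import numpy as np

# Sanity check of the factorization X ~ Exp(mean lambda), Y|X=x ~ N(theta*x, x^2).
# theta, lam, n are arbitrary illustrative choices.
rng = np.random.default_rng(0)
theta, lam, n = 1.5, 2.0, 200_000

x = rng.exponential(scale=lam, size=n)   # marginal of X (mean lambda)
y = rng.normal(loc=theta * x, scale=x)   # conditional of Y given X = x

# By iterated expectation, E[Y] = theta * E[X] = theta * lambda = 3.0.
print(x.mean())   # close to lam = 2.0
print(y.mean())   # close to theta * lam = 3.0
```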

b) Then, find the MLEs for $\lambda$ and $\theta$.
For $\lambda$ we have:
$\mathcal{L}=\prod_{i=1}^n \frac{1}{\lambda}\,e^{-x_i/\lambda}=\lambda^{-n}\,e^{-\sum_i x_i/\lambda}$

$\log\mathcal{L}=-n\log\lambda-\frac{\sum_i x_i}{\lambda}$

$\frac{d\log\mathcal{L}}{d\lambda}=-\frac{n}{\lambda}+\frac{\sum_i x_i}{\lambda^2}=0$

$\hat\lambda=\frac{1}{n}\sum_i x_i=\bar{x}$
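One can also confirm numerically that the sample mean maximizes this log-likelihood (a sketch of mine, not part of the original question; the grid search is purely illustrative):

```python
import numpy as np

# Check that the log-likelihood  -n*log(lam) - sum(x)/lam  is maximized
# at the sample mean of exponential draws.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=5_000)

grid = np.linspace(0.5, 5.0, 2_000)                # candidate lambda values
loglik = -x.size * np.log(grid) - x.sum() / grid   # log-likelihood on the grid
lam_star = grid[np.argmax(loglik)]

print(lam_star, x.mean())   # the grid maximizer agrees with sum(x)/n
```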

and for $\theta$ (conditionally on the $x_i$):
$\mathcal{L}=\prod_{i=1}^n \frac{1}{x_i\sqrt{2\pi}}\,e^{-(y_i-\theta x_i)^2/(2x_i^2)}$

$=\frac{(2\pi)^{-n/2}}{\prod_i x_i}\,\exp\left(-\sum_i \frac{(y_i-\theta x_i)^2}{2x_i^2}\right)$

$\log\mathcal{L}=\text{const}-\sum_i \log x_i-\sum_i \frac{y_i^2}{2x_i^2}+\theta\sum_i \frac{y_i}{x_i}-\frac{n\theta^2}{2}$

Differentiating with respect to $\theta$ and setting the derivative to zero:

$\frac{d\log\mathcal{L}}{d\theta}=\sum_i \frac{y_i}{x_i}-n\theta=0 \quad\Rightarrow\quad \hat\theta=\frac{1}{n}\sum_i \frac{y_i}{x_i}$
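A simulation sketch (my addition, with arbitrary parameter values) checking that the estimator $\frac{1}{n}\sum_i y_i/x_i$ recovers $\theta$:

```python
import numpy as np

# Simulate the model and check that hat(theta) = mean(y_i / x_i) is close
# to the true theta.
rng = np.random.default_rng(2)
theta, lam, n = 1.5, 2.0, 100_000

x = rng.exponential(scale=lam, size=n)
y = rng.normal(loc=theta * x, scale=x)
theta_hat = np.mean(y / x)

print(theta_hat)   # close to theta = 1.5
```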

c) Are they unbiased?

$E(\hat{\lambda})=\lambda$, so $\hat\lambda$ is unbiased. For $\hat\theta$, do I have to use the law of iterated expectation?

I have the same question for the variance of $\hat\theta$. We know that for large $n$ the MLE is asymptotically unbiased.

d) Exact distributions of $\hat\lambda$ and $\hat\theta$? Asymptotically both are normal, but is it possible to find the exact distribution of $\hat\theta$? (For $\hat\lambda$ it is a $\text{Gamma}(n,\,n/\lambda)$, with $n/\lambda$ the rate, right?)

I think I have to try the transformation (change-of-variables) law, since $\hat\theta$ involves $y/x$, a function of both variables. I have many doubts here!

Best Answer

In a reply concerning interpreting probability densities I have argued for the merits of including the differential terms ($dx$ and $dy$ in this case) in the PDF, so let's write

$$f(x,y; \theta,\lambda) = \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(y-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,dx\,dy.$$

In (d), attention is focused on $\hat{\theta}$, which appears to be a multiple of $\sum_i y_i/x_i$ where each $(x_i,y_i)$ is an independent realization of the bivariate random variable described by $f$. To tackle this, let's consider the distribution of $Z=Y/X$, and then later sum a sequence of independent versions of this variable in order to determine the distribution of $\hat\theta$.

In contemplating the change of variable $(X,Y)\to (X,Z) = (X,Y/X)$, we recognize that the non-negativity of $X$ assures this induces an order-preserving, one-to-one correspondence between $Y$ and $Z$ for each $X$. Therefore all we need to do is change the variable in the integrand, writing $y=x z$:

$$f(x,x z; \theta,\lambda) = \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(x z-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,dx\,d(x z).$$

From the product rule $d(x z) = z dx + x dz$, and understanding the combination $dx\, d(x z)$ in the sense of differential forms $dx \wedge d(x z)$, we mechanically obtain

$$dx\, d(x z) = dx \wedge d(x z) = dx \wedge (z dx + x dz) = z dx \wedge dx + x dx \wedge dz = x\,dx\,dz.$$

(This is the easy way to obtain the Jacobian determinant of the transformation.)

Therefore the distribution of $(X,Z)$ has PDF

$$\eqalign{ g(x,z;\theta,\lambda) = f(x, x z; \theta, \lambda) &= \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(x z-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,x dx\,dz \\ &= \frac{1}{\lambda} \exp\left( - \frac{x}{\lambda}\right)\frac{1}{\sqrt{2\pi}}\,dx\ \exp\left(-\frac{( z-\theta)^2}{2}\right) \,dz. }$$

Because this has separated into a product of a PDF for $X$ and a PDF for $Z$, we see without any further effort that (1) $X$ and $Z$ are independent and (2) $Z$ has a Normal$(\theta,1)$ distribution.
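A numerical sketch of this conclusion (my addition, with illustrative parameter values): the sample correlation between $X$ and $Z$ should be near zero, and $Z$ should have mean $\theta$ and standard deviation $1$.

```python
import numpy as np

# With Z = Y/X, check that X and Z look independent and Z ~ Normal(theta, 1).
rng = np.random.default_rng(3)
theta, lam, n = 1.5, 2.0, 200_000

x = rng.exponential(scale=lam, size=n)
y = rng.normal(loc=theta * x, scale=x)
z = y / x

print(np.corrcoef(x, z)[0, 1])   # near 0, consistent with independence
print(z.mean(), z.std())         # near theta = 1.5 and 1
```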

Finding the joint distribution of $(\hat{\lambda}, \hat\theta)$, which are easily expressed in terms of $X$ and $Z$, is now straightforward.
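To spell out that last step (my summary, taking the normalized estimators $\hat\lambda=\bar x$ and $\hat\theta=\frac{1}{n}\sum_i y_i/x_i$, which is what the likelihood equations yield): the $Z_i=Y_i/X_i$ are i.i.d. Normal$(\theta,1)$ and jointly independent of the $X_i$, so

$$\hat\theta = \frac{1}{n}\sum_{i=1}^n Z_i \sim \text{Normal}\left(\theta,\ \frac{1}{n}\right), \qquad \hat\lambda = \bar{X} \sim \text{Gamma}\left(n,\ \text{rate}=\frac{n}{\lambda}\right),$$

and the two estimators are exactly (not merely asymptotically) independent.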
