[Math] Method of Moments and Maximum Likelihood question

probability, statistics

Suppose that $X_1, X_2, \dots, X_n$ are an i.i.d. random sample from a
Rayleigh distribution with parameter $\theta > 0$ and density
$$f(x \mid \theta) = \frac{x}{\theta^2}\, e^{-\frac{x^2}{2\theta^2}}, \qquad x \ge 0.$$

Find the method of moments estimator of $\theta$, find the maximum likelihood estimator of $\theta$, and find the asymptotic variance of the MLE.

Best Answer

The mean of a Rayleigh distribution is $\theta \sqrt{\frac{\pi}{2}}$; hence, the method of moments estimator of $\theta$ is simply $\bar X \sqrt{\frac{2}{\pi}}$.
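In detail, the method of moments sets the first sample moment equal to the first population moment and solves for $\theta$:

$$\bar X = \mathbb{E}[X] = \theta\sqrt{\frac{\pi}{2}} \quad\Longrightarrow\quad \hat\theta_{\mathrm{MM}} = \bar X\sqrt{\frac{2}{\pi}}.$$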

For the MLE: multiply together $n$ copies of the density (one per data point) to form the likelihood, then maximize it by setting the derivative of the log-likelihood to zero and solving for $\theta$.
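As a sketch of that computation (the log-likelihood has the same maximizer and is easier to differentiate):

$$\ell(\theta) = \sum_{i=1}^n \left( \ln x_i - 2\ln\theta - \frac{x_i^2}{2\theta^2} \right), \qquad \ell'(\theta) = -\frac{2n}{\theta} + \frac{1}{\theta^3}\sum_{i=1}^n x_i^2 = 0 \quad\Longrightarrow\quad \hat\theta_{\mathrm{MLE}} = \sqrt{\frac{1}{2n}\sum_{i=1}^n x_i^2}.$$

For the asymptotic variance, one standard route is the Fisher information: since $\frac{\partial^2}{\partial\theta^2}\ln f(x\mid\theta) = \frac{2}{\theta^2} - \frac{3x^2}{\theta^4}$ and $\mathbb{E}[X^2] = 2\theta^2$ for this distribution,

$$I(\theta) = -\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\ln f(X\mid\theta)\right] = \frac{4}{\theta^2}, \qquad \operatorname{Var}\!\left(\hat\theta_{\mathrm{MLE}}\right) \approx \frac{1}{n\,I(\theta)} = \frac{\theta^2}{4n}.$$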
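If it helps, here is a quick numerical sanity check in Python. NumPy's Rayleigh sampler uses exactly this parametrization, with `scale` playing the role of $\theta$; the true value $\theta = 2$ below is an arbitrary choice for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0                                # true scale parameter (arbitrary demo value)
n = 100_000
x = rng.rayleigh(scale=theta, size=n)      # i.i.d. Rayleigh(theta) sample

theta_mm = x.mean() * np.sqrt(2 / np.pi)   # method of moments: X̄ * sqrt(2/pi)
theta_mle = np.sqrt((x**2).sum() / (2 * n))  # MLE: sqrt(sum(x_i^2) / (2n))

print(theta_mm, theta_mle)                 # both should be close to 2.0
```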
