[Math] Maximum-likelihood estimation for continuous random variable with unknown parameter

probability, probability-theory, statistics

Let $X$ be a random variable with unknown parameter $\lambda$ and the following pdf:
$$f(t)=2\lambda t\cdot\mathrm e^{-\lambda t^2}\cdot\textbf{1}_{[0,\infty)}(t)$$
where $\textbf{1}_A$ denotes the indicator function of the set $A$:
$$\textbf{1}_A(x)=\begin{cases}1,&\text{if }x\in A,\\0,&\text{otherwise.}\end{cases}$$
Let $\vec x=(x_1,\ldots,x_n)$ be a sample of $X$. Determine the maximum-likelihood estimator $\widehat{\lambda}$ such that the following holds for the likelihood function $\mathcal L(\vec x;\lambda)$:
$$\forall \lambda>0\;:\;\mathcal L(\vec x;\lambda)\leq \mathcal L(\vec x;\widehat\lambda).$$

For the sake of simplicity, my first thought was to pass to the log-likelihood:
$$\mathcal L(\vec x;\lambda)=\prod\limits_{i=1}^nf(x_i)\implies \ln(\mathcal L(\vec x;\lambda))=\sum\limits_{i=1}^n\ln(f(x_i))$$
This is the point where I'm stuck: I don't know how to compute the derivative needed for the first-order condition $$\frac{\mathrm d \ln(\mathcal L(\vec x;\lambda))}{\mathrm d\lambda}\overset{!}{=}0.$$
Any hints on how to differentiate the sum would be appreciated.

Best Answer

We have $$L(\vec x;\lambda)=\prod_{j=1}^n 2\lambda x_j\,e^{-\lambda x_j^2}\,\textbf{1}_{[0,\infty)}(x_j)=2^n\left(\prod_{j=1}^nx_j\right)\lambda^n\exp\left(-\lambda \lVert \vec x\rVert^2\right)\textbf{1}_{\{x_j\geq 0\;\forall j\}},$$ where $\lVert \vec x\rVert^2=\sum_{j=1}^nx_j^2$. Hence, assuming all $x_j>0$, $$\log L(\vec x;\lambda)=n\log 2+\sum_{j=1}^n\log x_j+n\log\lambda-\lambda\lVert \vec x\rVert^2.$$ Taking the derivative with respect to $\lambda$, we get $$\partial_{\lambda}\log L(\vec x;\lambda)=\frac n{\lambda}-\lVert \vec x\rVert^2.$$ I'll let you finish the computation.
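(For completeness, the finishing step works out as follows: setting this derivative to zero yields
$$\frac n{\widehat\lambda}-\lVert \vec x\rVert^2=0\iff\widehat\lambda=\frac n{\lVert \vec x\rVert^2}=\frac n{\sum_{j=1}^n x_j^2},$$
and since $\partial^2_{\lambda}\log L=-n/\lambda^2<0$ for all $\lambda>0$, this critical point is indeed the maximum.)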

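As a quick numerical sanity check, here is a minimal Python sketch (an illustration, not part of the answer's derivation). It assumes the finishing step gives $\widehat\lambda=n/\sum_{j=1}^n x_j^2$, and it uses the fact that $T^2\sim\operatorname{Exp}(\lambda)$ under this pdf to simulate a sample, then compares the closed-form estimator against a direct numerical maximization of the log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# If T has pdf 2*lam*t*exp(-lam*t^2) on [0, inf), then U = T^2 ~ Exp(rate=lam),
# so a draw of T is the square root of an exponential draw (numpy scale = 1/rate).
lam_true = 2.5
n = 10_000
x = np.sqrt(rng.exponential(scale=1 / lam_true, size=n))

# Closed-form candidate from the derivation above: lambda_hat = n / sum(x_j^2).
lam_closed = n / np.sum(x**2)

# Cross-check: maximize the log-likelihood
#   n*log(2) + sum(log x_j) + n*log(lam) - lam*sum(x_j^2)
# numerically over lam > 0.
def neg_log_lik(lam):
    return -(n * np.log(2) + np.sum(np.log(x)) + n * np.log(lam) - lam * np.sum(x**2))

res = minimize_scalar(neg_log_lik, bounds=(1e-9, 100.0), method="bounded")

print(f"true lambda       : {lam_true}")
print(f"closed-form MLE   : {lam_closed:.4f}")
print(f"numerical argmax  : {res.x:.4f}")
```

On a sample of this size, the two estimates should agree to several decimal places and land close to the true $\lambda$.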