Maximum likelihood estimator for three parameters

maximum likelihood, parameter estimation, regression, statistics

Assume we have $n$ distinct points $x_1,x_2,\dots,x_n\in \Bbb{R}$, and observations which are independent random variables $y_1,y_2,\dots,y_n$ given by
$$
y_i=\theta_0+\theta_1x_i^2+\theta_2e^{x_i}+\varepsilon_i,
$$

where the $\varepsilon_i$ are independent and distributed as $N(0,\sigma^2)$. Derive the maximum likelihood estimators for $\theta_0,\theta_1,\theta_2$.
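
To make the setup concrete, here is a minimal simulation sketch of the model (the $\theta$ and $\sigma$ values are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (made up for this sketch)
theta0, theta1, theta2, sigma = 1.0, 0.5, -0.3, 0.2

n = 100
x = rng.uniform(-1.0, 1.0, size=n)        # n distinct points x_i
eps = rng.normal(0.0, sigma, size=n)      # epsilon_i ~ N(0, sigma^2)

# y_i = theta0 + theta1 * x_i^2 + theta2 * exp(x_i) + epsilon_i
y = theta0 + theta1 * x**2 + theta2 * np.exp(x) + eps
```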

I know how to find the MLE for a known distribution like the normal or Poisson, but for this question I don't know how to find the likelihood function. Should I treat the $y_i$ as normally distributed? And I don't know what the mean and variance would be, since I can't figure out the distribution.

Also, should I treat $\varepsilon_i$ as a variable in its own right, or just as an error term in the estimation? Thanks for any help.

Best Answer

Note that $ y_i \sim N(\theta_0 + \theta_1 x_i^2 + \theta_2 e^{x_i},\, \sigma^2) $, so the likelihood function is
$$
L(\theta_0, \theta_1, \theta_2 \mid y_1,\dots,y_n) = \frac{1}{(2 \pi \sigma^2)^{n/2}}\exp\left\{- \frac{\sum_{i=1}^n\left(y_i - \theta_0 - \theta_1 x_i^2 - \theta_2 e^{x_i}\right)^2}{2 \sigma^2}\right\}.
$$
Maximizing $ L $ is equivalent to minimizing
$$
\sum_{i=1}^n\left(y_i - \theta_0 - \theta_1 x_i^2 - \theta_2 e^{x_i}\right)^2,
$$
which is the usual ordinary least squares problem. Writing $y = (y_1,\dots,y_n)'$ and letting $X$ be the $n \times 3$ design matrix whose $i$th row is $(1, x_i^2, e^{x_i})$, the minimizer solves the normal equations $X'X\hat{\theta} = X'y$, which gives
$$
\hat{\theta} = (X'X)^{-1}X'y.
$$
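
As a quick numerical sanity check (a sketch under assumed parameter values, not part of the derivation), one can simulate data from the model and recover $\hat{\theta}$; `np.linalg.lstsq` solves the same least-squares problem as $(X'X)^{-1}X'y$ but is more stable numerically than forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from the model with illustrative "true" parameters
theta_true = np.array([1.0, 0.5, -0.3])
sigma, n = 0.2, 200
x = rng.uniform(-1.0, 1.0, size=n)
y = (theta_true[0] + theta_true[1] * x**2 + theta_true[2] * np.exp(x)
     + rng.normal(0.0, sigma, size=n))

# Design matrix: i-th row is (1, x_i^2, e^{x_i})
X = np.column_stack([np.ones(n), x**2, np.exp(x)])

# MLE = OLS solution theta_hat = (X'X)^{-1} X' y,
# computed via lstsq rather than an explicit matrix inverse
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)  # close to theta_true for moderate n
```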
