You have $X_1,\ldots,X_{50} \sim\operatorname{i.i.d. Poisson}(\lambda),$ so $\Pr(X_1=0) = e^{-\lambda}.$
The number of successes in $50$ trials with probability $e^{-\lambda}$ of success on each trial has a binomial distribution. So one has
\begin{align}
L(\lambda) & = \Pr(\text{exactly 20 such ``successes'' in 50 trials}) \\[10pt]
& = \binom{50}{20} \left( e^{-\lambda} \right)^{20} \left( 1 - e^{-\lambda}\right)^{30}.
\end{align}
We seek the value of $\lambda$ that maximizes this. Letting $p = e^{-\lambda},$ we can find the value of $p$ that maximizes the expression above by writing
\begin{align}
\log L(\lambda) & = \text{constant} + 20 \log p + 30\log(1-p) \\[10pt]
\frac d{dp} \log L(\lambda) & = \frac{20} p - \frac{30}{1-p} = \frac{50\big((20/50) - p\big)}{p(1-p)} =\begin{cases} >0 & \text{if } 0 < p < 20/50, \\ <0 & \text{if } 20/50 < p < 1. \end{cases}
\end{align}
Hence $\log L$ increases on $(0, 20/50)$ and decreases on $(20/50, 1)$, so the maximum is attained at $\hat p = 20/50 = 2/5,$ i.e. $\hat\lambda = -\log \hat p = \log(5/2).$
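As a quick numerical sanity check, here is a small Python sketch (illustrative only; the counts 20 and 50 are the ones given in the problem) that maximizes the likelihood above and recovers $\hat\lambda = \log(5/2) \approx 0.916$:

```python
# Numerical sanity check: maximize the likelihood above over lambda.
# The counts (20 zeros out of 50) are the ones given in the problem.
import numpy as np
from scipy.optimize import minimize_scalar

n, zeros = 50, 20

def neg_log_lik(lam):
    p = np.exp(-lam)  # P(X = 0) under Poisson(lambda)
    return -(zeros * np.log(p) + (n - zeros) * np.log(1 - p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10), method="bounded")
print(res.x, np.log(5 / 2))  # both ≈ 0.9163
```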
First, rewrite the density with the new parametrization:
$$f(y|\theta)=\frac{ky^{k-1}}{\theta}e^{-\frac{y^k}{\theta}}$$
Calculate the likelihood:
$$L(\theta)\propto \theta^{-n}e^{-\frac{\sum_i y_i^k}{\theta}}$$
Proceeding with the calculation, you find that the score function (the derivative of the log-likelihood with respect to $\theta$) is
$$l^*=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_i y_i^k$$
Setting $l^* = 0$ and solving gives
$$T=\hat{\theta}_{ML}=\frac{\sum_i y_i^k}{n}$$
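A short Python sketch of this estimator (the values of $k$, $\theta$, and $n$ below are invented for illustration): since the derivation treats $k$ as known, we can simulate $Y$ by drawing $W \sim \operatorname{Exp}(1/\theta)$ and setting $Y = W^{1/k}$.

```python
# Illustrative sketch: computing the closed-form MLE T = (1/n) * sum(y_i^k)
# from a simulated sample; k, theta, and n below are invented values.
import numpy as np

rng = np.random.default_rng(0)
k, theta, n = 2.0, 3.0, 200

# If W ~ Exp(rate 1/theta), then Y = W**(1/k) has the density above.
y = rng.exponential(scale=theta, size=n) ** (1 / k)

T = np.mean(y**k)  # the MLE derived above
print(T)           # should be close to theta = 3.0
```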
To show that $\mathbb{E}[T]=\theta$, let's rewrite the score function as
$$l^*=-\frac{n}{\theta}+\frac{nT}{\theta^2}$$
Now, simply recalling the first Bartlett identity,
$$\mathbb{E}[l^*]=0$$
you get
$$\frac{n}{\theta}=\frac{n\mathbb{E}[T]}{\theta^2}$$
that is also
$$\mathbb{E}[T]=\theta$$
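As a quick hypothetical check of the first Bartlett identity (and hence of unbiasedness), one can simulate the score at the true $\theta$; the parameter values below are arbitrary:

```python
# Hypothetical check of the first Bartlett identity E[l*] = 0 at the
# true theta (and hence of E[T] = theta); parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 3.0, 50, 20_000

w = rng.exponential(scale=theta, size=(reps, n))  # W = Y^k ~ Exp(mean theta)
score = -n / theta + w.sum(axis=1) / theta**2     # l* per replication
print(score.mean())  # ≈ 0
```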
To calculate its variance, use the second Bartlett identity:
$$\mathbb{E}[l^{**}]=-\mathbb{E}[(l^*)^2]$$
This identity leads to
$$\mathbb{V}\Bigg[\frac{nT}{\theta^2}-\frac{n}{\theta}\Bigg]=-\mathbb{E}\Bigg[\frac{n}{\theta^2}-\frac{2nT}{\theta^3}\Bigg]$$
that is
$$\frac{n^2}{\theta^4}\mathbb{V}[T]=\frac{n}{\theta^2}$$
$$\mathbb{V}[T]=\frac{\theta^2}{n}$$
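A Monte Carlo sketch checking both moments (the parameter values and replication count are arbitrary choices, not from the original question):

```python
# Monte Carlo sketch checking E[T] = theta and V[T] = theta^2/n;
# parameter values and replication count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 50, 20_000

# Work directly with W = Y^k ~ Exp(mean theta); T is the sample mean of W.
w = rng.exponential(scale=theta, size=(reps, n))
T = w.mean(axis=1)

print(T.mean(), theta)        # both ≈ 3.0
print(T.var(), theta**2 / n)  # both ≈ 0.18
```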
An alternative method to calculate the expectation and variance of $T$:
Simply transforming
$$W=Y^k$$
you get that $W\sim \operatorname{Exp}\big(\tfrac{1}{\theta}\big)$ (rate $\tfrac{1}{\theta}$, mean $\theta$), thus
$$T=\frac{1}{n}\sum_i W_i\sim \operatorname{Gamma}\Big(n,\frac{n}{\theta}\Big)$$
with shape $n$ and rate $n/\theta$, and immediately you get
$$\mathbb{E}[T]=\frac{n}{\frac{n}{\theta}}=\theta$$
$$\mathbb{V}[T]=\frac{n}{\Big(\frac{n}{\theta}\Big)^2}=\frac{\theta^2}{n}$$
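To check the full distributional claim rather than just the moments, here is a hypothetical comparison of simulated $T$ values against the $\operatorname{Gamma}(n, n/\theta)$ law via a Kolmogorov–Smirnov test (note that scipy parametrizes the Gamma by a scale, so scale $= \theta/n$):

```python
# Sketch comparing the simulated distribution of T with Gamma(n, rate n/theta);
# scipy's gamma uses a scale parameter, so scale = theta / n.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 50, 20_000

T = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
print(stats.kstest(T, stats.gamma(a=n, scale=theta / n).cdf))
# a large p-value is consistent with T ~ Gamma(n, n/theta)
```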
Note that $ y_i \sim N(\theta_0 + \theta_1 x_i^2 + \theta_2 e^{x_i}, \sigma^2) $, so the likelihood function is $$ L(y_1,\ldots,y_n \mid \theta_0, \theta_1, \theta_2) = \frac{1}{(2 \pi \sigma^2)^{n/2}}\exp\left\{- \frac{\sum_{i=1}^n(y_i - \theta_0 - \theta_1 x_i^2 - \theta_2 e^{x_i})^2}{2 \sigma^2}\right\}. $$ Maximizing $ L(y_1,\ldots,y_n \mid \theta_0, \theta_1, \theta_2) $ is equivalent to minimizing $ \sum_{i=1}^n(y_i - \theta_0 - \theta_1 x_i^2 - \theta_2 e^{x_i})^2 $, which is the usual ordinary least squares problem and has the solution $$ \hat{\theta} = (X'X)^{-1}X'y, $$ where $X$ is the $n \times 3$ design matrix whose $i$th row is $ (1, x_i^2, e^{x_i}) $.
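A minimal sketch of this closed-form solution in Python (the data-generating values below are made up for illustration):

```python
# Minimal sketch of the closed-form OLS solution; the design matrix has
# rows (1, x_i^2, e^{x_i}). The data-generating values are made up.
import numpy as np

rng = np.random.default_rng(3)
n = 100
x = rng.uniform(-1, 1, size=n)
theta_true = np.array([1.0, 2.0, -0.5])

X = np.column_stack([np.ones(n), x**2, np.exp(x)])
y = X @ theta_true + rng.normal(scale=0.1, size=n)

theta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # (X'X)^{-1} X'y
print(theta_hat)  # ≈ [1.0, 2.0, -0.5]
```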