In a reply concerning the interpretation of probability densities, I have argued for the merits of including the differential terms ($dx$ and $dy$ in this case) in the PDF, so let's write
$$f(x,y; \theta,\lambda) = \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(y-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,dx\,dy.$$
In (d), attention is focused on $\hat{\theta}$, which appears to be a multiple of $\sum_i y_i/x_i$, where each $(x_i,y_i)$ is an independent realization of the bivariate random variable described by $f$. To tackle this, let's first find the distribution of $Z=Y/X$; later we will sum a sequence of independent copies of this variable to determine the distribution of $\hat\theta$.
In contemplating the change of variable $(X,Y)\to (X,Z) = (X,Y/X)$, we recognize that the non-negativity of $X$ assures this induces an order-preserving, one-to-one correspondence between $Y$ and $Z$ for each $X$. Therefore all we need to do is change the variable in the integrand, writing $y=x z$:
$$f(x,x z; \theta,\lambda) = \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(x z-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,dx\,d(x z).$$
From the product rule $d(x z) = z\,dx + x\,dz$, and understanding the combination $dx\, d(x z)$ in the sense of differential forms $dx \wedge d(x z)$, we mechanically obtain
$$dx\, d(x z) = dx \wedge d(x z) = dx \wedge (z\,dx + x\,dz) = z\, dx \wedge dx + x\, dx \wedge dz = x\,dx\,dz,$$
since $dx \wedge dx = 0$.
(This is the easy way to obtain the Jacobian determinant of the transformation.)
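Equivalently, one can compute the Jacobian determinant of $(x,z)\mapsto(x,y)=(x,xz)$ directly and recover the same factor:
$$\left|\frac{\partial(x,y)}{\partial(x,z)}\right| = \begin{vmatrix} 1 & 0 \\ z & x \end{vmatrix} = x.$$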
Therefore the distribution of $(X,Z)$ has PDF
$$\eqalign{
g(x,z;\theta,\lambda) = f(x, x z; \theta, \lambda) &= \frac{1}{\lambda x \sqrt{2\pi}} \exp\left(-\frac{(x z-x\theta)^2}{2x^2} - \frac{x}{\lambda}\right)\,x\,dx\,dz \\
&= \frac{1}{\lambda} \exp\left( - \frac{x}{\lambda}\right)\frac{1}{\sqrt{2\pi}}\,dx\ \exp\left(-\frac{( z-\theta)^2}{2}\right) \,dz.
}$$
Because this has separated into a product of a PDF for $X$ and a PDF for $Z$, we see without any further effort that (1) $X$ and $Z$ are independent and (2) $Z$ has a Normal$(\theta,1)$ distribution.
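This factorization is easy to check by simulation. The following sketch (my own illustration, not part of the derivation) draws $X$ from an Exponential distribution with mean $\lambda$, draws $Y \mid X = x$ from a Normal$(x\theta, x^2)$ distribution, and confirms that $Z = Y/X$ behaves like a Normal$(\theta,1)$ variable uncorrelated with $X$:

```python
# Simulation check: X ~ Exponential(mean lambda), Y | X = x ~ Normal(x*theta, sd x).
# Then Z = Y/X should be Normal(theta, 1) and independent of X.
import numpy as np

rng = np.random.default_rng(0)
theta, lam, n = 2.0, 3.0, 100_000

x = rng.exponential(scale=lam, size=n)   # X ~ Exponential with mean lambda
y = rng.normal(loc=x * theta, scale=x)   # Y | X has mean x*theta and sd x
z = y / x

print(z.mean(), z.std())        # close to theta and 1
print(np.corrcoef(x, z)[0, 1])  # close to 0
```

The near-zero correlation is only a necessary condition for independence, but together with the algebraic factorization above it makes the conclusion concrete.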
Finding the joint distribution of $(\hat{\lambda}, \hat\theta)$, which are easily expressed in terms of $X$ and $Z$, is now straightforward.
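For instance, if (as the form of $\hat\theta$ in (d) suggests) the multiple is $1/n$, so that $\hat\theta = \frac{1}{n}\sum_{i=1}^n y_i/x_i = \frac{1}{n}\sum_{i=1}^n Z_i$, then as the mean of $n$ independent Normal$(\theta,1)$ variables,
$$\hat\theta \sim \text{Normal}\!\left(\theta, \frac{1}{n}\right),$$
and, by the independence of $X$ and $Z$, it is independent of any statistic based on the $x_i$ alone, such as $\hat\lambda$.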
Best Answer
I think your likelihood function is wrong. For your Beta distribution, the PDF is
$$f(y)=\frac{\Gamma(1+\theta)}{\Gamma(1)\Gamma(\theta)}(1-y)^{\theta-1}.$$ The likelihood function is therefore
$$L(\theta)=\prod_{i=1}^n\frac{\Gamma(1+\theta)}{\Gamma(1)\Gamma(\theta)}(1-y_i)^{\theta-1} = \left(\frac{\Gamma(1+\theta)}{\Gamma(1)\Gamma(\theta)}\right)^n\left[\prod_{i=1}^n(1-y_i)\right]^{\theta-1}.$$
Now take the log:
$$\ell(\theta)=n\log\left(\frac{\Gamma(1+\theta)}{\Gamma(1)\Gamma(\theta)}\right)+(\theta-1)\sum_{i=1}^n\log(1-y_i).$$
I will stop here and leave the maximization to you.
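For a quick numerical check of this log-likelihood (my own sketch on simulated data, beyond the answer itself): since $\Gamma(1+\theta)=\theta\,\Gamma(\theta)$ and $\Gamma(1)=1$, the coefficient simplifies to $\theta$, so $\ell(\theta)=n\log\theta+(\theta-1)\sum_i\log(1-y_i)$, whose maximizer is $\hat\theta=-n/\sum_i\log(1-y_i)$:

```python
# Sanity check of the log-likelihood above on simulated Beta(1, theta) data.
# Since Gamma(1+theta) = theta * Gamma(theta), the coefficient is just theta, so
# l(theta) = n*log(theta) + (theta-1)*sum(log(1-y_i)), maximized at -n/sum(log(1-y_i)).
import numpy as np

rng = np.random.default_rng(1)
theta_true = 2.5
y = rng.beta(1.0, theta_true, size=50_000)  # Beta(1, theta): pdf theta*(1-y)^(theta-1)

s = np.log1p(-y).sum()           # sum of log(1 - y_i)
theta_hat = -len(y) / s          # closed-form maximizer

def loglik(theta):
    return len(y) * np.log(theta) + (theta - 1) * s

print(theta_hat)                 # close to theta_true
print(loglik(theta_hat) >= max(loglik(theta_hat * 0.9), loglik(theta_hat * 1.1)))
```

Because $\ell''(\theta) = -n/\theta^2 < 0$, the log-likelihood is concave, so this stationary point is indeed the global maximum.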