I think you miscalculated the log-likelihood:
$$L=\prod_{i=1}^{n}(1-\Theta)^{x_i-1}\Theta =(1-\Theta)^{\sum_{i=1}^{n}x_i-n}\cdot \Theta^n$$
Then you take the logarithm, $\ln L$:
$$\ln L=\Big(\sum_{i=1}^{n}x_i-n\Big)\ln(1-\Theta)+n\ln\Theta$$
$$\frac{\partial \ln L}{\partial \Theta}=\frac{n}{\Theta}-\frac{\sum_{i=1}^{n}x_i-n}{1-\Theta}$$
The second derivative is the following:
$$\frac{\partial^2 \ln L}{\partial \Theta^2}=-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}$$
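As a quick sanity check (my addition, not part of the derivation), the analytic derivatives above can be compared against finite differences of $\ln L$ on a simulated geometric sample; the parameter values below are arbitrary choices.

```python
import numpy as np

# Sanity check: compare the analytic first and second derivatives of ln L
# with central finite differences on a simulated geometric sample.
rng = np.random.default_rng(0)
theta = 0.3                            # arbitrary true parameter
x = rng.geometric(theta, size=1000)    # support {1, 2, ...}, matching (1-Θ)^(x-1) Θ
n, s = len(x), x.sum()

def loglik(t):
    return (s - n) * np.log(1 - t) + n * np.log(t)

h = 1e-4
num_d1 = (loglik(theta + h) - loglik(theta - h)) / (2 * h)
num_d2 = (loglik(theta + h) - 2 * loglik(theta) + loglik(theta - h)) / h**2

ana_d1 = n / theta - (s - n) / (1 - theta)
ana_d2 = -n / theta**2 - (s - n) / (1 - theta)**2

print(num_d1, ana_d1)   # the pairs should agree closely
print(num_d2, ana_d2)
```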
For the Fisher information you need $$-E\bigg(\frac{\partial^2 \ln L}{\partial \Theta^2}\bigg)$$
$$E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)$$
In this expression the only random quantity is $\sum_{i=1}^{n}x_i$; all the others are constants, and since $E(\text{const})=\text{const}$, you get the following:
$$E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)=$$
$$=-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\bigg(E\Big(\sum_{i=1}^{n}x_i\Big)-n\bigg)$$
It is well known that if $x_i$ is geometric, then $E(x_i)=\frac{1}{\Theta}$.
Since all the $x_i$ are identically distributed, linearity of expectation gives $E\big(\sum_{i=1}^{n}x_i\big)=n \cdot \frac{1}{\Theta}$.
So the final expression is:
$$=-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\Big(\frac{n}{\Theta}-n\Big)$$
So finally you get the Fisher information:
$$F_{\Theta}=-E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)=n\bigg(\frac{1}{\Theta^2}+\frac{1}{(1-\Theta)\Theta}\bigg)$$
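If you want to double-check this result numerically (a sketch I'm adding, with arbitrary parameter values), recall that the Fisher information also equals the variance of the score evaluated at the true $\Theta$, so the closed form $n\big(\frac{1}{\Theta^2}+\frac{1}{\Theta(1-\Theta)}\big)=\frac{n}{\Theta^2(1-\Theta)}$ can be checked by simulation:

```python
import numpy as np

# Monte Carlo check: Var of the score at the true Θ should match
# n / (Θ² (1-Θ)). Θ = 0.4, n = 50, reps = 20_000 are arbitrary choices.
rng = np.random.default_rng(1)
theta, n, reps = 0.4, 50, 20_000

x = rng.geometric(theta, size=(reps, n))
s = x.sum(axis=1)
score = n / theta - (s - n) / (1 - theta)   # ∂ ln L / ∂Θ at the true Θ

fisher_mc = score.var()
fisher_formula = n / (theta**2 * (1 - theta))
print(fisher_mc, fisher_formula)   # should agree to a few percent
```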
Yes it's correct. Very well done.
This doesn't simplify the work a lot in this case, but here's an interesting result: in the case of $n$ i.i.d. random variables $y_1,\dots,y_n$, you can obtain the Fisher information $i_{\vec y}(\theta)$ for $\vec y$ via $n \cdot i_y(\theta)$, where $y$ is a single observation from your distribution.
Here $\ell(\theta) = \ln\big( \frac{1}{\theta} e^{-y/\theta}\big) = -y/\theta - \ln(\theta) \implies \frac{\partial}{\partial \theta} \ell (\theta) = \frac{y}{\theta^2} - \frac{1}{\theta} \implies \frac{\partial^2}{\partial \theta^2} \ell(\theta) = - \frac{2y}{\theta^3} + \frac{1}{\theta^2}$
\begin{align*}
i_y(\theta) &= - E \left[ \frac{\partial^2}{\partial \theta^2} \ell(\theta) \right] = -E \left[ - \frac{2y}{\theta^3} + \frac{1}{\theta^2} \right] = \dfrac{2 \theta}{\theta^3} - \dfrac{1}{\theta^2} = \dfrac{1}{\theta^2}
\end{align*}
and multiplying by $n$ gives Fisher information $n/\theta^2$.
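The same score-variance check works here too (a numerical sketch of my own, with an arbitrary $\theta$): for the exponential density $\frac{1}{\theta}e^{-y/\theta}$, the variance of the single-observation score at the true $\theta$ should be $i_y(\theta)=1/\theta^2$.

```python
import numpy as np

# Monte Carlo check of i_y(θ) = 1/θ² for the exponential example.
# θ = 2.0 and reps = 200_000 are arbitrary choices.
rng = np.random.default_rng(2)
theta, reps = 2.0, 200_000

y = rng.exponential(scale=theta, size=reps)
score = y / theta**2 - 1 / theta   # ∂ℓ/∂θ for one observation

print(score.var(), 1 / theta**2)   # the two values should agree closely
```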
Best Answer
Hint: note that the given distribution is a known law: a lognormal distribution.
Then the MLE estimates can be obtained immediately and without any calculation, by deriving them from the Gaussian model.
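To illustrate the hint (a sketch with hypothetical parameter values of my choosing): if $x$ is lognormal, then $\ln x$ is Gaussian, so the MLEs of $\mu$ and $\sigma^2$ are simply the sample mean and the $1/n$ sample variance of the log-data.

```python
import numpy as np

# Lognormal MLE via the Gaussian model: work with ln(x) directly.
# μ = 1.5 and σ = 0.7 are hypothetical true parameters for the demo.
rng = np.random.default_rng(3)
mu, sigma = 1.5, 0.7
x = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

logx = np.log(x)
mu_hat = logx.mean()        # Gaussian MLE of μ
sigma2_hat = logx.var()     # Gaussian MLE of σ² (the 1/n variance)
print(mu_hat, sigma2_hat)   # should be close to 1.5 and 0.49
```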