Let $\mathcal{K}_X(t)=\log\left(\Bbb{E}\left(\operatorname{e}^{tX}\right)\right)$ be the cumulant generating function of $X\sim \mathcal{IG}(\mu,\lambda)$.
The pdf of $X\sim \mathcal{IG}(\mu,\lambda)$ is
$$
\begin{align}
f(x;\mu,\lambda) &= \left(\frac{\lambda}{2 \pi x^3}\right)^{1/2} \exp\left({\frac{-\lambda (x-\mu)^2}{2 \mu^2 x}} \right)\qquad x>0\tag 1
\end{align}
$$
that can be rewritten as
$$
\begin{align}
f(x;\mu,\lambda) &= \left(\frac{\lambda}{2 \pi x^3}\right)^{1/2} \exp\left({-\frac{\lambda}{2\mu^2} x+\frac{\lambda}{\mu}-\frac{\lambda}{2x}} \right)\qquad x>0\tag 2
\end{align}
$$
so, using (2), the cumulant generating function is
$$
\begin{align}
\mathcal{K}_X(t)&=\log\left(\Bbb{E}\left(\operatorname{e}^{tX}\right)\right)=\log\left(\int_0^{+\infty}\operatorname{e}^{tx}f(x;\mu,\lambda)\operatorname{d}x\right)\\
&= \frac{\lambda}{\mu}+\log\left(\int_0^{+\infty}\exp\left({-\frac{\lambda}{2\mu^2} x+tx-\frac{\lambda}{2x}} \right)\left(\frac{\lambda}{2 \pi x^3}\right)^{1/2} \operatorname{d}x\right)\\
&=\frac{\lambda}{\mu}+\log\left(\underbrace{\int_0^{+\infty}\exp\left({-\left[\frac{1}{2\mu^2}-\frac{t}{\lambda}\right]\lambda x-\frac{\lambda}{2x}} \right)\left(\frac{\lambda}{2 \pi x^3}\right)^{1/2} \operatorname{d}x}_{\mathcal{I}}\right)
\end{align}
$$
Putting $\beta^2=2\lambda^2\left(\frac{1}{2\mu^2}-\frac{t}{\lambda}\right)$ and substituting $y=\left(\frac{1}{2\mu^2}-\frac{t}{\lambda}\right)\lambda x=\frac{\beta^2}{2\lambda}x$, the integral $\mathcal{I}$ becomes
$$
\mathcal{I}=\frac{\beta}{2\sqrt \pi}\int_0^{+\infty}\exp\left({-y-\frac{\beta^2}{4y}}\right) \frac{\operatorname{d}y}{y^{\frac{1}{2}+1}}
$$
and recalling the integral representation of the modified Bessel function $K_{\nu}(z)$
$$
K_{\nu}(z)=\frac{1}{2}\left(\frac{z}{2}\right)^{\nu}\int_0^{+\infty}\exp\left({-t-\frac{z^2}{4t}}\right) \frac{\operatorname{d}t}{t^{\nu+1}}
$$
we have
$$
\mathcal{I}=\frac{\beta}{2\sqrt \pi}\cdot 2\left(\frac{2}{\beta}\right)^{\frac{1}{2}}K_{\frac{1}{2}}(\beta)=\operatorname{e}^{-\beta}
$$
using the identity $K_\frac{1}{2}(z)=\left(\frac{\pi}{2z}\right)^{\frac{1}{2}} \mathrm{e}^{-z},\, z>0$.
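As a quick numerical sanity check of this Bessel-function step (a sketch only, with an arbitrary value $\beta=1.3$), one can evaluate $\mathcal{I}$ by direct quadrature and compare it with $\operatorname{e}^{-\beta}$:

```python
import math

def integral_I(beta, n_steps=400_000, upper=80.0):
    """Trapezoidal evaluation of
    I = beta/(2*sqrt(pi)) * int_0^inf exp(-y - beta^2/(4y)) y^(-3/2) dy."""
    lo = 1e-9
    h = (upper - lo) / n_steps
    total = 0.0
    for k in range(n_steps + 1):
        y = lo + k * h
        val = math.exp(-y - beta * beta / (4.0 * y)) * y ** -1.5
        total += val if 0 < k < n_steps else 0.5 * val
    return beta / (2.0 * math.sqrt(math.pi)) * total * h

beta = 1.3
print(integral_I(beta), math.exp(-beta))  # the two values should agree closely
```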
Thus we have
$$
\begin{align}
\mathcal{K}_X(t)&=\frac{\lambda}{\mu}+\log (\operatorname{e}^{-\beta})=\frac{\lambda}{\mu}-\left[2\lambda^2\left(\frac{1}{2\mu^2}-\frac{t}{\lambda}\right)\right]^{\frac{1}{2}}
\end{align}
$$
and finally
$$
\mathcal{K}_X(t)=\frac{\lambda}{\mu}\left[1-\left(1-\frac{2\mu^2 t}{\lambda}\right)^{\frac{1}{2}}\right].\tag 3
$$
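Formula (3) can also be verified numerically (a sketch with arbitrary values $\mu=2$, $\lambda=3$, $t=0.1$; note that $t<\frac{\lambda}{2\mu^2}$ is required for the integral to converge):

```python
import math

def cgf_numeric(t, mu, lam, n_steps=400_000, upper=120.0):
    """log E[e^{tX}] for X ~ IG(mu, lam), by trapezoidal integration of e^{tx} f(x)."""
    lo = 1e-9
    h = (upper - lo) / n_steps
    total = 0.0
    for k in range(n_steps + 1):
        x = lo + k * h
        pdf = math.sqrt(lam / (2.0 * math.pi * x ** 3)) \
            * math.exp(-lam * (x - mu) ** 2 / (2.0 * mu * mu * x))
        val = math.exp(t * x) * pdf
        total += val if 0 < k < n_steps else 0.5 * val
    return math.log(total * h)

def cgf_closed(t, mu, lam):
    """Formula (3): K_X(t) = (lam/mu)(1 - sqrt(1 - 2 mu^2 t / lam))."""
    return lam / mu * (1.0 - math.sqrt(1.0 - 2.0 * mu * mu * t / lam))

print(cgf_numeric(0.1, 2.0, 3.0), cgf_closed(0.1, 2.0, 3.0))  # should agree closely
```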
Observe that if $X\sim \mathcal{IG}(\mu,\lambda)$, then $\frac{X}{n}\sim \mathcal{IG}\left(\frac{\mu}{n},\frac{\lambda}{n}\right)$, and that if $X_1,\ldots,X_n$ are iid $\mathcal{IG}(\mu,\lambda)$, then the cumulant generating function of $\bar{X}=\frac{1}{n}\sum_{i=1}^n X_i$ is the sum of the cumulant generating functions of the $X_i/n$:
$$
\mathcal{K}_{\bar{X}}(t)=\sum_{i=1}^n\mathcal{K}_{X_i/n}(t)=\frac{n\lambda}{\mu}\left[1-\left(1-\frac{2\mu^2 t}{n\lambda}\right)^{\frac{1}{2}}\right]\tag 4
$$
which is the cumulant generating function of an inverse Gaussian distribution with mean $\mu$ and shape $n\lambda$:
$$
\bar{X}=\frac{1}{n}\sum_{i=1}^n X_i \sim \mathcal{IG}(\mu,n\lambda).\tag 5
$$
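Result (5) can be checked by simulation (a sketch; the sampler is the standard Michael–Schucany–Haas transformation method, which is not derived in this post, and the values $\mu=2$, $\lambda=3$, $n=5$ are arbitrary):

```python
import math
import random

def sample_ig(mu, lam, rng):
    """One IG(mu, lam) variate via the Michael-Schucany-Haas (1976) method."""
    y = rng.gauss(0.0, 1.0) ** 2
    x = mu + mu * mu * y / (2.0 * lam) \
        - (mu / (2.0 * lam)) * math.sqrt(4.0 * mu * lam * y + (mu * y) ** 2)
    return x if rng.random() <= mu / (mu + x) else mu * mu / x

rng = random.Random(42)
mu, lam, n, reps = 2.0, 3.0, 5, 100_000
means = [sum(sample_ig(mu, lam, rng) for _ in range(n)) / n for _ in range(reps)]
m = sum(means) / reps
v = sum((w - m) ** 2 for w in means) / (reps - 1)
# (5) says Xbar ~ IG(mu, n*lam), whose mean is mu and variance mu^3 / (n*lam)
print(m, v, mu, mu ** 3 / (n * lam))
```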
Recall a useful result for exponential families.
If $X_1,\ldots, X_n$ are independently and identically distributed (iid) with the common density
$$
p(x;\theta, \eta) = h(x, \eta) \operatorname{e}^{\theta T(x) - A(\theta, \eta)}\tag 6
$$
and if the density of $U=U(X_1,\ldots, X_n)$ is given by
$$
q(u;\theta, \eta) = h^*(u, \eta) \operatorname{e}^{n\theta T(u) - A^*(\theta, \eta)}\tag 7
$$
for some $A^*$ and $h^*$, then $U$ is independent of $V = \sum_{i=1}^n T(X_i)-nT(U)$.
Now observe that if $X\sim \mathcal{IG}(\mu,\lambda)$, then $Y=\frac{X}{\mu}\sim \mathcal{IG}\left(\frac{\mu}{\mu},\frac{\lambda}{\mu}\right)=\mathcal{IG}\left(1,\phi\right)$ with $\phi=\frac{\lambda}{\mu}$, and the pdf of $Y$ is
$$
\begin{align}
f(y;1,\phi) &= \left(\frac{\phi}{2 \pi y^3}\right)^{1/2} \exp\left({-\frac{\phi}{2}\left(y+\frac{1}{y}\right)+\phi} \right)\qquad y>0\tag 8
\end{align}
$$
which is of form (6) with $\theta=-\frac{\phi}{2}$ and $T(y)=y+\frac{1}{y}$. Using (5), and putting $Y_i=\frac{X_i}{\mu}$ for $i=1,\ldots,n$, we know that
$$
\bar{Y}=\frac{1}{n}\sum_{i=1}^n Y_i \sim \mathcal{IG}(1,n\phi)
$$
with a density of form (7).
Hence, $\bar{Y}=U(X_1,\ldots,X_n)$ is independent of
$$V= \sum_{i=1}^n \left(Y_i+\frac{1}{Y_i}\right)-n\left(\bar{Y}+\frac{1}{\bar{Y}}\right)=\sum_{i=1}^n \frac{1}{Y_i}-\frac{n}{\bar{Y}}
$$
and finally $\bar{X}=\mu\bar{Y}$ is independent of $Z=\frac{V}{\mu}$, that is
$$
\bar{X}=\frac{1}{n}\sum_{i=1}^n X_i \qquad\text{is independent of }\qquad
Z=\sum_{i=1}^n \left(\frac{1}{X_i}-\frac{1}{\bar{X}}\right)
$$
Note that $\hat{\mu}=\bar{X}$ and $\hat{\lambda}=\frac{n}{Z}$ are the maximum likelihood estimators (MLE) of $\mu$ and $\lambda$.
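The independence claim can be probed by simulation: if $\bar X$ and $Z$ are independent, their sample correlation should be near zero (a necessary consequence only, not a proof; the parameter values are arbitrary, and the sampler is the Michael–Schucany–Haas method, not something derived above):

```python
import math
import random

def sample_ig(mu, lam, rng):
    """One IG(mu, lam) variate via the Michael-Schucany-Haas method."""
    y = rng.gauss(0.0, 1.0) ** 2
    x = mu + mu * mu * y / (2.0 * lam) \
        - (mu / (2.0 * lam)) * math.sqrt(4.0 * mu * lam * y + (mu * y) ** 2)
    return x if rng.random() <= mu / (mu + x) else mu * mu / x

rng = random.Random(7)
mu, lam, n, reps = 2.0, 3.0, 10, 50_000
xbars, zs = [], []
for _ in range(reps):
    xs = [sample_ig(mu, lam, rng) for _ in range(n)]
    xbar = sum(xs) / n
    xbars.append(xbar)
    zs.append(sum(1.0 / x for x in xs) - n / xbar)  # Z = sum(1/X_i) - n/Xbar

mx, mz = sum(xbars) / reps, sum(zs) / reps
cov = sum((a - mx) * (b - mz) for a, b in zip(xbars, zs)) / reps
sx = math.sqrt(sum((a - mx) ** 2 for a in xbars) / reps)
sz = math.sqrt(sum((b - mz) ** 2 for b in zs) / reps)
corr = cov / (sx * sz)
print(corr)  # should be close to 0
```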
Of course, there are other ways of proving these independence results, of which Basu's theorem is the most popular. Simply put, this theorem says: if $T$ is a complete sufficient statistic for a family of probability distributions $\mathcal{P}$, and $S$ is an ancillary statistic, then $T$ and $S$ are independent.
As $X_1,\,\ldots,X_n$ are independent, the joint density $f(x_1,\ldots,x_n)$ is the product of the marginal densities of $X_1,\,\ldots,X_n$. Observe that
$$
\Bbb{E}(g(X_1,\dots,X_n)) = \int_{-\infty}^\infty\cdots \int_{-\infty}^\infty g(x_1,\ldots,x_n)~f(x_1,\ldots,x_n)~\mathrm{d}x_1\cdots \mathrm{d}x_n .
$$
The moment generating function of $Y=\frac{Z}{n}$ is then
$$
\begin{align}
\Bbb{E}\left(\operatorname{e}^{tY}\right)&=\idotsint_{\text{all }x_i>0}\operatorname{e}^{ty}\left(\prod_{i=1}^n f(x_i;\mu,\lambda)\operatorname{d}x_i\right)
\end{align}
$$
Observing that
$$
\begin{align}
\prod_{i=1}^n f(x_i;\mu,\lambda)&=\left[\prod_{i=1}^n \left(\frac{\lambda}{2 \pi x_i^3}\right)^{1/2}\right] \exp\left({-\frac{\lambda}{2\mu^2}\sum_{i=1}^n x_i+\sum_{i=1}^n\frac{\lambda}{\mu}-\sum_{i=1}^n\frac{\lambda}{2x_i}} \right)\\
&=\left(\frac{\lambda}{2 \pi}\right)^{n/2}\left[\prod_{i=1}^n x_i^{-3/2} \right] \exp\left(-\frac{n\lambda}{2\mu^2}\bar{x}+\frac{n\lambda}{\mu}-\frac{n\lambda}{2\bar{x}}- \frac{n\lambda}{2}y\right)\\
&=\left(\frac{\lambda}{2 \pi}\right)^{\frac{n-1}{2}}\bar{x}^{3/2}n^{-1/2}\left[\prod_{i=1}^n x_i^{-3/2} \right]f(\bar{x};\mu,n\lambda)\operatorname{e}^{-\frac{n\lambda}{2}y}
\end{align}
$$
we have
$$
\begin{align}
\Bbb{E}\left(\operatorname{e}^{tY}\right)&=\idotsint_{\text{all }x_i>0}\operatorname{e}^{\left(t-\frac{n\lambda}{2}\right)y}\left(\frac{\lambda}{2 \pi}\right)^{\frac{n-1}{2}}\bar{x}^{3/2}n^{-1/2}f(\bar{x};\mu,n\lambda)\left(\prod_{i=1}^n x_i^{-3/2}\operatorname{d}x_i\right)\\
&=\int_{0}^{\infty}f(\bar{x};\mu,n\lambda)\idotsint_{\bar{x}\text{ is constant}}\operatorname{e}^{\left(t-\frac{n\lambda}{2}\right)y}\left(\frac{\lambda}{2 \pi}\right)^{\frac{n-1}{2}}\bar{x}^{3/2}n^{-1/2}\left(\prod_{i=1}^n x_i^{-3/2}\operatorname{d}x_i\right).
\end{align}
$$
From the relation $\Bbb{E}\left(U\right)=\Bbb{E}_{V}\left(\Bbb{E}\left(\left.U\right|V\right)\right)$, it follows that the conditional moment generating function of $Y$ for fixed $\bar{X}$ can be found by taking the partial derivative with respect to $\bar x$ of the multiple integral:
$$
\begin{align}
\Bbb{E}\left(\left.\operatorname{e}^{tY}\right|\bar{X}\right)
&=\frac{\partial}{\partial\bar x}\underbrace{\idotsint_{\bar{x}\text{ is constant}}\operatorname{e}^{\left(t-\frac{n\lambda}{2}\right)y}\left(\frac{\lambda}{2 \pi}\right)^{\frac{n-1}{2}}\bar{x}^{3/2}n^{-1/2}\left(\prod_{i=1}^n x_i^{-3/2}\operatorname{d}x_i\right)}_{\mathcal{J}(t)}.
\end{align}
$$
To evaluate it, first take $t=0$: the left-hand side is $\Bbb{E}\left(\left.1\right|\bar{X}\right)=1$, so $\frac{\partial}{\partial\bar x}\mathcal{J}(0)=1$, which means
$$
\frac{\partial}{\partial\bar x}\idotsint_{\bar{x}\text{ is constant}}\operatorname{e}^{-\frac{n\lambda}{2}y}\left(\prod_{i=1}^n x_i^{-3/2}\operatorname{d}x_i\right)=n^{1/2}\,\bar{x}^{-3/2}\left(\frac{2 \pi}{\lambda}\right)^{\frac{n-1}{2}}.
$$
Since $t$ enters $\mathcal{J}(t)$ only through $\operatorname{e}^{\left(t-\frac{n\lambda}{2}\right)y}=\operatorname{e}^{-\frac{n}{2}\left(\lambda-\frac{2t}{n}\right)y}$, substituting $\lambda-\frac{2t}{n}$ for $\lambda$ in the last display yields
$$
\Bbb{E}\left(\left.\operatorname{e}^{tY}\right|\bar{X}\right)=\left(\frac{\lambda}{2\pi}\right)^{\frac{n-1}{2}}\bar{x}^{3/2}n^{-1/2}\cdot n^{1/2}\,\bar{x}^{-3/2}\left(\frac{2\pi}{\lambda-\frac{2t}{n}}\right)^{\frac{n-1}{2}}=\left(1-\frac{2t}{n\lambda}\right)^{-\frac{n-1}{2}}
$$
which is the moment generating function of a chi-square variable with $n-1$ degrees of freedom scaled by $\frac{1}{n\lambda}$:
$$Y\sim \frac{1}{n\lambda}\chi^2_{n-1}.$$
Finally, since $Z=nY$, we have
$$
\lambda\sum_{i=1}^n \left(\frac{1}{X_i}-\frac{1}{\bar{X}}\right)\sim \chi^2_{n-1}.
$$
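Since $Y\sim\frac{1}{n\lambda}\chi^2_{n-1}$ means $\lambda Z=n\lambda Y\sim\chi^2_{n-1}$, simulation should reproduce the mean $n-1$ and variance $2(n-1)$ of that chi-square law (a sketch with arbitrary parameters, again using the Michael–Schucany–Haas sampler):

```python
import math
import random

def sample_ig(mu, lam, rng):
    """One IG(mu, lam) variate via the Michael-Schucany-Haas method."""
    y = rng.gauss(0.0, 1.0) ** 2
    x = mu + mu * mu * y / (2.0 * lam) \
        - (mu / (2.0 * lam)) * math.sqrt(4.0 * mu * lam * y + (mu * y) ** 2)
    return x if rng.random() <= mu / (mu + x) else mu * mu / x

rng = random.Random(123)
mu, lam, n, reps = 2.0, 3.0, 5, 100_000
vals = []
for _ in range(reps):
    xs = [sample_ig(mu, lam, rng) for _ in range(n)]
    xbar = sum(xs) / n
    vals.append(lam * (sum(1.0 / x for x in xs) - n / xbar))  # lambda * Z

m = sum(vals) / reps
v = sum((w - m) ** 2 for w in vals) / (reps - 1)
print(m, v)  # should be close to n-1 = 4 and 2(n-1) = 8
```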
Best Answer
So what you actually have is the kernel of a generalised inverse Gaussian distribution (http://en.wikipedia.org/wiki/Generalized_inverse_Gaussian_distribution):
$$ \sqrt\frac{\lambda}{2\pi}e^{\lambda/\mu}\int_0^\infty x^{\frac{-3}{2}}\exp\left[\left(it-\frac{\lambda}{2\mu^2}\right)x-\frac{\lambda}{2x}\right]dx \\ =C\int_0^\infty x^{-\frac{1}{2}-1}\exp\left(-\frac{1}{2}\left(\left(\frac{\lambda}{\mu^2}-2it\right)x+\frac{\lambda}{x}\right)\right)dx $$
Using the normalising constant of a generalised inverse Gaussian with the constants $C=\sqrt\frac{\lambda}{2\pi}e^{\lambda/\mu}$, $~a=\frac{\lambda}{\mu^2}-2it$, $~b=\lambda$, $~p=-\frac{1}{2}$, the expression above is:
$$C\frac{2\mathcal{K}_p(\sqrt{ab})}{(a/b)^{p/2}}\int_0^\infty \frac{(a/b)^{p/2}}{2\mathcal{K}_p(\sqrt{ab})}x^{p-1}\exp\left(-\frac{1}{2}\left(ax+\frac{b}{x}\right)\right)dx\\ =C\frac{2\mathcal{K}_p(\sqrt{ab})}{(a/b)^{p/2}}=\sqrt\frac{\lambda}{2\pi}e^{\lambda/\mu}\,2\mathcal{K}_{1/2}(\sqrt{ab})\left(\frac{a}{b}\right)^{1/4}$$
where $\mathcal{K}_p$ is the modified Bessel function of the second kind, which is invariant to the sign of the index ($\mathcal{K}_{-p}=\mathcal{K}_p$); furthermore, $\mathcal{K}_{1/2}(u)=\sqrt{\frac{\pi}{2u}}\exp(-u)$.
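Carrying this through gives the standard characteristic function $\varphi_X(t)=\exp\left(\frac{\lambda}{\mu}\left(1-\sqrt{1-\frac{2i\mu^2 t}{\lambda}}\right)\right)$, which can be checked against direct numerical integration of $\Bbb{E}\left(\operatorname{e}^{itX}\right)$ (a sketch with arbitrary $\mu=2$, $\lambda=3$, $t=0.5$):

```python
import cmath
import math

def ig_cf_numeric(t, mu, lam, n_steps=400_000, upper=80.0):
    """E[e^{itX}] for X ~ IG(mu, lam), by trapezoidal integration."""
    lo = 1e-9
    h = (upper - lo) / n_steps
    total = 0j
    for k in range(n_steps + 1):
        x = lo + k * h
        pdf = math.sqrt(lam / (2.0 * math.pi * x ** 3)) \
            * math.exp(-lam * (x - mu) ** 2 / (2.0 * mu * mu * x))
        val = cmath.exp(1j * t * x) * pdf
        total += val if 0 < k < n_steps else 0.5 * val
    return total * h

def ig_cf_closed(t, mu, lam):
    """exp((lam/mu)(1 - sqrt(1 - 2 i mu^2 t / lam))), principal branch."""
    return cmath.exp(lam / mu * (1.0 - cmath.sqrt(1.0 - 2j * mu * mu * t / lam)))

print(ig_cf_numeric(0.5, 2.0, 3.0), ig_cf_closed(0.5, 2.0, 3.0))  # should agree closely
```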
I will let you complete the rest.