Other answers to this question claim that the moment generating function (mgf) of the lognormal distribution does not exist. That is a strange claim. The mgf is
$$\DeclareMathOperator{\E}{\mathbb{E}}
M_X(t) = \E e^{tX}.
$$
For the lognormal this expectation is finite only for $t\le 0$. The claim, then, is that the mgf "exists" only when that expectation is finite for $t$ in some open interval around zero. Some important theorems about mgfs do depend on such an assumption, so the mgf of the lognormal distribution may lack properties guaranteed by those theorems, but it can still be useful.
The existence of papers about the lognormal mgf suggests that it does have some useful properties! These papers often discuss the Laplace transform rather than the mgf, but that is only a change of parameter from $t$ to $-t$.
I will come back here with a more complete answer, but for the moment I will just point to some papers. One is On the Laplace transform of the Lognormal distribution by Søren Asmussen, Jens Ledet Jensen and Leonardo Rojas-Nandayapa: there is no exact formula for the mgf, but that paper gives good approximations. Another is Laplace Transforms of Probability Distributions and Their Inversions Are Easy on Logarithmic Scales by A. G. Rossberg.
Finally, there is Accurate Computation of the MGF of the Lognormal Distribution and its Application to Sum of Lognormals by C. Tellambura and D. Senaratne, and the paper Uniform Saddlepoint Approximations and Log-Concave Densities by Jens Ledet Jensen, which uses saddlepoint approximations for the lognormal as an example.
So much for the mgf not existing! I have since discovered that all this (and more) was stated earlier by Cardinal in Existence of the moment generating function and variance. For instance, from what Cardinal proves there, one can conclude that the lognormal distribution does not have exponentially decaying tails (which is one of the properties that follows from the existence of the mgf in an open interval containing zero).
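The heavy-tail point is easy to see numerically. The following is a small sketch (my own check, not from any of the papers above) comparing the lognormal survival function $P(Z>z)=1-\Phi(\log z)$ with an exponential tail $e^{-tz}$; the lognormal tail eventually dominates for every rate $t>0$, which is exactly what the divergence of $\E e^{tZ}$ for $t>0$ reflects.

```python
from math import erfc, exp, log, sqrt

def lognormal_sf(z, mu=0.0, sigma=1.0):
    """P(Z > z) for Z ~ Lognormal(mu, sigma^2): 1 - Phi((log z - mu)/sigma).
    Uses erfc rather than erf so the far tail is computed accurately."""
    return 0.5 * erfc((log(z) - mu) / (sigma * sqrt(2.0)))

# Compare with an exponential tail exp(-t z) for a small rate t.
# Far enough out, the lognormal tail is much heavier.
t = 0.01
for z in (100.0, 1000.0, 10000.0):
    print(z, lognormal_sf(z), exp(-t * z))
```

By $z = 10^4$ the lognormal survival probability exceeds $e^{-tz}$ by many orders of magnitude, even for this small rate $t$.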
When a variable $X$ has a Normal distribution with mean $\mu$ and standard deviation $\sigma \gt 0,$ we say that $Z=e^X$ has a Lognormal$(\mu,\sigma)$ distribution.
The laws of logarithms show that $\mu$ (an additive location parameter for the Normal family of distributions) determines the scale of $Z.$ Because the skewness of a variable does not depend on its scale, we may take $\mu$ to be any convenient value.
Choosing $\mu=0,$ use the Normal density (which is proportional to the exponential of $-x^2/(2\sigma^2)$) to compute the (raw) $k^\text{th}$ moment of $Z$ via the substitution $y = x - k\sigma^2:$
$$\begin{aligned}
\mu_k(\sigma) &=E\left[Z^k\right] = E\left[\exp(X)^k\right] = E\left[\exp(kX)\right]\\
&= \frac{1}{\sigma\sqrt{2\pi}}\int_{\mathbb{R}} \exp\left(-\frac{1}{2\sigma^2}x^2 + kx\right)\,\mathrm{d}x\\
&= \frac{1}{\sigma\sqrt{2\pi}}\exp\left(k^2\sigma^2/2\right)\int_{\mathbb{R}} \exp\left(-\frac{1}{2\sigma^2}x^2 + kx - k^2\sigma^2/2\right)\,\mathrm{d}x\\
&= \frac{1}{\sigma\sqrt{2\pi}}\exp\left(k^2\sigma^2/2\right)\int_{\mathbb{R}} \exp\left(-\frac{1}{2\sigma^2}\left[x - k\sigma^2\right]^2\right)\,\mathrm{d}x\\
&= \exp\left(k^2\sigma^2/2\right)\left[\frac{1}{\sigma\sqrt{2\pi}}\int_{\mathbb{R}} \exp\left(-\frac{1}{2\sigma^2}y^2\right)\,\mathrm{d}y\right]\\
&= \exp\left(k^2\sigma^2/2\right).
\end{aligned}\tag{*}$$
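The closed form $(*)$ can be sanity-checked by direct numerical quadrature of $E[e^{kX}]$ against the Normal density. This is a minimal sketch (my own check, using simple trapezoidal integration):

```python
from math import exp, pi, sqrt

def raw_moment_numeric(k, sigma, n=200001, half_width=40.0):
    """E[Z^k] = E[e^{kX}] for X ~ N(0, sigma^2), by trapezoidal quadrature.
    The integrand exp(-x^2/(2 sigma^2) + k x) peaks at x = k sigma^2,
    so centre the integration grid there."""
    c = k * sigma * sigma
    a, b = c - half_width * sigma, c + half_width * sigma
    h = (b - a) / (n - 1)
    total = 0.0
    for i in range(n):
        x = a + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * exp(-x * x / (2 * sigma**2) + k * x)
    return total * h / (sigma * sqrt(2 * pi))

sigma = 0.5
for k in (1, 2, 3):
    print(k, raw_moment_numeric(k, sigma), exp(k * k * sigma**2 / 2))
```

The numerical and closed-form values agree to high precision for each $k$.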
For $k=1$ this shows the mean is $\exp(\sigma^2/2)$, and from this we may compute the central moments via the Binomial Theorem as
$$\begin{aligned}
\mu^\prime_k(\sigma) &= E\left[(Z - E[Z])^k\right] = E\left[\sum_{i=0}^k \binom{k}{i} Z^i \left(-E[Z]\right)^{k-i}\right] \\
&= \sum_{i=0}^k \binom{k}{i}(-1)^{k-i} \mu_i(\sigma) \mu_1(\sigma)^{k-i}
\end{aligned}.\tag{**}$$
Applying this to $k=2,3$ gives
$$\mu^\prime_2(\sigma) = \mu_0(\sigma)\mu_1(\sigma)^2 - 2\mu_1(\sigma)\mu_1(\sigma) + \mu_2(\sigma) = e^{\sigma^2}\left(e^{\sigma^2}-1\right)$$
and
$$\begin{aligned}\mu^\prime_3(\sigma) &= -\mu_0(\sigma)\mu_1(\sigma)^3 + 3\mu_1(\sigma)\mu_1(\sigma)^2 - 3\mu_2(\sigma)\mu_1(\sigma) + \mu_3(\sigma) \\
&= e^{3\sigma^2/2}\left(2 - 3 e^{\sigma^2} + e^{3\sigma^2}\right) \\
&= e^{3\sigma^2/2}\left(e^{\sigma^2}+2\right)\left(e^{\sigma^2}-1\right)^2.
\end{aligned}$$
By definition, the skewness is
$$\operatorname{Skew}(Z) = \frac{\mu^\prime_3(\sigma)}{\mu^\prime_2(\sigma)^{3/2}} = \frac{e^{3\sigma^2/2}\left(e^{\sigma^2}+2\right)\left(e^{\sigma^2}-1\right)^2}{\left[e^{\sigma^2}\left(e^{\sigma^2}-1\right)\right]^{3/2}} = \left(e^{\sigma^2}+2\right)\sqrt{e^{\sigma^2}-1}.$$
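The whole chain $(*)$ → $(**)$ → skewness can be reproduced in a few lines of Python (a sketch of my own, checking the algebra above against the closed form):

```python
from math import comb, exp, sqrt

def raw_moment(k, sigma):
    # (*): E[Z^k] = exp(k^2 sigma^2 / 2) for Z ~ Lognormal(0, sigma)
    return exp(k * k * sigma * sigma / 2)

def central_moment(k, sigma):
    # (**): binomial expansion of E[(Z - E[Z])^k] in terms of raw moments
    m1 = raw_moment(1, sigma)
    return sum(comb(k, i) * (-1) ** (k - i) * raw_moment(i, sigma) * m1 ** (k - i)
               for i in range(k + 1))

def skewness(sigma):
    return central_moment(3, sigma) / central_moment(2, sigma) ** 1.5

sigma = 1.0
closed_form = (exp(sigma**2) + 2) * sqrt(exp(sigma**2) - 1)
print(skewness(sigma), closed_form)
```

Both routes give the same number, confirming the factorization of $\mu^\prime_3$.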
Comments and Generalizations
Higher standardized central moments (e.g. the kurtosis) are readily computed in the same way: $(*)$ and $(**)$ reduce the problem to polynomial algebra (the variable is $\exp(\sigma^2/2)$).
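As an illustration, here is a sketch (my own check) that computes the kurtosis this way and compares it with the well-known lognormal closed form $e^{4\sigma^2}+2e^{3\sigma^2}+3e^{2\sigma^2}-3$, which the polynomial algebra in $e^{\sigma^2}$ produces:

```python
from math import comb, exp

def central_moment(k, sigma):
    # (*) and (**) combined: every term is a power of q = exp(sigma^2 / 2)
    m = lambda i: exp(i * i * sigma * sigma / 2)   # raw moments from (*)
    return sum(comb(k, i) * (-1) ** (k - i) * m(i) * m(1) ** (k - i)
               for i in range(k + 1))

sigma = 0.8
kurtosis = central_moment(4, sigma) / central_moment(2, sigma) ** 2
s2 = sigma * sigma
print(kurtosis, exp(4 * s2) + 2 * exp(3 * s2) + 3 * exp(2 * s2) - 3)
```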
Because $\mu$ is a scale parameter for the Lognormal family (corresponding to a scale factor of $e^\mu$), it can be introduced into the formulas $(*)$ directly, where its $k^\text{th}$ power $\left(e^\mu\right)^k = e^{k\mu}$ will multiply the result, giving the general formulas
$$\mu_k(\mu,\sigma) = E\left[Z^k\right] = \exp\left(k\mu + k^2\sigma^2/2\right)$$
and then, of course,
$$\mu^\prime_k(\mu,\sigma) = E\left[(Z - E[Z])^k\right] = e^{k\mu}\, \mu^\prime_k(\sigma).$$
Best Answer
If $Y \sim \log N(\mu,\sigma^2)$ then, for $y > 0$, $$f_Y(y) = \frac{1}{y\sqrt{2\pi \sigma^2}}\exp\left[-\frac{1}{2}\left(\frac{\log y-\mu}{\sigma}\right)^2\right]$$
Now,
$$F_Y(y_1)=\int_{0}^{y_1}f_Y(y)\,dy = \int_{0}^{y_1}\frac{1}{y\sqrt{2\pi \sigma^2}}\exp\left[-\frac{1}{2}\left(\frac{\log y-\mu}{\sigma}\right)^2\right]dy$$
Substitute
$$\frac{\log y - \mu}{\sigma} = z \implies \frac{dy}{y\sigma} = dz, \qquad z_1 = \frac{\log y_1 - \mu}{\sigma},$$
noting that $y \to 0^+$ corresponds to $z \to -\infty$:
$$F_Y(y_1)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{z_1}\exp(-z^2/2)\,dz = \Phi(z_1)$$
Oh boy, that's just the standard normal CDF: $F_Y(y_1) = \Phi\left(\frac{\log y_1 - \mu}{\sigma}\right)$. It's easy from here.
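The substitution can be verified numerically. Below is a small sketch (my own check, not part of the original answer): the closed form $\Phi((\log y - \mu)/\sigma)$, computed via `math.erf`, is compared against direct trapezoidal integration of the density $f_Y$:

```python
from math import erf, exp, log, pi, sqrt

def lognormal_cdf(y, mu=0.0, sigma=1.0):
    """F_Y(y) = Phi((log y - mu) / sigma), the result of the z-substitution."""
    if y <= 0:
        return 0.0
    return 0.5 * (1.0 + erf((log(y) - mu) / (sigma * sqrt(2.0))))

def lognormal_cdf_numeric(y, mu=0.0, sigma=1.0, n=100001):
    """Direct trapezoidal integration of the density f_Y on (0, y], as a check.
    The density vanishes at 0, so the left endpoint contributes nothing."""
    h = y / n
    total = 0.0
    for i in range(1, n + 1):
        t = i * h
        w = 0.5 if i == n else 1.0
        total += w * exp(-0.5 * ((log(t) - mu) / sigma) ** 2) / (t * sigma * sqrt(2 * pi))
    return total * h

print(lognormal_cdf(2.0), lognormal_cdf_numeric(2.0))
```

The two agree closely, confirming that the change of variables was carried out correctly.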