Show that $\exp(X)$ has a lognormal pdf if $X$ has a normal pdf


Given: $X \sim \mathcal{N}(\mu, \sigma^2)$

Show that $\exp(X)$ has a lognormal distribution.

Attempt:
\begin{align}
&\text{Let } Y = e^X \ \text{ with density } f_Y(y). \\
&P(Y \leq t) = \int_{-\infty}^{t} f_Y(y)\,dy \\
&P(Y \leq t) = P(e^X \leq t) = P(X \leq \ln t) \\
&P(X \leq \ln t) = \int_{-\infty}^{\ln t} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \int_{-\infty}^{t} f_Y(y)\,dy
\end{align}

I am stuck at this point. I know what the final form looks like, but I can't figure out how to get there. I don't believe I actually need to evaluate $\int_{-\infty}^{\ln t} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$, do I?

Any hints?

Best Answer

You've reached

$$ P(Y \leq t) = \int^{\log t}_{-\infty} f_X(x) dx$$

and, as you've said, you don't need to evaluate that integral to get $f_Y(t)$; you just need to differentiate both sides with respect to $t$. By the fundamental theorem of calculus (and the chain rule), you get $$ f_Y(t) = f_X(\log t) \cdot \frac{1}{t},$$ which is the density function you are looking for.
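For completeness, substituting the explicit $\mathcal{N}(\mu, \sigma^2)$ density for $f_X$ gives the familiar lognormal form
$$ f_Y(t) = \frac{1}{t\,\sigma\sqrt{2\pi}}\, \exp\!\left(-\frac{(\log t - \mu)^2}{2\sigma^2}\right), \qquad t > 0,$$
which is the pdf of a lognormal distribution with parameters $\mu$ and $\sigma^2$.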

More generally, if $X$ and $Y$ are random variables such that $Y = g(X)$ for some strictly increasing, differentiable function $g: \mathbb{R} \to \mathbb{R}$, then we have $$f_Y(y) = f_X( g^{-1}(y) ) \cdot \frac{d}{dy}g^{-1}(y).$$

This is proved by following essentially the same steps that you did above.
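If you want a numerical sanity check, here is a minimal sketch (assuming NumPy and SciPy are available; the parameter values are arbitrary) that compares the density obtained by this change-of-variables argument with SciPy's built-in lognormal pdf:

```python
import numpy as np
from scipy import stats

# Arbitrary parameters for the underlying normal distribution X ~ N(mu, sigma^2).
mu, sigma = 0.5, 1.2

# Density of Y = exp(X) from the change-of-variables result: f_Y(y) = f_X(log y) / y.
def f_Y(y):
    return stats.norm.pdf(np.log(y), loc=mu, scale=sigma) / y

# SciPy's lognormal parameterization: shape s = sigma, scale = exp(mu).
y = np.linspace(0.01, 10, 500)
reference = stats.lognorm.pdf(y, s=sigma, scale=np.exp(mu))

# The maximum absolute difference should be at the level of floating-point noise.
print(np.max(np.abs(f_Y(y) - reference)))
```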
