Derivatives – Derivative of Cumulative Normal Distribution Function with Respect to One of the Limits

derivatives, normal distribution

I have an optimization problem where gradient information is needed by the optimization tools. The problem: the objective function is defined using the complementary cumulative normal distribution function $F(a)$ (with mean $\mu$ and standard deviation $\sigma$). Is there a way to express the derivative with respect to $a$ in analytical form? Let $\phi$ denote the standard normal density function and $\Phi$ the standard cumulative normal distribution function.
Is it true that
$$\frac{\partial F(a)}{\partial a} = \frac{\partial}{\partial a}\int_a^{+\infty}\frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(t-\mu)^2}{2\sigma^2}\right)dt$$
$$\frac{\partial F(a)}{\partial a} = \frac{\partial}{\partial a}\left[1-\Phi\left(\frac{a-\mu}{\sigma}\right)\right]$$
holds? If so, can this be further simplified?
Thanks in advance!

Best Answer

$F(a) = 1 - \Phi((a-\mu)/\sigma)$, where $\Phi$ is the standard normal distribution function. Its derivative with respect to $a$ is therefore $-\phi((a-\mu)/\sigma)/\sigma$, where $\phi$ is the standard normal density function. That is, substitute $a$ for $t$ in your integrand and negate.

Of course, this is just a special case of https://en.wikipedia.org/wiki/Leibniz_integral_rule .
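As a quick sanity check of the closed form above, the sketch below (using only the Python standard library; the function names `survival` and `survival_grad` are mine, not from the question) compares the analytical derivative $-\phi((a-\mu)/\sigma)/\sigma$ against a central finite difference of $F(a)$:

```python
import math

def survival(a, mu=0.0, sigma=1.0):
    """Complementary CDF F(a) = P(X > a) for X ~ N(mu, sigma^2),
    written via erfc to avoid cancellation in the upper tail."""
    z = (a - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def survival_grad(a, mu=0.0, sigma=1.0):
    """Analytical derivative dF/da = -phi((a - mu)/sigma) / sigma."""
    z = (a - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return -phi / sigma

# Compare against a central finite difference at an arbitrary point.
a, mu, sigma, h = 1.3, 0.5, 2.0, 1e-6
fd = (survival(a + h, mu, sigma) - survival(a - h, mu, sigma)) / (2.0 * h)
print(survival_grad(a, mu, sigma), fd)  # the two values should agree closely
```

The same check works for any $(a, \mu, \sigma)$ with $\sigma > 0$; in an optimizer you would simply supply `survival_grad` (with the chain rule applied to however $a$ depends on your decision variables) as the gradient callback.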
