Expected Value – Using Leibniz’s Rule for Integrals with Infinite Bounds in Quantile Regression

Tags: expected value, quantile regression

Hi, I came across an application of Leibniz's rule in quantile regression:

$$ \frac{\partial E[\rho_\tau(y-c)]}{\partial c} = \frac{\partial}{\partial c} \left[(1-\tau) \int_{-\infty}^{c}(c-y) f_Y(y) \,dy + \tau \int_{c}^{\infty} (y-c) f_Y(y)\, dy \right]. $$

I am now a little confused as to which values to use for $a'(\theta)$ and $b'(\theta)$ in the Leibniz rule, which for $F(\theta) = \int_{a(\theta)}^{b(\theta)} f(\theta, x)\, dx$ states

$$ \frac{d}{d\theta} F(\theta) = f(\theta, b(\theta))\, b'(\theta) - f(\theta, a(\theta))\, a'(\theta) + \int_{a(\theta)}^{b(\theta)} \frac{\partial f(\theta, x)}{\partial \theta}\, dx. $$

Since my $a(\theta) = -\infty$ here, naively one gets $\frac{d}{dc}(-\infty) = 0$, which leads to the correct result if applied to every infinite limit, but it doesn't feel quite right. Is this the way to go, or should one instead use a distributional argument when plugging $\infty$ into $f_Y(y)$ as a density?
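(For intuition, here is a quick numerical check of the finite-bounds version of the rule; the choices $a(\theta)=\theta$, $b(\theta)=\theta^2$, and $f(\theta,x)=\sin(\theta x)$ are my arbitrary examples, not part of the original problem.)

```python
# Quick numerical check of the Leibniz rule on *finite* bounds
# (a, b, f below are my arbitrary example choices).
import numpy as np
from scipy.integrate import quad

a = lambda t: t                          # lower limit a(theta)
b = lambda t: t**2                       # upper limit b(theta)
f = lambda t, x: np.sin(t * x)           # integrand f(theta, x)
df_dt = lambda t, x: x * np.cos(t * x)   # its partial in theta

def F(t):
    return quad(lambda x: f(t, x), a(t), b(t))[0]

theta, h = 1.3, 1e-6
fd = (F(theta + h) - F(theta - h)) / (2 * h)  # finite difference

# Leibniz: f(theta, b) b' - f(theta, a) a' + integral of df/dtheta
leibniz = (f(theta, b(theta)) * 2 * theta     # b'(theta) = 2 theta
           - f(theta, a(theta)) * 1.0         # a'(theta) = 1
           + quad(lambda x: df_dt(theta, x), a(theta), b(theta))[0])

print(fd, leibniz)  # the two agree to ~1e-8
```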

The answer at Quantile regression: Loss function unfortunately doesn't spell out the first-order condition that I'm interested in.

Thank you guys!

Best Answer

You don't need any advanced differentiation rules or limiting arguments: the basics will suffice. These are the Fundamental Theorem of Calculus, the sum rule of differentiation, and the Chain Rule.


Let $$F(x) = \int_{-\infty}^x f_Y(y)\,\mathrm{d}y.$$ The Fundamental Theorem of Calculus tells us $F$ is differentiable with derivative $$F^\prime(x) = f_Y(x).$$ Similarly, the function $$G(x) = \int_{-\infty}^x yf_Y(y)\,\mathrm{d}y$$ (assuming it converges) is differentiable with derivative $$G^\prime(x) = xf_Y(x).$$
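As a sanity check, here is a minimal numerical sketch of these two derivative claims, assuming for illustration that $Y$ is standard normal (my choice; any density works):

```python
# Minimal sketch (my example, not from the answer): check F'(x) = f_Y(x)
# and G'(x) = x f_Y(x) by finite differences, assuming Y ~ N(0, 1).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def F(x):
    # F(x) = integral_{-inf}^x f_Y(y) dy
    return quad(norm.pdf, -np.inf, x)[0]

def G(x):
    # G(x) = integral_{-inf}^x y f_Y(y) dy
    return quad(lambda y: y * norm.pdf(y), -np.inf, x)[0]

x, h = 0.7, 1e-5
print((F(x + h) - F(x - h)) / (2 * h), norm.pdf(x))      # both ~ 0.3123
print((G(x + h) - G(x - h)) / (2 * h), x * norm.pdf(x))  # both ~ 0.2186
```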

(If you are familiar with this theorem applied only to finite intervals, write $$F(x) = \int_{-\infty}^0 f_Y(y)\,\mathrm{d}y + \int_0^x f_Y(y)\,\mathrm{d}y,$$ notice that the first term is a constant, and apply the sum rule of differentiation. Use the same method to find $G^\prime.$)

For convenience, write $G(\infty)=\lim_{x\to\infty}G(x)$ (which we assume exists and is finite) and $F(\infty) = \lim_{x\to\infty} F(x) = 1.$

Define the function $g:\mathbb{R}^4\to\mathbb{R}$ as

$$\begin{aligned} g(a,b,c,d) &= (1-\tau) \int_{-\infty}^{a}(b-y) f_Y(y) \,\mathrm{d}y \, + \tau \int_{c}^{\infty} (y-d) f_Y(y) \mathrm{d}y \\ &= (1-\tau) \left[bF(a) - G(a)\right] + \tau \left[G(\infty)-G(c) - d(F(\infty)-F(c))\right]. \end{aligned}$$

By virtue of the preceding results, $g$ is differentiable with derivative

$$\begin{aligned}Dg(a,b,c,d) &= \left(\frac{\partial g}{\partial a},\frac{\partial g}{\partial b},\frac{\partial g}{\partial c},\frac{\partial g}{\partial d}\right)(a,b,c,d)\\ & = ((1-\tau)\left[b f_Y(a)-a f_Y(a)\right],\ (1-\tau) F(a),\\ &\quad\quad\quad \tau \left[d f_Y(c)- c f_Y(c)\right],\ -\tau (1-F(c))). \end{aligned}$$
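If you want to convince yourself of this gradient, here is a finite-difference sketch, again assuming a standard normal $Y$, an arbitrary $\tau = 0.25$, and an arbitrary evaluation point (all my illustrative choices):

```python
# Finite-difference check of the four partials of g (my sketch, assuming
# Y ~ N(0, 1) and tau = 0.25; the evaluation point is arbitrary).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

tau = 0.25
f_Y, F = norm.pdf, norm.cdf

def g(a, b, c, d):
    left = quad(lambda y: (b - y) * f_Y(y), -np.inf, a)[0]
    right = quad(lambda y: (y - d) * f_Y(y), c, np.inf)[0]
    return (1 - tau) * left + tau * right

p = np.array([0.2, -0.4, 0.9, 1.1])  # (a, b, c, d)
a, b, c, d = p
h = 1e-5

# Closed-form gradient from the display above
grad = [(1 - tau) * (b - a) * f_Y(a),
        (1 - tau) * F(a),
        tau * (d - c) * f_Y(c),
        -tau * (1 - F(c))]

for i in range(4):  # central differences, one coordinate at a time
    e = np.zeros(4); e[i] = h
    fd = (g(*(p + e)) - g(*(p - e))) / (2 * h)
    print(fd, grad[i])  # each pair agrees to high precision
```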

Let $\iota:\mathbb R \to \mathbb{R}^4$ be the function

$$\iota(c) = (c,c,c,c).$$

Its derivative is (obviously) the constant vector $D\iota(c)=(1,1,1,1)^\prime.$ But because

$$E\left[\rho_\tau(y-c)\right] = g(c,c,c,c) = g(\iota(c)) = (g \circ \iota)(c) ,$$

the (multivariate) Chain Rule yields

$$\begin{aligned} \frac{\mathrm{d}E\left[\rho_\tau(y-c)\right]}{\mathrm{d}c} &= (Dg)(\iota(c)) \circ D\iota(c)\\ &=(1-\tau)\left[c f_Y(c)-c f_Y(c) + F(c)\right] + \\ & \quad\quad\tau \left[c f_Y(c)- c f_Y(c) - (1-F(c))\right]\\ &= (1-\tau)F(c) - \tau\left(1-F(c)\right)\\ &= F(c) - \tau. \end{aligned}$$
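Setting this derivative to zero gives the first-order condition $F(c) = \tau$, so the minimizer is the $\tau$-quantile of $Y$. As a final check, the result can be verified numerically; the sketch below again assumes a standard normal $Y$ and $\tau = 0.25$ (my illustrative choices):

```python
# Sketch verifying d/dc E[rho_tau(Y - c)] = F(c) - tau, assuming
# Y ~ N(0, 1) and tau = 0.25 (both my illustrative choices).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

tau = 0.25

def expected_loss(c):
    # E[rho_tau(Y - c)] as the two one-sided integrals from the question
    below = quad(lambda y: (c - y) * norm.pdf(y), -np.inf, c)[0]
    above = quad(lambda y: (y - c) * norm.pdf(y), c, np.inf)[0]
    return (1 - tau) * below + tau * above

c, h = 0.3, 1e-5
fd = (expected_loss(c + h) - expected_loss(c - h)) / (2 * h)
print(fd, norm.cdf(c) - tau)  # both ~ 0.3679
```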
