[Math] A function of two cumulative probability distributions with same first 2 moments

integral-inequality, integration, probability-distributions, probability-theory, statistics

Let $\Phi_1$ and $\Phi_2$ be cumulative distribution functions with domain $[L, \infty)$, $L\geq 0$. Both distributions have the same expectation $\mu$ and the same (hence finite) second moment ($\textbf{a modification and added constraint to the earlier post}$), and the kurtosis of the distribution behind $\Phi_2$ is higher than that of $\Phi_1$ (new constraint).

Looking for whether $G'-G \geq 0$, with

$$G=1-\frac{1}{\mu}\int_L^\infty \left(1- \Phi_1(x)\right)^2 \, \mathrm{d} x$$

$$G'=1-\frac{1}{\mu}\int_L^\infty \left(1- \frac{1}{2}\left(\Phi_1(x)+\Phi_2(x)\right)\right)^2 \, \mathrm{d} x$$
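As a sanity check on these definitions (a numerical sketch, not part of the question), one can evaluate $G$ in a concrete case: for the exponential distribution on $[0,\infty)$ with rate $\lambda$ we have $1-\Phi_1(x)=e^{-\lambda x}$ and $\mu=1/\lambda$, so $G=1-\lambda\int_0^\infty e^{-2\lambda x}\,\mathrm{d}x=\tfrac12$ for any rate:

```python
import numpy as np

# Numerical sketch (not part of the question): evaluate
#   G = 1 - (1/mu) * integral_L^inf (1 - Phi1(x))^2 dx
# for Phi1 = Exp(lam) on [0, inf).  Analytically G = 1/2 for any rate lam.
lam = 2.0
mu = 1.0 / lam
x = np.linspace(0.0, 50.0 / lam, 500_001)    # truncate the negligible tail
survival = np.exp(-lam * x)                  # 1 - Phi1(x)
y = survival ** 2
integral = np.sum(0.5 * (y[1:] + y[:-1]) * (x[1] - x[0]))  # trapezoid rule
G = 1.0 - integral / mu
print(G)  # ≈ 0.5
```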

$\textbf{Approach: }$
Consider the difference $s(x)=\Phi_2(x)-\Phi_1(x)$, a square-integrable function $s:[L,\infty) \rightarrow (-1,1)$, so that
$\Phi_2(x)=\Phi_1(x) + s(x)$.

$$G'-G=\frac{1}{\mu}\left(\int_L^{\infty } s(x) \, dx- \int_L^{\infty } s(x) \Phi_1(x) \, dx-\frac{1}{4}\int_L^{\infty } s(x)^2 \, dx\right) $$
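This expansion can be checked numerically; a sketch using an arbitrary same-mean pair, $\mathrm{Exp}(1)$ and $\mathrm{Gamma}(2,\tfrac12)$ (chosen here only for illustration, both with mean $1$):

```python
import numpy as np

# Sketch: check G' - G = (1/mu) * ( ∫s - ∫ s*Phi1 - (1/4) ∫ s^2 )
# for an illustrative same-mean pair: Phi1 = Exp(1), Phi2 = Gamma(2, 1/2),
# both with mean mu = 1 on [0, inf).  The pair is arbitrary.
x = np.linspace(0.0, 60.0, 600_001)
dx = x[1] - x[0]
Phi1 = 1.0 - np.exp(-x)
Phi2 = 1.0 - np.exp(-2.0 * x) * (1.0 + 2.0 * x)
s = Phi2 - Phi1
mu = 1.0

def integral(y):
    # composite trapezoid rule on the uniform grid
    return np.sum(0.5 * (y[1:] + y[:-1])) * dx

G = 1.0 - integral((1.0 - Phi1) ** 2) / mu
Gp = 1.0 - integral((1.0 - 0.5 * (Phi1 + Phi2)) ** 2) / mu
rhs = (integral(s) - integral(s * Phi1) - 0.25 * integral(s * s)) / mu
print(Gp - G, rhs)  # the two expressions agree
```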

It looks like we have $\int_L^\infty s(x) \, dx=0$ and $\int_L^\infty x \, s(x) \, dx=0$: both distributions have the same first two moments and live on $[L,\infty)$, so integrating by parts gives $\int_L^\infty \left(1-\Phi_1(x)\right) dx= \int_L^\infty \left(1-\Phi_1(x)-s(x)\right) dx$ (equal means) and $\int_L^\infty x \left(1-\Phi_1(x)\right) dx= \int_L^\infty x \left(1-\Phi_1(x)-s(x)\right) dx$ (equal second moments).
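To illustrate, here is a sketch with two hyperexponential CDFs whose mixture parameters (hypothetical, chosen for this example) are solved so that both distributions have mean $1$ and second moment $3$; both $\int s$ and $\int x\,s(x)$ then vanish numerically:

```python
import numpy as np

# Sketch: two hyperexponential CDFs on [0, inf) with the same mean (1)
# and the same second moment (3).  The mixture parameters below are
# hypothetical, solved from p*a + (1-p)*b = 1 and p*a^2 + (1-p)*b^2 = 1.5.
def hyperexp_cdf(x, p, a, b):
    # mixture of Exp(mean a) and Exp(mean b) with weights p, 1 - p
    return 1.0 - p * np.exp(-x / a) - (1.0 - p) * np.exp(-x / b)

x = np.linspace(0.0, 120.0, 1_200_001)
dx = x[1] - x[0]
Phi1 = hyperexp_cdf(x, 0.5, 1.0 + np.sqrt(0.5), 1.0 - np.sqrt(0.5))
Phi2 = hyperexp_cdf(x, 0.25, 1.0 + 3.0 / np.sqrt(6.0), 1.0 - 1.0 / np.sqrt(6.0))
s = Phi2 - Phi1

def integral(y):
    # composite trapezoid rule on the uniform grid
    return np.sum(0.5 * (y[1:] + y[:-1])) * dx

print(integral(s), integral(x * s))  # both vanish (up to discretization)
```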

What are the bounds on $G'-G$? Are there calculation mistakes in the above?

We also have $s(L)=s(\infty)=0$ and $-\Phi_1(x)\leq s(x)\leq 1-\Phi_1(x)$. By Cauchy–Schwarz, we also get $\left(\int s(x) \Phi_1(x) \, dx\right)^2 \leq \int s(x)^2 \, dx \int \Phi_1(x)^2 \, dx$, but I can't see where this can be useful (note that $\int_L^\infty \Phi_1(x)^2 \, dx$ diverges, so the bound is vacuous as written).

Best Answer

I don't know if this particularly helps (it does not directly relate to the moments of the distributions with CDFs $F_1,F_2$, and relies on elementary calculations), but maybe it will foster further discussion. Let $f=\overline{F}_1$ and $g=\overline{F}_2$ be the survival functions, write $\|f\|^2=\int f^2(x)\,\mathrm{d}x$, and assume that $F_1,F_2$ have the same mean and that $\|f\|,\|g\|<\infty$. Then $G'-G\ge 0$ if and only if
$$ 2\int (f^2-g^2)+\int(f-g)^2\ge 0. $$
This shows immediately that if $\|f\|>\|g\|$ then $G'-G\ge 0$. Assume instead $\|f\|<\|g\|$ and denote $z=\|g\|/\|f\| >1$. Further transformations give another useful(?) equivalent condition:
$$ \|f\|\,\|g\| \left( \frac{3}{z}-z-2\,\frac{\langle f,g\rangle}{\|f\|\,\|g\|}\right)\ge 0. $$
The term $\frac{\langle f,g\rangle}{\|f\|\,\|g\|}$ is the 'angle' or 'correlation' between $f$ and $g$ in $L^2$ (not to be confused with the correlation between random variables with CDFs $F_1$ and $F_2$), hence takes values in $[-1,1]$. As a result $G'-G<0$ if $z>3$, while for $z\in(1,3)$ it can go either way (depending on the assumed 'correlation').
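The chain of equivalences can be verified numerically; a sketch with the arbitrary same-mean pair $F_1=\mathrm{Exp}(1)$, $F_2=\mathrm{Gamma}(2,\tfrac12)$ (both of mean $1$, chosen only as an example):

```python
import numpy as np

# Sketch of the answer's equivalences, with f = 1 - F1, g = 1 - F2:
#   4*mu*(G' - G) = 2∫(f^2 - g^2) + ∫(f - g)^2
#                 = ||f||*||g||*(3/z - z - 2*rho),
# where z = ||g||/||f|| and rho = <f,g>/(||f|| ||g||).
# Example pair (arbitrary, same mean 1): F1 = Exp(1), F2 = Gamma(2, 1/2).
x = np.linspace(0.0, 60.0, 600_001)
dx = x[1] - x[0]
f = np.exp(-x)                           # survival function of Exp(1)
g = np.exp(-2.0 * x) * (1.0 + 2.0 * x)   # survival function of Gamma(2, 1/2)
mu = 1.0

def integral(y):
    # composite trapezoid rule on the uniform grid
    return np.sum(0.5 * (y[1:] + y[:-1])) * dx

G = 1.0 - integral(f ** 2) / mu
Gp = 1.0 - integral((0.5 * (f + g)) ** 2) / mu
lhs = 4.0 * mu * (Gp - G)
cond1 = 2.0 * integral(f ** 2 - g ** 2) + integral((f - g) ** 2)
nf, ng = np.sqrt(integral(f * f)), np.sqrt(integral(g * g))
z = ng / nf
rho = integral(f * g) / (nf * ng)
cond2 = nf * ng * (3.0 / z - z - 2.0 * rho)
print(lhs, cond1, cond2)  # all three coincide (≈ -0.236 here)
```

Here $z=\sqrt{1.25}\approx 1.12\in(1,3)$ and the condition comes out negative, so $G'-G<0$ for this particular pair, an instance of the "either way" regime.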