Tight Lower Bound for Expectation of Product of Two Positive Valued Random Variables – Probability

cauchy-schwarz-inequality, expectation, inequalities, pr.probability, probability-distributions

Let $X,Y$ be two (dependent) random variables with $\mathbb{P}(X\ge 0)=\mathbb{P}(Y\ge 0)=1$.

I want to find a tight lower bound on $\mathbb{E}(XY)$ given that $X,Y$ are non-negative almost surely.

Note that one trivial lower bound is $0$. But I would like to know whether there is an elegant method in the literature for finding a tight lower bound that uses only the individual moments of the random variables $X,Y$.

I realize that the following lower bound can be obtained simply from the Cauchy–Schwarz inequality: $$\mathbb{E}(XY)\ge \mu_X\mu_Y-\sigma_X\sigma_Y,$$ where $\mu_X=\mathbb{E}X$, $\sigma_X^2=\mathrm{Var}(X)$, and $\mu_Y,\sigma_Y$ are defined similarly for $Y$. However, I don't know whether this lower bound is non-negative when $X,Y$ are non-negative almost surely. A little algebra shows that this claim is equivalent to the following claim: $$\frac{\mu_X^2}{\mu_{X^2}}+\frac{\mu_Y^2}{\mu_{Y^2}}\ge 1,$$ where $\mu_{X^2}=\mathbb{E}(X^2)$ and $\mu_{Y^2}=\mathbb{E}(Y^2)$. When $X=Y$, this claim reduces to $$\mu_X^2\ge \frac{\mu_{X^2}}{2},$$ which is false in general: for instance, for $X\sim\mathrm{Bernoulli}(p)$ with $p<1/2$ we have $\mu_X^2=p^2<p/2=\mu_{X^2}/2$. Any ideas how I can proceed? Thanks in advance.
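A quick numerical sketch (the choice $X=Y\sim\mathrm{Bernoulli}(0.1)$ is just an illustrative assumption) confirming that the Cauchy–Schwarz bound can be negative even for non-negative variables:

```python
# For X = Y ~ Bernoulli(p) with small p, the Cauchy-Schwarz lower bound
# mu_X*mu_Y - sigma_X*sigma_Y is negative even though X, Y >= 0 a.s.
p = 0.1
mu = p                    # E[X] for Bernoulli(p)
second = p                # E[X^2] = p, since X^2 = X
var = second - mu**2      # Var(X) = p(1 - p) = 0.09
cs_bound = mu * mu - var  # Cauchy-Schwarz lower bound with X = Y
print(mu**2 < second / 2)  # True: mu_X^2 >= mu_{X^2}/2 fails here
print(cs_bound)            # -0.08 < 0: the bound is negative
```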

Best Answer

$\newcommand{\de}{\delta} \newcommand{\si}{\sigma} \newcommand{\ep}{\varepsilon}$ Let us present the exact lower bound on $EXY$ in terms of $\mu_1:=\mu_X$, $\mu_2:=\mu_Y$, $\si_1:=\si_X$, $\si_2:=\si_Y$, as follows:

The minimum of $EXY$ over all nonnegative random variables (r.v.'s) $X$ and $Y$ with prescribed positive values of $\mu_1$, $\mu_2$, $\si_1$, $\si_2$ is \begin{equation*} (\mu_1\mu_2-\si_1\si_2)_+, \tag{1} \end{equation*} where $u_+:=\max(0,u)$.

Indeed, by rescaling, without loss of generality \begin{equation*} \mu_1=\mu_2=1. \end{equation*} Let \begin{equation*} v_j:=1+\si_j^2,\quad r_j:=1/v_j; \end{equation*} everywhere here $j\in\{1,2\}$. Observe that \begin{equation*} \mu_1\mu_2-\si_1\si_2\le0\iff\si_1\si_2\ge1\iff r_1+r_2\le1. \tag{2} \end{equation*} Consider the two possible cases, according to this observation.

Case 1: $r_1+r_2\le1$. Then the bound (1) is $0$. On the other hand, let $S:=\{0,1,2\}$ with the probability measure on $S$ assigning masses $1-r_1-r_2,r_1,r_2$ to the points $0,1,2$, respectively. Let r.v.'s $X$ and $Y$ be defined on $S$ as follows: \begin{equation*} X(0)=X(2)=0,\ X(1)=v_1,\quad Y(0)=Y(1)=0,\ Y(2)=v_2. \end{equation*} Then \begin{equation*} EX=EY=1,\quad \operatorname{Var}X=\si_1^2,\quad \operatorname{Var}Y=\si_2^2,\quad EXY=E0=0. \end{equation*} So, the bound $(1)$ is exact in Case 1.
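The Case 1 construction is easy to check numerically. Here is a small sketch; the particular values $\si_1=\si_2=1.5$ (so that $\si_1\si_2\ge1$) are an illustrative assumption, not part of the answer:

```python
# Three-point construction of Case 1, with mu_1 = mu_2 = 1 and
# sigma_1 = sigma_2 = 1.5, so sigma_1*sigma_2 = 2.25 >= 1.
s1 = s2 = 1.5
v1, v2 = 1 + s1**2, 1 + s2**2     # v_j = 1 + sigma_j^2
r1, r2 = 1 / v1, 1 / v2
probs = [1 - r1 - r2, r1, r2]     # masses at the points 0, 1, 2
X = [0.0, v1, 0.0]                # X(0)=X(2)=0, X(1)=v_1
Y = [0.0, 0.0, v2]                # Y(0)=Y(1)=0, Y(2)=v_2

EX   = sum(w * x for w, x in zip(probs, X))
EY   = sum(w * y for w, y in zip(probs, Y))
VarX = sum(w * x**2 for w, x in zip(probs, X)) - EX**2
VarY = sum(w * y**2 for w, y in zip(probs, Y)) - EY**2
EXY  = sum(w * x * y for w, x, y in zip(probs, X, Y))
print(EX, EY, VarX, VarY, EXY)  # 1, 1, sigma_1^2, sigma_2^2, 0
```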

Case 2: $r_1+r_2\ge1$. Then the bound (1) follows indeed by Cauchy--Schwarz: \begin{equation*} EXY-\mu_1\mu_2=E(X-\mu_1)(Y-\mu_2)\ge-\si_1\si_2. \end{equation*} By (2), in Case 2 we have $\si_1\si_2\le1$. So, we can find $p\in(0,1)$ such that for $q:=1-p$ we have \begin{equation*} \si_2^2\le p/q\le1/\si_1^2. \end{equation*} Let now $S:=\{0,1\}$ with the probability measure on $S$ assigning masses $q,p$ to the points $0,1$, respectively. Let r.v.'s $X$ and $Y$ be defined on $S$ as follows: \begin{equation*} X(0)=1-\si_1\sqrt{p/q},\quad X(1)=1+\si_1\sqrt{q/p}, \end{equation*} \begin{equation*} Y(0)=1+\si_2\sqrt{p/q},\quad Y(1)=1-\si_2\sqrt{q/p}. \end{equation*} Then $X,Y\ge0$, \begin{equation*} EX=EY=1,\quad \operatorname{Var}X=\si_1^2,\quad \operatorname{Var}Y=\si_2^2,\quad EXY=1-\si_1\si_2. \end{equation*} So, the bound $(1)$ is exact in Case 2 as well. $\Box$
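The Case 2 construction can be verified the same way. In this sketch the values $\si_1=0.5$, $\si_2=0.8$ (so $\si_1\si_2\le1$) and the choice $p=1/2$ are illustrative assumptions; any $p$ with $\si_2^2\le p/q\le 1/\si_1^2$ works:

```python
import math

# Two-point construction of Case 2, with mu_1 = mu_2 = 1,
# sigma_1 = 0.5, sigma_2 = 0.8, so sigma_1*sigma_2 = 0.4 <= 1.
s1, s2 = 0.5, 0.8
p = 0.5                  # any p with s2**2 <= p/(1-p) <= 1/s1**2 works
q = 1 - p
probs = [q, p]           # masses at the points 0, 1
X = [1 - s1 * math.sqrt(p / q), 1 + s1 * math.sqrt(q / p)]
Y = [1 + s2 * math.sqrt(p / q), 1 - s2 * math.sqrt(q / p)]

EX   = sum(w * x for w, x in zip(probs, X))
EY   = sum(w * y for w, y in zip(probs, Y))
VarX = sum(w * x**2 for w, x in zip(probs, X)) - EX**2
VarY = sum(w * y**2 for w, y in zip(probs, Y)) - EY**2
EXY  = sum(w * x * y for w, x, y in zip(probs, X, Y))
print(min(X) >= 0 and min(Y) >= 0)  # True: X, Y >= 0
print(EXY, 1 - s1 * s2)             # both 0.6: the bound (1) is attained
```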

You were pretty close to this answer.
