[Math] Conditional Moment Generating Function With A Twist

probability, probability-distributions, probability-theory, real-analysis

Let $X$, $X'$ be identically distributed (but not necessarily independent) random variables with compact support, defined on the same probability space. Define

$G_t(x):=\mathbb{E}[e^{t(X'-X)} | X=x]$

In other words, $G_t$ is a kind of conditional moment generating function. Suppose now that for each $t>0$ the equation $G_t(x)=1$ has a unique solution $x=y_t$, and define

$I_t=\inf\left\{\,\left|\frac{G_t(y_t)-G_t(x)}{y_t-x}\right| \;:\; x\in\operatorname{supp}(X),\ x\neq y_t\right\}$

Question: Under what conditions on $X,X'$ is $I_t>0$ for every $t>0$?

I've tried supposing that $G_t$ is at least continuously differentiable. Moreover, if we suppose that the pair has a conditional density $f(x'|x)$ which is differentiable in $x$, then differentiating $G_t$ gives

$G_t'(x)=-tG_t(x)+\int e^{t(x'-x)}f_x(x'|x)\,dx'$

where the subscript denotes a derivative with respect to $x$. Plugging in $x=y_t$ simplifies this slightly, but the result doesn't seem especially tractable.
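For completeness, the differentiation step above, assuming we may differentiate under the integral sign (which the compact support should justify for a sufficiently regular density):

$$G_t'(x)=\frac{d}{dx}\int e^{t(x'-x)}f(x'|x)\,dx'=\int\left(-t\,e^{t(x'-x)}f(x'|x)+e^{t(x'-x)}f_x(x'|x)\right)dx'=-tG_t(x)+\int e^{t(x'-x)}f_x(x'|x)\,dx'.$$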

I've also tried testing $G_t$ for convexity, but this doesn't seem to work because of the conditioning inside the expectation.

I've also numerically tested numerous random variable pairs, all of which corroborated the positivity of $I_t$, so if a counterexample exists I expect it to be somewhat contrived.
I would sincerely appreciate references!
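For what it's worth, the kind of numerical test mentioned above can be sketched as follows. The particular pair ($X$ uniform on $\{-1,0,1\}$ with $X'=-X$) is a hypothetical choice of mine, not from the question; both marginals are uniform, so the pair is identically distributed but dependent, with compact support.

```python
import numpy as np

# Hypothetical test pair (my choice, not from the question): X uniform on
# {-1, 0, 1} and X' = -X. Both are uniform on {-1, 0, 1}, so X and X' are
# identically distributed (but dependent) with compact support.
support = np.array([-1.0, 0.0, 1.0])

def G(t, x):
    # Given X = x we have X' = -x deterministically, so the conditional
    # MGF is G_t(x) = E[exp(t(X'-X)) | X=x] = exp(-2*t*x).
    return np.exp(-2.0 * t * x)

for t in [0.5, 1.0, 2.0]:
    y_t = 0.0  # unique solution of G_t(x) = 1, since exp(-2*t*x) = 1 iff x = 0
    quotients = [abs((G(t, y_t) - G(t, x)) / (y_t - x))
                 for x in support if x != y_t]
    I_t = min(quotients)
    print(f"t = {t}: I_t = {I_t:.6f}")  # equals 1 - exp(-2*t) > 0
```

In this example the infimum works out to $1-e^{-2t}$, which is strictly positive for every $t>0$, consistent with the tests described above.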

Best Answer

Assume that the distribution of $X$ is symmetric and that $X'=SX$, where $S$ is independent of $X$ and $S=\pm1$ with equal probability. Then $X'$ and $X$ are identically distributed and, unless I am mistaken, $G_t(x)=\frac12\left(1+\mathrm e^{-2tx}\right)$, hence $y_t=0$ for every $t\ne0$. For every $t\gt0$ the function $x\mapsto G_t(x)$ is decreasing, hence, for every $x\ne y_t$, $$ \frac{G_t(y_t)-G_t(x)}{y_t-x}\lt0. $$ In particular, if the absolute value is dropped from the definition of $I_t$, the resulting infimum is negative for every $t\gt0$.
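The closed form above is easy to sanity-check numerically; a minimal sketch, where the value $t=1.5$, the grid of $x$ values, and the sample size are arbitrary choices, and only $S$ needs to be sampled once we condition on $X=x$:

```python
import numpy as np

# Check of the answer's closed form: with X' = S*X and S = +/-1 with equal
# probability, independent of X, conditioning on X = x leaves only S random:
# G_t(x) = E[exp(t*(X'-X)) | X=x] = (1/2)*(exp(0) + exp(-2*t*x)).
rng = np.random.default_rng(1)
n, t = 200_000, 1.5
results = []
for x in [-0.8, -0.2, 0.0, 0.3, 0.9]:
    s = rng.choice([-1.0, 1.0], size=n)       # samples of S
    emp = np.exp(t * (s * x - x)).mean()      # Monte Carlo estimate of G_t(x)
    exact = 0.5 * (1.0 + np.exp(-2.0 * t * x))
    results.append((emp, exact))
    print(f"x = {x:+.1f}: MC = {emp:.4f}, closed form = {exact:.4f}")
# Note G_t(0) = 1 exactly, matching y_t = 0.
```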
