If all powers of two random variables are uncorrelated, are they independent?

examples-counterexamples, moment-problem, probability, probability-theory

Let $X$ and $Y$ be random variables on a common probability space. If $$\def\E{\mathbb E}\E[X^nY^m]=\E[X^n]\,\E[Y^m]<\infty $$ for all integers $n,m\ge 0$, does it follow that $X$ and $Y$ are independent?

I strongly suspect the answer is no. In the same way that a random variable is not determined by its moments, a random vector is not determined by its joint moments. I have looked at several counterexamples of distinct random variables with the same moments, which I found here. The examples are derived by looking at characteristic functions, but I am not sure whether they can be generalized to multivariate characteristic functions.

This question shows that $\E[f(X)g(Y)]=\E[f(X)]\,\E[g(Y)]$ for all bounded and continuous $f$ and $g$ implies that $X$ and $Y$ are independent. My question is whether the same result holds when $f$ and $g$ are restricted to polynomials.

Best Answer

Updated answer

The answer is "it depends". See this MO question and the references therein, in particular this SE question. The latter claims that the result is true under certain growth assumptions on the moments.
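For orientation, the moment-growth assumption involved is presumably of Carleman type (this is my gloss, not a quote from the linked question). Carleman's criterion says that a distribution on $\mathbb R$ with finite moments $\mu_m$ of every order is uniquely determined by those moments whenever \begin{equation*} \sum_{m\ge1}\mu_{2m}^{-1/(2m)}=\infty. \end{equation*} If both marginals of $(X,Y)$ are determinate in this sense, the joint moments determine the joint law, and the factorization $\E[X^nY^m]=\E[X^n]\,\E[Y^m]$ then forces the joint law to be the product of the marginals, i.e. independence.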


Linked answer by Iosif Pinelis, copied (with notation lightly adapted) to make this post self-contained (not my own work!)

The answer is no. Indeed, let $U$ and $V$ be any independent random variables (r.v.'s) with different distributions but with the same finite moments of all orders: $$EU^m=EV^m=:\mu_m$$ for all natural $m$. A standard example of the distributions of such r.v.'s $U$ and $V$ is given in the answer by saz.
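For concreteness, the classical instance of such a pair is Heyde's lognormal example: the standard lognormal density $f$ and the perturbed density $g(x)=f(x)\,(1+\sin(2\pi\log x))$ share the moments $\mu_m=e^{m^2/2}$. (Whether this is exactly the pair in saz's answer is an assumption here.) A quick numerical sketch in Python:

```python
# Check numerically that the lognormal density and its sine perturbation
# share every moment mu_m = exp(m**2 / 2), although the densities differ.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# After substituting y = log(x), the m-th moment becomes an integral of
# exp(m*y) against the standard normal density phi(y); the perturbation
# adds a sin(2*pi*y) factor whose contribution vanishes for integer m.
for m in range(5):
    mu_f, _ = quad(lambda y: np.exp(m * y) * norm.pdf(y),
                   -np.inf, np.inf)
    mu_g, _ = quad(lambda y: np.exp(m * y) * norm.pdf(y)
                   * (1 + np.sin(2 * np.pi * y)),
                   -np.inf, np.inf)
    print(m, mu_f, mu_g, np.exp(m**2 / 2))  # three matching columns
```

The sine term contributes nothing to any integer moment: completing the square shifts $y\mapsto y+m$, which leaves $\sin(2\pi y)$ unchanged, and an odd function integrates to zero against the even Gaussian density.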

Let the cumulative distribution function (cdf) $F_{X,Y}$ of the random pair $(X,Y)$ be the half-and-half mixture of the cdf's $F_{U,V}$ and $F_{V,U}$, so that \begin{equation*} F_{X,Y}(x,y)=\frac{F(x)G(y)+G(x)F(y)}2 \tag{1} \end{equation*} for all real $x,y$, where $F$ and $G$ are the cdf's of $U$ and $V$, respectively. Then for the cdf's $F_X$ and $F_Y$ one has $F_X=F_Y=\frac{F+G}2$ and hence \begin{equation*} 4[F_X(x)F_Y(y)-F_{X,Y}(x,y)]=[F(x)-G(x)][F(y)-G(y)]\ne0 \end{equation*} for some real $x,y$, so that $X$ and $Y$ are not independent.
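Since $F\ne G$, there is a real $x$ with $F(x)\ne G(x)$, and taking $y=x$ above makes the right-hand side $[F(x)-G(x)]^2>0$. Under the same Heyde-pair assumption as before, this gap can be evaluated numerically:

```python
# Evaluate 4*[F_X(x)*F_Y(x) - F_{X,Y}(x, x)] = (F(x) - G(x))**2, where F is
# the standard lognormal cdf and G the cdf of its sine-perturbed version.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def G(x):
    # cdf of the perturbed lognormal, via the substitution y = log(t)
    val, _ = quad(lambda y: norm.pdf(y) * (1 + np.sin(2 * np.pi * y)),
                  -np.inf, np.log(x))
    return val

for x in (0.5, 1.0, 2.0):
    F_x = norm.cdf(np.log(x))               # lognormal cdf at x
    print(x, F_x, G(x), (F_x - G(x)) ** 2)  # last column strictly positive
```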

However, \begin{equation*} EX^mY^n=\tfrac12\,EU^m\, EV^n+\tfrac12\,EV^m\,EU^n=\mu_m\mu_n=EX^m\,EY^n \tag{2} \end{equation*} for all natural $m,n$.
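A Monte Carlo sketch of the whole construction (still assuming the Heyde lognormal pair; the rejection step uses $g\le 2f$) shows (2) empirically:

```python
# Sample (X, Y) from the half-and-half mixture (1) and check that the power
# moments factor as in (2), even though X and Y are not independent.
import numpy as np

rng = np.random.default_rng(0)

def sample_perturbed(size):
    # Rejection sampler: g <= 2*f, so accept a lognormal draw x with
    # probability (1 + sin(2*pi*log(x))) / 2.
    out = np.empty(0)
    while out.size < size:
        x = np.exp(rng.standard_normal(size))
        keep = rng.random(size) < (1 + np.sin(2 * np.pi * np.log(x))) / 2
        out = np.concatenate([out, x[keep]])
    return out[:size]

N = 10**6
U = np.exp(rng.standard_normal(N))   # standard lognormal draws
V = sample_perturbed(N)              # same moments, different law
swap = rng.random(N) < 0.5           # fair coin picks (U, V) or (V, U)
X = np.where(swap, V, U)
Y = np.where(swap, U, V)

for m, n in [(1, 1), (2, 1), (1, 2)]:
    lhs = (X**m * Y**n).mean()
    rhs = (X**m).mean() * (Y**n).mean()
    # all three agree up to Monte Carlo error
    print(m, n, lhs, rhs, np.exp((m**2 + n**2) / 2))
```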


Old answer, left as I feel it gives at least a non-zero amount of insight

I'm not sure whether this should be posted as an answer or a comment, since it is certainly not complete. That said, I went for "answer" so that people can comment on it directly, whether to agree, disagree, or just remark in general.

I think that if one restricts to random variables with compact support, then it may be true. (I don't have a complete proof! -- otherwise, I'd remove the "may".) By Stone-Weierstrass, one can uniformly approximate any continuous function on a compact interval by a sequence of polynomials. By inspection of the proof in the linked question, it seems that one may well be able to apply a two-limit argument. The issue, though, is that one wouldn't have monotone convergence.

In particular, one has to approximate $1(\cdot \in I)$ for a bounded, open interval $I$ -- or rather, since the indicator is discontinuous, continuous functions that approximate it. If the support of your random variable is $S \subseteq \mathbb R$ and is compact (or rather a subset of a compact set), then one can extend $S$ 'by $1$' to get $S'$ -- if $S = [a,b]$, then look at $S' = [a-1, b+1]$ -- and approximate $1(\cdot \in I)$ on $S'$. This may help when passing to the limits, as the approximating polynomials may well be better behaved on the larger set; see the sketch below.
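As a small illustration of that approximation step (the endpoints, the interval $I$, and the ramp width below are all illustrative choices, and Chebyshev interpolation stands in for whatever Stone-Weierstrass scheme one prefers):

```python
# Approximate a continuous surrogate of 1(. in I) by polynomials on the
# enlarged compact set S' = [a - 1, b + 1].
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

a, b = 0.0, 1.0                 # supposed support S = [a, b]
lo, hi = a - 1.0, b + 1.0       # enlarged set S'
c, d = 0.2, 0.8                 # target interval I = (c, d)
eps = 0.05                      # half-width of the linear ramps

def bump(x):
    # Continuous surrogate of the indicator: 1 on [c + eps, d - eps],
    # 0 outside (c - eps, d + eps), linear in between.
    return np.clip(np.minimum(x - (c - eps), (d + eps) - x) / (2 * eps),
                   0.0, 1.0)

xs = np.linspace(lo, hi, 2001)
for deg in (10, 40, 160):
    p = Chebyshev.interpolate(bump, deg, domain=[lo, hi])
    # sup-norm error on S' shrinks as the degree grows
    print(deg, np.max(np.abs(p(xs) - bump(xs))))
```

The interpolants converge uniformly on $S'$ (slowly, because of the ramp corners), but not monotonically -- which matches the caveat above about the lack of monotone convergence.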

Hopefully this answer is at least vaguely interesting!