random-variable – What Transformations Satisfy Covariance Relations?


Given two random variables (say standard Normals) that are not necessarily independent, are there non-linear functions $g$ for which
$$
\text{Cov}(g(X), g(Y)) \le c(g)\, \text{Cov}(X,Y),
$$

where $c(g)$ is some constant that may depend on the function $g$? In other words, are there classes of functions $g$ for which the covariance after transformation is not too different from the covariance of the underlying variables?
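
For reference, a linear $g(x) = ax + b$ (with $a$, $b$ arbitrary constants, purely for illustration) satisfies such a bound exactly, since bilinearity of covariance gives
$$
\text{Cov}(aX + b,\, aY + b) = a^2\, \text{Cov}(X, Y),
$$
so $c(g) = a^2$ works; the question is whether anything similar holds for non-linear $g$.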

Update: given the comments, my question is ill-posed. What I would like is an analog of Lipschitz continuity: for any two Gaussian random variables the above bound should hold, so that the function $g$ can alter the covariance of the original variables by at most a fixed constant factor.

Best Answer

Consider, as you say, standard Normal variates and $g(x) = x/10$ if $|x| < 100$, $0$ otherwise. It should be intuitively obvious that the covariance between $g(X)$ and $g(Y)$ is less than that of $X$ and $Y$, so $c(g) = 1$ will do.
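
Here is a quick Monte Carlo sketch of that example; the correlation $\rho = 0.7$ and the sample size are arbitrary illustrative choices, not part of the answer itself:

```python
import numpy as np

# Sketch: check the transformation g(x) = x/10 for |x| < 100, 0 otherwise,
# on a pair of correlated standard Normals.
rng = np.random.default_rng(0)

def g(x):
    return np.where(np.abs(x) < 100, x / 10, 0.0)

rho = 0.7                                # illustrative correlation of the underlying Normals
n = 1_000_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

cov_xy = np.cov(x, y)[0, 1]              # ≈ rho
cov_gg = np.cov(g(x), g(y))[0, 1]        # ≈ rho / 100, since |X| < 100 essentially always

print(f"Cov(X, Y)       ≈ {cov_xy:.4f}")
print(f"Cov(g(X), g(Y)) ≈ {cov_gg:.4f}")  # far below Cov(X, Y), so c(g) = 1 works here
```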

Clearly this approach generalizes to other distributions: rescale the variables to be much smaller over the region where virtually all the probability mass lies, then impose some trivial nonlinearity in the region where there is essentially no probability mass. The covariance of the transformed variables will (with a little care) be smaller than that of the original variables, so $c(g) = 1$ will do.
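
A sketch of the same recipe for a non-Normal case; the Exponential(1) variables, the shared component used to induce correlation, the $1/10$ rescaling, the cut-off at $50$ and the tail constant are all made-up illustrative choices:

```python
import numpy as np

# Sketch: correlated Exponential(1)-type variables, rescaled by 1/10 where
# nearly all the mass lies (x < 50) and mapped to a constant in the
# negligible-mass tail.
rng = np.random.default_rng(1)

def h(x):
    return np.where(x < 50, x / 10, 5.0)   # trivial nonlinearity out in the tail

n = 1_000_000
z = rng.exponential(1.0, size=n)            # shared component induces positive covariance
x = 0.5 * z + 0.5 * rng.exponential(1.0, size=n)
y = 0.5 * z + 0.5 * rng.exponential(1.0, size=n)

print(np.cov(x, y)[0, 1])                   # covariance of the originals (≈ 0.25)
print(np.cov(h(x), h(y))[0, 1])             # roughly 1/100 of it, so c(h) = 1 again suffices
```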
