[Math] Autocorrelation of a ±1-valued random process with certain statistics

st.statistics, stochastic-processes

Suppose $f(t)$ is a continuous-valued, zero-mean stochastic signal with Gaussian autocorrelation (with variance $\sigma^2$). Suppose I then pass this signal through a step function, producing a new $\pm 1$-valued signal $g(t)$ that has value $+1$ wherever $f(t) \ge 0$, and $-1$ wherever $f(t) < 0$. This signal is similar to a random telegraph signal, though perhaps with different statistics. My question is: does the autocorrelation of $g(t)$ have a simple form? If it were a random telegraph signal where the switching was described by a Poisson process, then I take it the autocorrelation would decay exponentially with the lag. But the switches between the two states ($+1$ and $-1$) don't seem to be independent here. Any guidance or references would be helpful.

More generally, suppose $f$ is composed with a "soft" step function, such as the error function $\mathrm{erf}(u)$. For instance, let $g(t)=\mathrm{erf}(\alpha f(t))$. For large $\alpha$, this approaches the case above. In general, what is the autocorrelation of $g$?
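For concreteness, here is a minimal simulation sketch of the setup (not part of the original question): it builds a discrete-time zero-mean Gaussian signal with a Gaussian-shaped autocovariance by smoothing white noise, hard-clips it to $\pm 1$, and estimates the autocorrelation of the clipped signal numerically. The sample count and kernel width are arbitrary illustrative choices.

```python
# Sketch of the setup: smooth white noise with a Gaussian kernel to get a
# zero-mean Gaussian signal with Gaussian-shaped autocovariance, then clip it.
# The sample count n and kernel width ell are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, ell = 200_000, 10.0
x = np.arange(-5 * ell, 5 * ell + 1)
kernel = np.exp(-x**2 / (2 * ell**2))        # Gaussian smoothing kernel

f = np.convolve(rng.standard_normal(n), kernel, mode="same")  # zero-mean Gaussian signal
g = np.where(f >= 0, 1.0, -1.0)                               # hard-clipped +/-1 signal

def autocorr(y, max_lag):
    """Empirical autocorrelation of y at lags 0..max_lag-1."""
    y = y - y.mean()
    return np.array([np.mean(y[: len(y) - k] * y[k:]) for k in range(max_lag)]) / y.var()

print(np.round(autocorr(g, 5 * int(ell)), 3))   # empirical autocorrelation of g
```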

Best Answer

Assume $(f(t))_t$ is Gaussian and centered with variance $\sigma^2$ and autocorrelation $E[f(t)f(s)]=c(t,s)$, and let $g(t)=\mathrm{sgn}(f(t))$. Then $(g(t))_t$ is $\pm1$-valued and centered, with autocorrelation $$ E[g(t)g(s)]=\frac{2}{\pi}\arcsin\left(\frac{c(t,s)}{\sigma^2}\right).$$ This is the classical arcsine law for hard-clipped Gaussian noise (often attributed to Van Vleck).
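As a quick sanity check (a sketch, not part of the original answer), one can sample the pair $(f(t),f(s))$ from a bivariate Gaussian with an arbitrarily chosen variance and correlation and compare the empirical value of $E[g(t)g(s)]$ with the arcsine formula:

```python
# Monte Carlo check of E[g(t)g(s)] = (2/pi) arcsin(c(t,s)/sigma^2);
# sigma2 and rho are arbitrary values chosen only for the check.
import numpy as np

rng = np.random.default_rng(0)
sigma2, rho = 2.0, 0.6                       # variance and correlation c(t,s)/sigma^2
cov = sigma2 * np.array([[1.0, rho], [rho, 1.0]])

ft, fs = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T
empirical = np.mean(np.sign(ft) * np.sign(fs))
predicted = (2 / np.pi) * np.arcsin(rho)
print(empirical, predicted)                  # should agree to within a few 1e-3
```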

Now I answer the OP's more general question.

The computation in the sign-function case is based on the fact that $(f(t),f(s))$ is a Gaussian vector distributed like $(N,aN+bN')$ with $a=c(t,s)/\sigma^2$ and $b=\sqrt{1-a^2}$, for two independent centered Gaussians $N$ and $N'$ with variance $\sigma^2$. One computes $$ E[g(t)g(s)]=2P[f(t)f(s)>0]-1=4P[N>0,\,N'\ge -uN]-1, $$ with $u=a/b$; the probability on the right is a classical two-dimensional Gaussian (orthant) integral, equal to $\tfrac14+\tfrac{1}{2\pi}\arcsin(a)$, and this yields the result.
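The orthant probability itself can be checked numerically; the sketch below uses SciPy's bivariate normal CDF (the value of $a$ is an arbitrary illustrative choice):

```python
# Check of the orthant probability P[N>0, N' >= -uN] = 1/4 + arcsin(a)/(2*pi).
# Since this event equals {N>0, aN+bN'>=0} and that pair has correlation a,
# we use the standardized pair and P[X<=0, Y<=0] = P[X>=0, Y>=0] by symmetry.
import numpy as np
from scipy.stats import multivariate_normal

a = 0.6                                                  # a = c(t,s)/sigma^2, arbitrary
orthant = multivariate_normal(mean=[0, 0], cov=[[1, a], [a, 1]]).cdf([0, 0])
print(orthant, 0.25 + np.arcsin(a) / (2 * np.pi))        # orthant probability
print(4 * orthant - 1, (2 / np.pi) * np.arcsin(a))       # recovers the arcsine law
```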

Likewise, if $g(t)=h(f(t))$ for a given odd function $h$ (oddness guarantees that $g$ is centered), then $E[g(t)g(s)]=E[h(N)h(aN+bN')]$, and it remains to compute this two-dimensional Gaussian integral.
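That integral is easy to evaluate numerically; the sketch below (not from the original answer) uses Gauss–Hermite quadrature for an arbitrary $h$. The helper name `corr_h` and the parameter values are illustrative choices; the linear case $h(u)=u$ serves as an exact sanity check, since then $E[h(N)h(aN+bN')]=a\sigma^2$.

```python
# Numerical evaluation of E[h(N) h(aN + bN')] by Gauss-Hermite quadrature,
# with N, N' i.i.d. centered Gaussians of variance sigma2 and b = sqrt(1 - a^2).
import numpy as np

def corr_h(h, sigma2, a, n=120):
    b = np.sqrt(1.0 - a * a)
    x, w = np.polynomial.hermite_e.hermegauss(n)   # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)                   # normalize to the standard normal density
    s = np.sqrt(sigma2)
    N, Np = s * x[:, None], s * x[None, :]         # 2-D grid over (N, N')
    return np.sum(w[:, None] * w[None, :] * h(N) * h(a * N + b * Np))

sigma2, a = 2.0, 0.6                               # arbitrary illustrative values
print(corr_h(lambda u: u, sigma2, a), a * sigma2)  # exact check: equals a * sigma^2
print(corr_h(np.tanh, sigma2, a))                  # e.g. a smooth odd h
```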

If $h(u)=\mathrm{erf}(\alpha u)$ as suggested by the OP, one can write $\mathrm{erf}(\alpha u)=E[\mathrm{sgn}(u+W)]$ for a centered Gaussian $W$ with variance $1/(2\alpha^2)$, independent of everything else. The sign-function result then applies to the Gaussian vector $(f(t)+W_1,f(s)+W_2)$, whose coordinates have variance $\sigma^2+1/(2\alpha^2)$ and covariance $c(t,s)$, and one gets $$ E[g(t)g(s)]=\frac{2}{\pi}\arcsin\left(\frac{c(t,s)}{\sigma^2+1/(2\alpha^2)}\right)=\frac{2}{\pi}\arcsin\left(\frac{2\alpha^2c(t,s)}{1+2\alpha^2\sigma^2}\right). $$ When $\alpha\to\infty$, this recovers the sign-function formula above.
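Again as a sanity check (a sketch; the values of $\alpha$, $\sigma^2$ and $c(t,s)$ are arbitrary), Monte Carlo sampling agrees with this expression:

```python
# Monte Carlo check of E[erf(alpha f(t)) erf(alpha f(s))]
#   = (2/pi) arcsin(2 alpha^2 c / (1 + 2 alpha^2 sigma^2));
# alpha, sigma2 and c are arbitrary illustrative values.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(1)
alpha, sigma2, c = 0.7, 2.0, 1.2
cov = [[sigma2, c], [c, sigma2]]

ft, fs = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T
empirical = np.mean(erf(alpha * ft) * erf(alpha * fs))
predicted = (2 / np.pi) * np.arcsin(2 * alpha**2 * c / (1 + 2 * alpha**2 * sigma2))
print(empirical, predicted)                      # should agree to within a few 1e-3
```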
