Markov Chains – When the Sum of Independent Markov Chains is a Markov Chain

Tags: examples-counterexamples, markov-chains, probability, probability-theory

I am trying to find as many cases as possible in which the chain $Z(t) = |X_1(t)-X_2(t)|$ is Markov, where $X_1(t)$ and $X_2(t)$ are independent, discrete-time, discrete-space (preferably non-homogeneous) Markov chains.
I started by searching for results on the sum of independent Markov chains, and I found this statement in Stoyanov, J. – Counterexamples in Probability (2nd ed., Wiley, 1997), p. 229 (one can find it in Google Books):

… the sum of two Markov processes need not be a Markov process. Note, however, that the sum of two independent Markov processes preserves this property.

That claim seems very strange to me, and I would like to find a proof of it, or at least the same statement stated elsewhere.


EDIT: Here is how a proof attempt might look if I wanted to show that the sum of two independent Markov chains is again a Markov chain:

Let $Y(n) = X_1(n)+X_2(n)$. Then

$P(Y(n+1) = i_{n+1} \ | \ Y(n) = i_n, …, Y(0)=i_0) =$

$= P(X_1(n+1)+X_2(n+1)=i_{n+1}\ | \ X_1(n)+X_2(n)=i_n,…,X_1(0)+X_2(0)=i_0)=$

$=/ (?) / = \sum_{j+k=i_{n+1}}P(X_1(n+1)=j,X_2(n+1)=k \ | \ \cdot)=$

$=/\text{X's are independent}/ = \sum_{j+k=i_{n+1}}P(X_1(n+1)=j \ | \ \cdot)\cdot P(X_2(n+1)=k \ | \ \cdot) = $

$= /\text{Markov property + (??) } / = \sum_{j+k=i_{n+1}}P(X_1(n+1)=j \ | \ X_1(n)+X_2(n)=i_n)\cdot P(X_2(n+1)=k \ | \ X_1(n)+X_2(n)=i_n)=$

$=P(X_1(n+1)+X_2(n+1)=i_{n+1} \ | \ X_1(n)+X_2(n)=i_n) = P(Y(n+1)=i_{n+1}\ | \ Y(n)=i_n)$.

Here I am uncertain about steps (?) and (??).

Best Answer

In general, the sum of two independent Markov chains is not a Markov chain.

Let $X$ be a random variable such that $\mathbb{P}(X=0) = \mathbb{P}(X=1) = \frac{1}{2}$ and set $X_n := X$ for all $n \in \mathbb{N}_0$. Obviously, $(X_n)_{n \in \mathbb{N}_0}$ is a Markov chain. Moreover, let $(Y_n)_{n \in \mathbb{N}_0}$, $Y_0 := 0$, be a Markov chain independent of $X$ with state space $\{-1,0,1\}$ and transition matrix

$$P := \begin{pmatrix} \frac{1}{4} & \frac{3}{4} & 0 \\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \\ 0 & \frac{3}{4} & \frac{1}{4} \end{pmatrix}.$$

Now set $Z_n := X_n+Y_n$. Then, by the independence of $X$ and $(Y_n)_{n \in \mathbb{N}_0}$, we have

$$\begin{align*} \mathbb{P}(Z_2 = 0 \mid Z_1 = 1, Z_0 = 1) &= \frac{\mathbb{P}(Z_2 = 0, Z_1 = 1, Z_0 = 1)}{\mathbb{P}(Z_1 =1, Z_0=1)} \\ &=\frac{\mathbb{P}(Y_2 = -1, Y_1 = 0, X= 1)}{\mathbb{P}(Y_1 =0, X=1)} \\ &= \frac{\mathbb{P}(Y_2 = -1, Y_1 = 0)}{\mathbb{P}(Y_1 = 0)} \frac{\mathbb{P}(X=1)}{\mathbb{P}(X=1)} \\ &= \mathbb{P}(Y_2 = -1 \mid Y_1 = 0) = \frac{1}{4}. \end{align*}$$

On the other hand, a very similar calculation shows that

$$\begin{align*} \mathbb{P}(Z_2 = 0 \mid Z_1 = 1) &= \frac{\mathbb{P}(Y_2 = -1, Y_1=0) \mathbb{P}(X=1) + \mathbb{P}(Y_2 = 0, Y_1=1) \mathbb{P}(X=0)}{\mathbb{P}(Y_1 = 0) \mathbb{P}(X=1) + \mathbb{P}(Y_1=1) \mathbb{P}(X=0)} \\ &= \frac{5}{12} \neq \frac{1}{4}. \end{align*}$$

This means that $(Z_n)_{n \in \mathbb{N}_0}$ is not a Markov chain.
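The two conditional probabilities can also be checked by exact enumeration of all paths $(X, Y_1, Y_2)$. Below is a minimal sketch in Python (not part of the original answer), using the setup above: $X$ is a fair coin frozen in time, $Y$ starts at $0$ with the given transition matrix, and $Z_n = X + Y_n$. Exact rational arithmetic via `fractions` avoids any floating-point doubt.

```python
from fractions import Fraction

# Transition matrix of Y on states {-1, 0, 1}, rows = current state.
states = [-1, 0, 1]
P = {
    -1: {-1: Fraction(1, 4), 0: Fraction(3, 4), 1: Fraction(0)},
     0: {-1: Fraction(1, 4), 0: Fraction(1, 2), 1: Fraction(1, 4)},
     1: {-1: Fraction(0),    0: Fraction(3, 4), 1: Fraction(1, 4)},
}

def path_prob(x, y1, y2):
    # X = x with prob 1/2; Y_0 = 0, then two transitions.
    return Fraction(1, 2) * P[0][y1] * P[y1][y2]

def prob(event):
    """Total probability of all paths (x, y1, y2) satisfying `event`."""
    return sum(path_prob(x, y1, y2)
               for x in (0, 1) for y1 in states for y2 in states
               if event(x, y1, y2))

# P(Z_2 = 0 | Z_1 = 1, Z_0 = 1); note Z_0 = X + Y_0 = X.
lhs = (prob(lambda x, y1, y2: x + y2 == 0 and x + y1 == 1 and x == 1)
       / prob(lambda x, y1, y2: x + y1 == 1 and x == 1))
print(lhs)  # 1/4

# P(Z_2 = 0 | Z_1 = 1): dropping Z_0 changes the answer.
rhs = (prob(lambda x, y1, y2: x + y2 == 0 and x + y1 == 1)
       / prob(lambda x, y1, y2: x + y1 == 1))
print(rhs)  # 5/12
```

Since the two values differ, conditioning on the extra past value $Z_0$ changes the distribution of $Z_2$, confirming that $(Z_n)$ is not Markov.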
