Solved – the combination of two independent continuous time Markov chains

markov-process

Assume a variable $x$ with two states, 0 and 1, that switches between them according to a continuous-time Markov chain. Its transition probabilities are represented by a matrix $P$, and the time spent sojourning in a state follows an exponential distribution with rate $\lambda$.

There is another variable $y$, independent of $x$, which also switches between 0 and 1 and has the same $P$ and $\lambda$ as $x$.

Now the question is: does the new variable $z = xy$ follow a continuous-time Markov chain? If so, what are its transition probabilities and its rate $\lambda$? ($z$ also has only two states, 0 and 1.)

I guess $z$ still follows a continuous-time Markov chain, but I have no idea how to prove it (maybe there is a known property stating this that I am not aware of?). Could anyone give some ideas? Many thanks!
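For concreteness, here is a minimal simulation sketch of the setup (with only two states every jump is a flip, so the embedded transition matrix $P$ plays no further role and each chain is determined by the sojourn rate $\lambda$; the function names and the choice $\lambda = 1$ are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ctmc(lam, t_end, x0=0):
    """Two-state CTMC that flips 0 <-> 1 after exponential(lam) sojourn times.
    Returns the jump times and the state held from each jump onward."""
    times, states = [0.0], [x0]
    t, x = 0.0, x0
    while True:
        t += rng.exponential(1.0 / lam)   # exponential sojourn, rate lam
        if t > t_end:
            break
        x = 1 - x                         # with two states, every jump is a flip
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

def state_at(times, states, t):
    """State of the piecewise-constant path at time t."""
    return states[np.searchsorted(times, t, side="right") - 1]

lam, t_end = 1.0, 100.0
tx, sx = simulate_ctmc(lam, t_end)        # the chain x_t
ty, sy = simulate_ctmc(lam, t_end)        # the independent chain y_t

# The product process z_t = x_t * y_t, evaluated on a grid:
grid = np.linspace(0.0, t_end, 1001)
z = np.array([state_at(tx, sx, t) * state_at(ty, sy, t) for t in grid])
print("fraction of time with z = 1:", z.mean())   # roughly 1/4 in stationarity
```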

Best Answer

The product $z_t$ is not a Markov Chain (MC).

Assume we know that $z_{t_0} = 0$ at some time $t_0$, meaning that $x_{t_0}=0$ or $y_{t_0}=0$. Then the past values $\{z_{u};\,u <t_0\}$ of $z_t$ still contain information about the future values $\{z_v;\, v > t_0\}$, in contradiction with the Markov property. Indeed, let $s_t := [x_t, \,y_t]$, so that $s_t$ is a MC taking the $4$ values written here as $00$, $01$, $10$ and $11$. At time $t_0$, only the first $3$ states are possible since the product is $0$. Let $t_0-W$ be the random time of the latest state change for $z_t$, which was $z: \,1 \rightarrow 0$. If $W$ is small, then most probably $s_{t_0}$ is $01$ or $10$, but not $00$: only one change of state occurred during the interval $(t_0-W, \,t_0)$. This in turn tells us that the next transition $0 \rightarrow 1$ will occur more quickly than if a large value of $W$ had been observed.
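This intuition can be checked by simulation. The sketch below (my own, assuming the symmetric switching rate $\lambda = 1$ from the question) simulates the pair $s_t$, records the lengths of the intervals on which $z_t$ stays at $0$, and compares the mean remaining wait for the next $0 \rightarrow 1$ transition after a short versus a long elapsed time $W$. If $z_t$ were a two-state CTMC, its sojourn in $0$ would be exponential, hence memoryless, and the two means would coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t_end = 1.0, 100_000.0

# Simulate the pair s_t = (x_t, y_t) directly: each coordinate flips
# independently at rate lam, so the pair jumps at total rate 2*lam and the
# flipped coordinate is chosen uniformly.
t, x, y = 0.0, 1, 1
z_prev, t_enter0 = 1, None
sojourns_at_0 = []          # lengths of the maximal intervals on which z = x*y is 0
while t < t_end:
    t += rng.exponential(1.0 / (2 * lam))
    if rng.random() < 0.5:
        x = 1 - x
    else:
        y = 1 - y
    z = x * y
    if z_prev == 1 and z == 0:
        t_enter0 = t                       # z has just dropped 1 -> 0
    elif z_prev == 0 and z == 1:
        sojourns_at_0.append(t - t_enter0)
    z_prev = z

T = np.array(sojourns_at_0)
for w in (0.2, 2.0):
    residual = T[T > w] - w                # remaining wait, given z has already spent w at 0
    print(f"elapsed W = {w}: mean remaining wait for 0 -> 1 is about {residual.mean():.2f}")
# If z were a two-state CTMC its sojourn at 0 would be memoryless and the two
# printed means would agree; they do not, matching the argument above.
```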

Here is a more formal derivation. For a fixed $t$ and a small $h > 0$ we have $$ \mathrm{Pr}\{z_{t-h} = 1\,\vert \,s_{t} = 00\} = o(h), \qquad \mathrm{Pr}\{z_{t-h} = 1\,\vert \, s_{t} = 01\} = \lambda h + o(h) $$ with $\lambda >0$. Indeed, the first probability requires two transitions of $s_t$ within a lag of $h$, while the second requires only one. Using Bayes' formula, $$ \mathrm{Pr}\{s_{t} = 00 \,\vert\, z_{t-h} = 1, \, z_{t}=0\} = \frac{\mathrm{Pr}\{ z_{t-h} = 1 \,\vert \,s_{t} = 00\} \, \mathrm{Pr}\{ s_{t} = 00\, \vert \, z_{t} = 0\}}{ \mathrm{Pr}\{ z_{t-h} = 1 \,\vert \, z_{t} = 0 \} }. $$ The numerator of the fraction is $o(h)$, while its denominator is easily found to be $\nu h + o(h)$ for some $\nu >0$, so the probability is $o(h)$. By contrast, $$ \mathrm{Pr}\{s_{t} = 01 \,\vert\, z_{t-h} = 1, \, z_{t}=0\} = \rho h + o(h) $$ for some $\rho > 0$. Thus, conditional on $\{z_{t-h} = 1, \, z_{t}=0\}$, the event $\{s_{t} = 01\}$ is much more probable than $\{s_{t} = 00\}$ for small $h$, as claimed.
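These orders in $h$ can also be verified numerically from the $4$-state generator of $s_t$ and a matrix exponential. The sketch below is only illustrative; it assumes the symmetric rate $\lambda = 1$, orders the states as $00, 01, 10, 11$, and uses the fact that this chain is reversible with uniform stationary law, so backward transition probabilities equal forward ones.

```python
import numpy as np
from scipy.linalg import expm

lam = 1.0
# Generator of the pair s_t = (x_t, y_t) on states ordered 00, 01, 10, 11:
# each coordinate flips independently at rate lam.
Q = lam * np.array([
    [-2,  1,  1,  0],   # 00 -> 01 or 10
    [ 1, -2,  0,  1],   # 01 -> 00 or 11
    [ 1,  0, -2,  1],   # 10 -> 00 or 11
    [ 0,  1,  1, -2],   # 11 -> 01 or 10
], dtype=float)

for h in (0.1, 0.01, 0.001):
    P_h = expm(Q * h)                       # transition probabilities over a lag h
    # The stationary law is uniform and Q is symmetric, so the time-reversed
    # transition probabilities equal the forward ones:
    #   Pr{z_{t-h}=1 | s_t = 00} = P_h[00, 11],  Pr{z_{t-h}=1 | s_t = 01} = P_h[01, 11].
    p_from_00 = P_h[0, 3]                   # needs two flips: O(h^2) = o(h)
    p_from_01 = P_h[1, 3]                   # needs one flip:  lam*h + o(h)
    # Posterior over s_t given z_{t-h}=1 and z_t=0; under stationarity the states
    # 00, 01, 10 are equally likely given z_t=0, so the prior cancels:
    post = np.array([p_from_00, p_from_01, P_h[2, 3]])
    post /= post.sum()
    print(f"h={h:<6} P(s_t=00 | z_(t-h)=1, z_t=0) = {post[0]:.4f}, "
          f"P(s_t=01 | z_(t-h)=1, z_t=0) = {post[1]:.4f}")
# As h -> 0 the first probability vanishes while the second tends to 1/2,
# matching the o(h) versus O(h) comparison above.
```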

Here $z_t$ follows a Hidden Markov Model (HMM) with hidden state $s_t$: it simply results from grouping the $3$ possible states of $s_t$ with product $0$ into the single observed value $z_t=0$. More generally, grouping the states of a MC $s_t$ yields an HMM but, in general, not a MC.
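As a final check of the grouping point, one can compare $\mathrm{Pr}\{z_{t+h}=1 \mid z_t=0\}$ with $\mathrm{Pr}\{z_{t+h}=1 \mid z_t=0,\, z_{t-h}=1\}$ under the stationary law of $s_t$; if $z_t$ were Markov these would agree. A small sketch (same illustrative generator, rate and state ordering as above):

```python
import numpy as np
from scipy.linalg import expm

lam, h = 1.0, 0.05
Q = lam * np.array([[-2, 1, 1, 0],
                    [ 1,-2, 0, 1],
                    [ 1, 0,-2, 1],
                    [ 0, 1, 1,-2]], dtype=float)   # states 00, 01, 10, 11
P = expm(Q * h)                                     # one-lag transition matrix
pi = np.full(4, 0.25)                               # stationary law (uniform)
Z = np.array([0, 0, 0, 1])                          # z = x*y for each pair state

def pr(z2, z1, z0=None):
    """Stationary Pr{z_{t+h}=z2 | z_t=z1} or Pr{z_{t+h}=z2 | z_t=z1, z_{t-h}=z0}."""
    idx1 = np.where(Z == z1)[0]
    idx2 = np.where(Z == z2)[0]
    if z0 is None:
        w = pi[idx1]                                # law of s_t given z_t = z1
    else:
        idx0 = np.where(Z == z0)[0]
        w = pi[idx0] @ P[np.ix_(idx0, idx1)]        # weight of s_t given z_{t-h}=z0, z_t=z1
    return (w @ P[np.ix_(idx1, idx2)]).sum() / w.sum()

print("Pr(z_{t+h}=1 | z_t=0)            =", round(pr(1, 0), 5))
print("Pr(z_{t+h}=1 | z_t=0, z_{t-h}=1) =", round(pr(1, 0, 1), 5))
print("Pr(z_{t+h}=1 | z_t=0, z_{t-h}=0) =", round(pr(1, 0, 0), 5))
# The last two numbers differ: knowing z_{t-h} changes the prediction of z_{t+h}
# even given z_t, so the grouped process z is not Markov.
```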
