Bernoulli – Markov chain

markov-chains, probability, stochastic-processes

Let $Y_{n}$, $n ≥ 0$, be an i.i.d. sequence of Bernoulli$(p)$ random variables for some fixed $p ∈ (0, 1)$.

Let $X_{n} = Y_{n−1} + Y_{n}$ for $n ≥ 1$. Is $X_{n}$, $n ≥ 1$, a Markov chain? If not, explain why; if yes, determine the state space, the initial distribution, and the transition matrix.
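
(Unrolling the definition: $X_1 = Y_0 + Y_1$, $X_2 = Y_1 + Y_2$, and so on, so each $X_n$ takes values in $\{0, 1, 2\}$.)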

We've just started studying Markov chains. Informally, the Markov property says that what happens in the future depends only on the present state, not on the past.

For a Bernoulli sequence, the outcome of the $n^{th}$ trial is independent of the outcomes of trials $1, 2, \ldots, n-1$. If we know what happened at time $n-1$, no other information from the past is needed.

$$P(Y_{n+1} = j+1 \mid Y_{n} = j) = p$$ and

$$P(Y_{n+1} = j \mid Y_{n} = j) = 1-p.$$
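
One way to make the independence point precise: since the $Y_n$ are i.i.d., conditioning on the entire past changes nothing, for instance

$$P(Y_{n+1} = 1 \mid Y_{n} = y_{n}, \ldots, Y_{0} = y_{0}) = P(Y_{n+1} = 1) = p,$$

so the $Y$-sequence itself is trivially a Markov chain.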

How can I relate this information to $X_{n}$?

Best Answer

Hint: Suppose $Y_1 = 0, Y_2 = 1$; then $X_2 = 1$. What are the possibilities for $X_3$?

Now, suppose $Y_1 = 1, Y_2 = 0$. Again, $X_2 = 1$. What are the possibilities for $X_3$?
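
For a quick numerical sanity check (nothing rigorous; the value $p = 0.5$, the sample size, and all variable names below are arbitrary choices), one can simulate the two histories in the hint and estimate $P(X_3 = 2 \mid X_2 = 1)$ for each:

```python
import random

# Rough simulation sketch: estimate P(X_3 = 2 | X_2 = 1) separately for
# the two histories (Y_1, Y_2) = (0, 1) and (1, 0) from the hint.
# p = 0.5, the sample size, and the seed are arbitrary choices.
p = 0.5
n_trials = 200_000
random.seed(0)

# For each history (Y_1, Y_2): [runs with X_2 = 1, runs where X_3 = 2 followed]
counts = {(0, 1): [0, 0], (1, 0): [0, 0]}

for _ in range(n_trials):
    y1, y2, y3 = (int(random.random() < p) for _ in range(3))
    x2, x3 = y1 + y2, y2 + y3
    if (y1, y2) in counts:          # both histories of interest give X_2 = 1
        counts[(y1, y2)][0] += 1
        counts[(y1, y2)][1] += int(x3 == 2)

for hist, (n, k) in counts.items():
    print(f"Y_1, Y_2 = {hist}: estimated P(X_3 = 2 | X_2 = 1) = {k / n:.3f}")

# Roughly expected: about 0.5 for history (0, 1) and exactly 0.0 for (1, 0).
```

Since the two estimates differ, the conditional distribution of $X_3$ given $X_2 = 1$ depends on more than $X_2$ alone, which is exactly what the hint is pointing at.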