Sum of iid random variables is a Markov chain

Tags: markov-chains, markov-process, probability, probability-theory

Suppose I have a sequence of iid random variables $Z_i$ for $i=0,1,2,3,\dots$ such that $$\Bbb P(Z_i=z)=
\begin{cases}
p, &\text{if } z=1 \\
1-p, & \text{if } z=0.
\end{cases}$$

Define $S_k=\sum^k_{i=0}Z_i$ and $\prod_n=\prod^n_{i=0}Z_i$.

As an exercise in my textbook, I have to determine whether $X_n=S_n$ and $Y_n=\prod_n$ are Markov chains.

What I tried to do is express $X_{n+1}$ in terms of $X_n$ and something generated only at time $n+1$: $X_{n+1}=Z_{n+1}+\sum^n_{i=0}Z_i = X_n + Z_{n+1}$. From this I can somehow show that $X_n$ is Markov? I'm not quite sure how to use the probabilities of the $Z_i$. I'm stuck here; any help would be appreciated.
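One way the probabilities of the $Z_i$ enter: since $X_{n+1}=X_n+Z_{n+1}$ and $Z_{n+1}\in\{0,1\}$, the conditional law of the next state can be written out explicitly. A sketch of the transition probabilities (assuming the chain is $X_0=Z_0$, $X_{n+1}=X_n+Z_{n+1}$ as above):

```latex
\mathbb P(X_{n+1}=x+1 \mid X_n=x) = \mathbb P(Z_{n+1}=1) = p, \qquad
\mathbb P(X_{n+1}=x   \mid X_n=x) = \mathbb P(Z_{n+1}=0) = 1-p.
```

These probabilities depend on the history only through the current state $x$, which is exactly the Markov property.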

Edit:

I solved this using the reasoning in the comments below, and I would like to check whether my reasoning is correct:

1) Since $X_{n+1}= X_n +Z_{n+1}$, it follows that $X_n$ is Markov: the $(n+1)$-th value does not depend on any of $X_0,\dots,X_{n-1}$. It depends only on $X_n$ and a random variable $Z_{n+1}$ that is independent of $X_0,\dots,X_n$.

This I don't quite understand myself. Why is $X_{n+1}= X_n +Z_{n+1}$ independent of $X_0,\dots,X_{n-1}$? Isn't $X_n=X_{n-1} + Z_n$? (And if we iterate this down to $X_1$, then $X_{n+1}$ equals $X_0 + Z_1 +\dots+Z_n+Z_{n+1}$, so it does depend on those values?)

For example, what if $(Z_0,\dots,Z_n)=(0,\dots,0)$? Then $X_{n+1}$ would equal $0$ or $1$. But if I instead choose $Z_0$ to be $1$, then the sum would be $1$ or $2$. So this means it is dependent on $X_0,\dots,X_{n-1}$? Am I missing something here? Also, since the exercise gives the probabilities with which each $Z_i$ takes values in $\{0,1\}$, how are we supposed to use them?
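The Markov property is about the *conditional* law of the next step given the present, not about unconditional independence from the past. One way to see that earlier values don't change that conditional law is to estimate it by simulation. A minimal sketch (the two particular histories $(1,0,0,1)$ and $(0,1,1,0)$ are arbitrary choices that share the same $X_3=2$ but have different earlier values):

```python
import random

# Monte Carlo check of the Markov property for X_n = Z_0 + ... + Z_n.
# Both histories below end at X_3 = 2 but differ in X_0, X_1, X_2;
# the fraction of runs where the next step goes up should be about p for each.
random.seed(0)
p = 0.3
N = 200_000

histories = {"A": (1, 0, 0, 1), "B": (0, 1, 1, 0)}  # same sum, different past
counts = {k: [0, 0] for k in histories}  # [times history occurred, times Z_4 = 1]

for _ in range(N):
    z = [1 if random.random() < p else 0 for _ in range(5)]
    for k, h in histories.items():
        if tuple(z[:4]) == h:
            counts[k][0] += 1
            counts[k][1] += z[4]

for k in histories:
    occurred, went_up = counts[k]
    print(k, occurred, round(went_up / occurred, 3))
```

Both estimated conditional probabilities come out near $p$: knowing the full history $(Z_0,\dots,Z_n)$ rather than just $X_n$ does not change the distribution of the next increment.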

2) It's Markov for the same reason as 1).

I have the same doubts about the second one. What if $Z_0$ equals $0$? Then the whole product equals $0$. And if $Z_0$ is $1$ and all the others are $1$, then the product will be $1$ or $0$ (depending on the value of $Z_{n+1}$). So this again depends on the preceding values? Could someone please explain whether these answers are correct, why my reasoning is wrong, and maybe give an intuitive explanation of why these are Markov chains? I would really like to understand this intuitively.
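For the product chain the same bookkeeping can be made explicit: the state $0$ is absorbing, and from state $1$ the next factor $Z_{n+1}$ decides the move. A sketch of the transition probabilities:

```latex
\mathbb P(Y_{n+1}=1 \mid Y_n=1) = p, \qquad
\mathbb P(Y_{n+1}=0 \mid Y_n=1) = 1-p, \qquad
\mathbb P(Y_{n+1}=0 \mid Y_n=0) = 1.
```

Intuitively, $Y_n$ already carries everything the past can tell you: if $Y_n=0$, the product stays $0$ forever regardless of *which* $Z_i$ caused it, and if $Y_n=1$, then every $Z_i$ so far must have been $1$.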

Also, if possible, could someone please say whether 3) $X_n= S_0+…+S_n$ and 4) $X_n= (S_n,S_0+…+S_n)$ are Markov.

I'm also interested in how I could prove these answers rigorously from the definition: $X$ is Markov if $\Bbb P(X_{n+1}=x_{n+1} \mid X_0=x_0,\dots,X_n=x_n)= \Bbb P(X_{n+1}=x_{n+1} \mid X_n=x_n)$ for all $n$ and all states $x_0,\dots,x_{n+1}$.
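For the sum, the definition can be verified directly; the key step is that $Z_{n+1}$ is independent of $(Z_0,\dots,Z_n)$ and hence of $(X_0,\dots,X_n)$. A sketch:

```latex
\mathbb P(X_{n+1}=x_{n+1} \mid X_0=x_0,\dots,X_n=x_n)
  = \mathbb P(Z_{n+1}=x_{n+1}-x_n \mid X_0=x_0,\dots,X_n=x_n)
  = \mathbb P(Z_{n+1}=x_{n+1}-x_n),
```

which depends on $(x_0,\dots,x_n)$ only through $x_n$. Conditioning on $X_n=x_n$ alone gives the same value, so both sides of the definition agree.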

I have asked this question on CV but did not get any answers (even with a bounty). I'd really like to understand it!

Best Answer

If $f$ is a bounded measurable real-valued function, then since $S_{n+1}=S_n+Z_{n+1}$, $$ \mathbb E[f(S_{n+1})\mid\mathcal F_n] = \mathbb E[f(S_n+Z_{n+1})\mid\mathcal F_n]. $$ Since $S_n$ is $\mathcal F_n$-measurable and $Z_{n+1}$ is independent of $\mathcal F_n$, the right-hand side is a function of $S_n$ alone, so $$ \mathbb E[f(S_n+Z_{n+1})\mid\mathcal F_n] = \mathbb E[f(S_n+Z_{n+1})\mid S_n], $$ so that $\{S_n\}$ is a Markov chain.

Similarly, we have $$ \mathbb E[f(Y_{n+1}) \mid\mathcal F_n] = \mathbb E[f(Y_nZ_{n+1})\mid\mathcal F_n] = \mathbb E[f(Y_nZ_{n+1})\mid Y_n], $$ so that $\{Y_n\}$ is a Markov chain.
