Getting started with Markov chains

markov-chains, probability, random-variables, statistics, stochastic-processes

I have just started studying Markov chains from the last chapter (page 130) of the book by Seymour Lipschutz.

I haven't understood this particular sentence: "We now consider a sequence of trials whose outcomes, say, $X_1, X_2, \ldots$".

What do those random variables represent, say, in the following matrix?

A Markov Chain $(X_n)_n$ has the following transition matrix:
$$P = \begin{bmatrix}
0.1 & 0.3 & 0.6\\
0 & 0.4 & 0.6\\
0.3&0.2&0.5
\end{bmatrix}$$

with initial distribution $\alpha = (0.2, 0.3, 0.5)$.

What do the terms $X_n$ mean in this context?

What does the term, say, $P(X_9 = 2|X_1 = 2, X_5 = 1, X_7 = 3)$ mean in this context?

What does the term $EX_2$ mean in this context?

Best Answer

$X_i$ is the result of the $i$th "trial" in the sequence. The Markov chain has three states, and the transition matrix $P$ gives the probabilities of moving between states at any particular step of the chain: the entry in row $i$ and column $j$ is the probability of moving from state $i$ to state $j$. Note that the row sums of $P$ all equal $1$.
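To make the bookkeeping concrete, here is a minimal sketch (not part of the original answer, assuming NumPy) that stores the matrix from the question and checks that each row is a probability distribution; arrays are 0-indexed, so state $i$ corresponds to index $i-1$.

```python
import numpy as np

# Transition matrix from the question: P[i, j] is the probability of moving
# from state i+1 to state j+1 in one step (NumPy arrays are 0-indexed).
P = np.array([[0.1, 0.3, 0.6],
              [0.0, 0.4, 0.6],
              [0.3, 0.2, 0.5]])

# Each row is a conditional distribution over the next state, so it sums to 1.
print(P.sum(axis=1))  # [1. 1. 1.]
```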

For example, if the state at time $n$ is $1$, the probabilities for the state at time $n+1$ are given by the first row. The probability that the state at time $n+1$ equals $1$, given that the state at time $n$ was $1$, is $0.1$; this is written $P(X_{n+1} = 1 \mid X_n = 1) = 0.1$. The rest of the row gives the other transitions out of state $1$: $P(\text{state } 1 \rightarrow \text{state } 2) = 0.3$ and $P(\text{state } 1 \rightarrow \text{state } 3) = 0.6$.
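That reading of the rows is all you need to simulate the chain. A small sketch, again assuming NumPy: it draws $X_1$ from the initial distribution $\alpha$ given in the question and then draws each $X_{n+1}$ from the row of $P$ selected by $X_n$.

```python
import numpy as np

P = np.array([[0.1, 0.3, 0.6],
              [0.0, 0.4, 0.6],
              [0.3, 0.2, 0.5]])
alpha = np.array([0.2, 0.3, 0.5])   # initial distribution of X_1 over states 1, 2, 3

rng = np.random.default_rng(0)

state = rng.choice(3, p=alpha)      # 0-indexed value of X_1
path = [state + 1]
for _ in range(9):
    # The next state depends only on the current one: sample from row `state` of P.
    state = rng.choice(3, p=P[state])
    path.append(state + 1)

print(path)  # one realization of X_1, ..., X_10
```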

Lipschutz's point (ii) is important: the outcome of any trial depends at most upon the outcome of the immediately preceding trial.

This means that the conditional probability $$P(X_9 = 2 \mid X_1 = 2, X_5 = 1, X_7 = 3)$$ contains redundant conditioning information: by the Markov property, only the most recent given state, $X_7 = 3$, matters, so it equals $P(X_9 = 2 \mid X_7 = 3)$. It might be translated to: what is the probability of moving from state $3$ to state $2$ in exactly $9 - 7 = 2$ steps?
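To spell out that two-step computation (a standard Chapman-Kolmogorov step, not worked out in the original answer): the two-step transition probabilities are the entries of $P^2$, so
$$P(X_9 = 2 \mid X_7 = 3) = (P^2)_{3,2} = \sum_{k=1}^{3} P_{3,k}\,P_{k,2} = 0.3 \cdot 0.3 + 0.2 \cdot 0.4 + 0.5 \cdot 0.2 = 0.27.$$
With the NumPy array defined above, the same number comes out of `np.linalg.matrix_power(P, 2)[2, 1]` (0-indexed row 2, column 1).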