For starters, you have the wrong transition matrix. You want $P$ to be such that
$$P{1\brack 0}={\frac{1}{3}\brack \frac{2}{3}}\;,$$
meaning that if you start in state $A$, after one transition it’s still in state $A$ with probability $\frac13$ and in state $B$ with probability $\frac23$. Similarly, you want
$$P{0\brack 1}={4/5\brack 1/5}\;,$$
meaning that if you start with something in state $B$, after one transition it’s still in state $B$ with probability $\frac15$ and has moved over to state $A$ with probability $\frac45$. This means that you want
$$P=\begin{bmatrix}1/3&4/5\\2/3&1/5\end{bmatrix}\;.$$
For $n\in\Bbb N$ let $B_n$ be the $2\times 1$ matrix whose top entry is the probability that $X_n$ is $A$, and whose bottom entry is the probability that $X_n$ is $B$. You’re told that $$B_0={1/2\brack1/2}\;:$$ the process is equally likely to start in each of the states, so initially it’s in each state with probability $\frac12$. You also know that $B_{n+1}=PB_n$ for each $n\in\Bbb N$: that’s simply how the Markov process works. Thus,
$$B_1=\begin{bmatrix}1/3&4/5\\2/3&1/5\end{bmatrix}B_0=\begin{bmatrix}1/3&4/5\\2/3&1/5\end{bmatrix}\begin{bmatrix}1/2\\1/2\end{bmatrix}=\begin{bmatrix}17/30\\13/30\end{bmatrix}\;,$$
You can compute $B_2=PB_1$ in exactly the same way; I’ll leave that calculation to you.
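If you want to check your arithmetic, here is a minimal sketch of the iteration $B_{n+1}=PB_n$ in Python, using exact fractions to avoid rounding (the helper `step` is just hand-rolled matrix-vector multiplication for the $2\times2$ case):

```python
from fractions import Fraction as F

# Transition matrix P, stored by rows: P[i][j] = P(next state i | current state j)
P = [[F(1, 3), F(4, 5)],
     [F(2, 3), F(1, 5)]]

def step(P, v):
    """One transition: returns the product P v for a 2x2 matrix and a length-2 vector."""
    return [P[0][0] * v[0] + P[0][1] * v[1],
            P[1][0] * v[0] + P[1][1] * v[1]]

B0 = [F(1, 2), F(1, 2)]   # equally likely to start in A or B
B1 = step(P, B0)
print(B1)                 # [Fraction(17, 30), Fraction(13, 30)]
B2 = step(P, B1)          # distribution of X_2, left symbolic here
```

Note that each column of `P` sums to $1$, which is why every `step` output is again a probability vector.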
Once you have this, you need to find the probability that $X_1=X_2$. That’s the probability that
$$(X_1=A\text{ and }X_2=A)\quad\text{or}\quad(X_1=B\text{ and }X_2=B)\;;$$
can you find that? Remember what the transition probabilities mean.
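If you want to verify your answer afterwards, the event splits over the two cases exactly as above; a short sketch (assuming you have already found $B_1$, and naming the diagonal entries of $P$ as `p_stay_A` and `p_stay_B` for readability):

```python
from fractions import Fraction as F

B1 = [F(17, 30), F(13, 30)]   # distribution of X_1 computed earlier
p_stay_A = F(1, 3)            # P(X_2 = A | X_1 = A)
p_stay_B = F(1, 5)            # P(X_2 = B | X_1 = B)

# P(X_1 = X_2) = P(X_1 = A) P(A -> A) + P(X_1 = B) P(B -> B)
p_equal = B1[0] * p_stay_A + B1[1] * p_stay_B
print(p_equal)
```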
Yes, Markov processes with infinitely many states are indeed considered. Random walks are a common example. The term "Markov chain" is often reserved for the case of a discrete state space. If the state space is finite, it's a "finite Markov chain". See e.g. http://www.statslab.cam.ac.uk/~james/Markov/
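For a concrete feel for an infinite state space, here is a minimal simulation of the simple symmetric random walk on $\Bbb Z$: every integer is a state, and each transition moves $\pm1$ with probability $\frac12$ (the function name and seed are just illustrative choices):

```python
import random

def random_walk(n_steps, seed=0):
    """Simple symmetric random walk on the integers, started at 0."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += rng.choice([-1, 1])  # each step is +1 or -1, equally likely
        path.append(position)
    return path

path = random_walk(10)
# path records the states visited; any integer is reachable given enough steps,
# so no finite transition matrix can describe this chain.
```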