Random walk on a cube; expected time spent on the opposite node before returning

combinatorics, expected value, markov chains, probability, random walk

Problem:

A particle performs a random walk on the eight vertices of a cube, moving from a given vertex to each of the three adjacent vertices with probability $\frac{1}{3}$, independently of the past. For two opposite vertices $x$ and $y$, calculate the expected time $\mathbb{E}\left(\sum_{n=0}^{T_x - 1} \mathbb{1}_{\{X_n = y\}}\right)$ spent at $y$ before returning to $x$. Here $T_x = \inf\{n \in \mathbb{N} : X_n = x\}$.

My attempt

$(X_n)_{n \in \mathbb{N}_0}$ is a Markov chain with $X_0 = x$ and state space $\{x, y, z\}$, where the state $z$ lumps together the other $6$ vertices of the cube. The transition probabilities are:

$$
p(x,z) = p(y,z) = 1, \qquad p(z,x) = p(z,y) = p(z,z) = \tfrac{1}{3}.
$$

My plan is first to calculate $p_k$, the probability of exactly $k$ visits to $y$ before returning to $x$, for $k \in \mathbb{N}$; the required expectation is then
$$
\mathbb{E}\left(\sum_{n=0}^{T_x - 1} \mathbb{1}_{\{X_n = y\}}\right) = \sum_{k=1}^{\infty} k\, p_k .
$$

I calculate $p_k$ as follows. A typical path starting from $x$, visiting $y$ $k$ times, and then going back to $x$ looks like:
$$
x\,zz\,y\,z\,y\,zzz\,y \cdots z\,x
$$

In the path above:

  1. There are $k$ occurrences of $y$.
  2. Between any two consecutive occurrences of $x$ or $y$ there is at least one $z$, since opposite vertices are not adjacent and the walk cannot step directly from $x$ to $y$.
  3. There are $k+1$ "boxes" for $z$'s. In each box there is at least one $z$.

Therefore there are at least $k+1$ $z$'s. Suppose there are $m \in \mathbb{N}_0$ "extra" $z$'s, i.e., a total of $k+m+1$ $z$'s.

Probability of such a path $= \frac{1}{3^{k+m+1}}$, since each of the $k+m+1$ steps out of $z$ has probability $\frac{1}{3}$, while the steps out of $x$ and $y$ have probability $1$.

Number of such paths = number of ways to put $m$ indistinguishable balls into $k+1$ boxes $= \binom{k+m}{m}$ (stars and bars).

Therefore,
$$
p_k = \sum_{m=0}^{\infty} \frac{1}{3^{k+m+1}} \binom{k+m}{m}
$$
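
Here is a quick numerical sanity check of this series in Python (truncating the sum over $m$); it only evaluates the formula above, so it inherits whatever is wrong with the lumped chain:

```python
from math import comb

def p_k(k: int, terms: int = 200) -> float:
    """Truncated value of sum_{m>=0} C(k+m, m) / 3^(k+m+1)."""
    return sum(comb(k + m, m) / 3 ** (k + m + 1) for m in range(terms))

for k in range(1, 6):
    # The negative binomial series sum_{m>=0} C(k+m, m) x^m = (1 - x)^(-(k+1))
    # (here x = 1/3) suggests the closed form 1 / 2^(k+1); the truncated
    # sums agree with it numerically.
    print(k, p_k(k), 1 / 2 ** (k + 1))
```

If the closed form $p_k = 2^{-(k+1)}$ is right, then $\sum_{k \ge 1} k\, p_k = 1$; but, as the edit below explains, the chain itself does not faithfully describe the walk.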

Help

I am unable to simplify the expression for $p_k$ obtained above.

More generally, is there a better way to solve this problem?

I know this question asks something very similar, but I am unable to calculate the $p$ in the accepted answer, as you can see above.

Edit

As pointed out by joriki, the Markov chain constructed above is incorrect.

Best Answer

Your chain is wrong because the $6$ other vertices are not interchangeable: $3$ of them are adjacent to $x$ and the other $3$ are adjacent to $y$, so they do not all have the same transition probabilities to $x$ and $y$.

The answer actually does not depend much on the particular graph: you get $1$ for any pair of states of an irreducible chain that are visited equally often in the long run, which is the case for any two vertices here because the cube is a regular graph.

Let $p$ be the probability that the walk started at $x$ reaches $y$ before returning to $x$. With probability $p$ it reaches $y$ at all; once at $y$, each excursion from $y$ ends at $x$ before returning to $y$ with the same probability $p$ (by the symmetry exchanging $x$ and $y$), so the number of visits to $y$ is geometric with mean $\frac1p$. Thus the expected number of visits to $y$ before returning to $x$ is $p\cdot\frac1p=1$.
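
For a numerical check of this, here is a minimal Monte Carlo sketch, with the vertices encoded as bit-vectors in $\{0,1\}^3$, $x=(0,0,0)$, the opposite vertex $y=(1,1,1)$, and each step flipping one coordinate chosen uniformly at random:

```python
import random

def visits_to_y_before_return(rng: random.Random) -> int:
    """Run one excursion from x = (0,0,0) until the first return to x,
    counting how often the opposite vertex y = (1,1,1) is visited."""
    x, y = (0, 0, 0), (1, 1, 1)
    state = x
    visits = 0
    while True:
        i = rng.randrange(3)  # choose one of the 3 adjacent vertices
        state = tuple(b ^ (j == i) for j, b in enumerate(state))
        if state == y:
            visits += 1
        if state == x:        # first return to x ends the excursion
            return visits

rng = random.Random(0)
n = 100_000
print(sum(visits_to_y_before_return(rng) for _ in range(n)) / n)  # should be close to 1
```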

You can also see this by looking at the sequence of states in the long run. Since the two corners are visited equally often (the stationary distribution is uniform on a regular graph), between any two consecutive visits to one of them there must on average be exactly $1$ visit to the other.

The second argument works for any pair of states with equal stationary probabilities, not necessarily related by a symmetry, whereas in the first argument it is not immediately obvious that the probability $p$ is the same in the two directions unless, as in this case, the two vertices are related by a symmetry of the graph.
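
To make the frequency argument quantitative: for an irreducible chain the expected number of visits to $y$ between consecutive visits to $x$ is $\pi(y)/\pi(x)$, where $\pi$ is the stationary distribution, and for the walk on the cube $\pi$ is uniform because the graph is $3$-regular. A short check of this (a sketch assuming NumPy, with vertices indexed $0$ to $7$ by their bit patterns):

```python
import numpy as np

# Transition matrix of the random walk on the cube: vertices 0..7 as
# bit patterns, adjacent iff they differ in exactly one bit.
P = np.zeros((8, 8))
for v in range(8):
    for i in range(3):
        P[v, v ^ (1 << i)] = 1 / 3

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

x, y = 0, 7           # opposite corners 000 and 111
print(pi)             # uniform: every entry 1/8
print(pi[y] / pi[x])  # expected visits to y per excursion from x: 1.0
```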