For (c): Each corner of the hypercube can be labeled with a $k$-bit number.
The mouse starts at $00\dots0$, the cat at $11\dots1$. They start with all $k$ bits different.
At each step, they both change one bit. If they start a step with $d$ bits different, then after that step they have either $d-2$, $d$, or $d+2$ bits different.
Work out the probabilities of each of these transitions. The process then becomes a one-dimensional random walk on the possible distances, with those transition probabilities between points.
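As a sanity check, the transition probabilities can be worked out by enumerating the $k^2$ equally likely pairs of bit flips. Here is a sketch, assuming (as in the problem) that the mouse and the cat each flip one uniformly random bit independently; the function name is just for illustration:

```python
from fractions import Fraction

def transition_probs(k, d):
    """Exact probabilities that the Hamming distance d becomes
    d-2, d, or d+2 after the mouse and the cat each flip one
    uniformly random bit (k^2 equally likely flip pairs)."""
    total = Fraction(1, k * k)
    down = d * (d - 1) * total          # both flip distinct differing bits
    up = (k - d) * (k - d - 1) * total  # both flip distinct matching bits
    stay = 1 - down - up                # same bit, or the two flips cancel
    return down, stay, up

# e.g. k = 3 bits, starting fully apart (d = 3):
print(transition_probs(3, 3))  # (Fraction(2, 3), Fraction(1, 3), Fraction(0, 1))
```

Note that $d$ always changes by $0$ or $\pm 2$, so its parity is preserved, matching the observation above.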
Finding an exact expression for $d_n$ will almost certainly lead to a messy formula (probably the one you discovered), but knowing the exact value of $d_n$ is not necessary to find the limit.
First, there is a way to simplify the random walk. For our purposes, all vertices of $G$ at distance $d$ from the root are functionally identical: they all have a $\frac1k$ probability of going up to a vertex at distance $d-1$ from the root, and a $1-\frac1k$ probability of going down to a vertex at distance $d+1$ from the root. So instead of considering a random walk on $G$, we can consider a biased random walk on the states $\{0,1,2,\dots\}$ where each step is $+1$ with a probability of $1-\frac1k$ and $-1$ with a probability of $\frac1k$. (Except at $0$, which goes to $1$ with probability $1$.)
It's easier to solve for the behavior of a related Markov chain: the same random walk, but on the integers $\{\dots,-2,-1,0,1,2,\dots\}$, with no awkward boundary condition. Here, the expected state after $n$ steps is $\frac nk \cdot (-1) + n \cdot(1-\frac1k) \cdot (+1) = n(1 - \frac2k)$. To go from this to the statement $\lim_{n \to \infty} \frac{d_n}{n} = 1 -\frac2k$, we need some concentration inequality. Chebyshev's inequality should be sufficient, but a Chernoff-type bound would give tighter concentration if we wanted it.
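The drift computation is easy to check empirically. The sketch below (function name illustrative) averages the unrestricted walk on the integers over many trials and compares the empirical mean per step against $1 - \frac2k$:

```python
import random

def free_walk(k, n, rng):
    """Biased walk on the integers: +1 w.p. 1-1/k, -1 w.p. 1/k."""
    pos = 0
    for _ in range(n):
        pos += 1 if rng.random() > 1 / k else -1
    return pos

rng = random.Random(0)
k, n, trials = 5, 10_000, 200
mean = sum(free_walk(k, n, rng) for _ in range(trials)) / trials
print(mean / n, 1 - 2 / k)  # the two values should be close
```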
The random walk we got for analyzing the tree is slightly different: when we're at $0$, we don't have a $\frac1k$ chance of moving to $-1$. However, we can argue that the same analysis applies: in both random walks, you only visit $0$ finitely many times, with probability $1$ (and in the random walk on the integers, with probability $1$ we eventually stay in the positive states forever). So the limit should be $1 - \frac2k$ in your case as well.
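To see that the reflecting boundary at $0$ doesn't change the limit, one can simulate the actual walk on $\{0,1,2,\dots\}$ and check that $d_n/n$ approaches $1-\frac2k$. A sketch, with an illustrative function name:

```python
import random

def reflected_walk(k, n, rng):
    """Walk on {0, 1, 2, ...}: from 0 always step to 1; otherwise
    +1 w.p. 1-1/k and -1 w.p. 1/k."""
    pos = 0
    for _ in range(n):
        pos += 1 if pos == 0 or rng.random() > 1 / k else -1
    return pos

rng = random.Random(1)
k, n = 5, 100_000
ratio = reflected_walk(k, n, rng) / n
print(ratio)  # should be close to 1 - 2/5 = 0.6
```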
Best Answer
One way to get $$ \lim_{n \rightarrow \infty}P_w(X_n \in W | X_{n-1} \in W) = 1 $$ is to take a graph whose random walk Markov chain is transient.
For instance, let $G$ be an infinite binary tree, and let $W$ be one of the branches from the root. Then with probability $1$ the random walk only returns to the root a finite number of times, so in the limit as $n \to \infty$ the probability of the random walk leaving $W$ is $0$.
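The transience claim is easy to probe numerically. Since the branch can only change by passing through the root, it suffices to track the walk's depth: from the root it always steps to depth $1$, and at depth $d \ge 1$ it steps to the parent with probability $\frac13$ and to a child with probability $\frac23$. A sketch (function name illustrative) counting root visits over a long run:

```python
import random

def returns_to_root(n, rng):
    """Depth process of the simple random walk on the infinite binary
    tree: from the root, step to depth 1; at depth d >= 1, step to the
    parent w.p. 1/3, to a child w.p. 2/3.  Count visits to the root."""
    depth, visits = 0, 0
    for _ in range(n):
        depth = depth + 1 if depth == 0 or rng.random() < 2 / 3 else depth - 1
        if depth == 0:
            visits += 1
    return visits

rng = random.Random(2)
print(returns_to_root(10**5, rng))  # a small number: the walk is transient
```

Each excursion from the root returns with probability $\frac12$, so the number of returns is geometric with mean $1$, almost surely finite.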