First question. Denote by $({\cal F}_n)$ the filtration generated by the process $(X_n)$. We will prove that the event $A:=\{W_y= 1\}$ does not belong to the $\sigma$-field ${\cal F}_1$; this implies that $W_y$ is not a stopping time. One has $A=\{X_1 \neq y\} \cap \{X_2=y\}$. If $A$ belonged to ${\cal F}_1$, then one would have $\Pr(A \mid {\cal F}_1)={\boldsymbol 1}_A$. But in fact $\Pr(A \mid {\cal F}_1) = p(X_1,y)\,{\boldsymbol 1}_{\{X_1 \neq y\}}$, which is not $\{0,1\}$-valued as soon as $p(X_1,y) \in (0,1)$, a contradiction.
Second question. The process $(X_{W+k})_{k \geq 0}$ is Markovian but does not have the same transition kernel as the process $(X_n)$. Indeed, putting $Y_k=X_{W+k}$, the conditional distribution of $Y_1$ given $Y_0$ is the Dirac distribution at $y$, whereas under the original kernel it would be $p(Y_0,\cdot)$. Hence the strong Markov property of $(X_n)$ does not apply at the random time $W$, which is consistent with $W$ not being a stopping time.
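The failure of the strong Markov property at $W$ can be seen in a quick simulation. The chain below is a hypothetical 3-state example (the question leaves the kernel $p$ unspecified), with $W$ defined as one step before the first visit to $y$; by construction $X_{W+1} = y$ always, even though no row $p(x,\cdot)$ is a point mass:

```python
import random

# Hypothetical 3-state kernel (the original question leaves p unspecified).
P = {0: [(0, 0.5), (1, 0.3), (2, 0.2)],
     1: [(0, 0.2), (1, 0.5), (2, 0.3)],
     2: [(0, 0.4), (1, 0.4), (2, 0.2)]}

def step(x, rng):
    """Draw the next state from the row p(x, .)."""
    u, acc = rng.random(), 0.0
    for s, p in P[x]:
        acc += p
        if u < acc:
            return s
    return P[x][-1][0]

def sample_post_W(y=2, start=0, rng=None):
    """Run the chain until it first hits y at some time n >= 1;
    set W = n - 1 (a random time that peeks one step into the future,
    hence not a stopping time) and return X_{W+1}."""
    rng = rng or random.Random()
    x, traj = start, [start]
    while True:
        x = step(x, rng)
        traj.append(x)
        if x == y:
            break
    W = len(traj) - 2          # X_{W+1} is the first visit to y
    return traj[W + 1]

rng = random.Random(0)
samples = [sample_post_W(rng=rng) for _ in range(1000)]
# The conditional law of X_{W+1} is the Dirac mass at y = 2,
# not the row p(X_W, .) of the original kernel:
print(all(s == 2 for s in samples))  # True
```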
(A) Find the Stationary Distribution
First, let's make some observations about this DTMC. If we draw it out, we can immediately see that it has period 2, and therefore has no limiting distribution. A finite irreducible chain still has a unique stationary distribution even when it is periodic, but because the powers $P^n$ do not converge, we can't use fancy numerical tricks such as computing $P^{10000}$ to find it; we will need to find it the old-fashioned way.
Following the method whuber suggests, let's start with small $k$ and look for patterns. For $k = 1$ the stationary distribution is trivial:
$$\pi = \left[ \frac{1}{2} \quad \frac{1}{2}\right]$$
Moving on to $k = 2$, we can solve $\pi P = \pi$ together with the normalization $\sum_{i} \pi_i = 1$ by row-reducing the augmented system (two of the balance equations from $(P^\top - I)\pi^\top = 0$, the third being redundant, plus the normalization row),
$$\left[\begin{array}{ccc|c}
-1 & \tfrac{1}{2} & 0 & 0\\
1 & -1 & 1 & 0\\
1 & 1 & 1 & 1
\end{array}\right] \implies \pi = \left[\frac{1}{4} \quad \frac{1}{2} \quad \frac{1}{4}\right]$$
Moving on to $k = 3$, solving a similar matrix leads to
$$\pi = \left[\frac{1}{6} \quad \frac{1}{3} \quad \frac{1}{3} \quad \frac{1}{6}\right]$$
and $k = 4$ gives
$$\pi = \left[\frac{1}{8} \quad \frac{1}{4} \quad \frac{1}{4} \quad \frac{1}{4} \quad \frac{1}{8} \right]$$
At this point, a pattern has clearly formed. For the chain on states $0, 1, \dots, k$ the stationary distribution is
$$\pi_i = \begin{cases}
\dfrac{1}{2k}, & \text{for } i = 0 \text{ or } i = k\\[4pt]
\dfrac{1}{k}, & \text{for } 0 < i < k
\end{cases}$$
This can be verified with the detailed balance (time-reversibility) equations, where we must show
$$\pi_i p_{i,j} = \pi_j p_{j,i} \quad \text{for all } i, j.$$
For the interior states both sides equal $\frac{1}{k} \cdot \frac{1}{2}$, so we only check the boundary, showing $\pi_0 p_{0,1} = \pi_1 p_{1,0}$; the other boundary, $\pi_{k-1} p_{k-1,k} = \pi_k p_{k,k-1}$, follows by symmetry.
\begin{align*}
\pi_0 p_{0,1} &= \frac{1}{2k} \cdot 1\\
&= \frac{1}{k} \cdot \frac{1}{2}\\
&= \pi_1 p_{1,0}
\end{align*}
and since detailed balance implies $\pi P = \pi$, this confirms that $\pi$ is indeed the stationary distribution.
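The same detailed balance check can be run numerically: with $\pi$ taken from the pattern above, the flow matrix $F_{i,j} = \pi_i p_{i,j}$ must be symmetric. A minimal sketch for one arbitrary chain size, assuming the reflecting random walk on $\{0,\dots,k\}$:

```python
import numpy as np

k = 5  # any chain size works; the check is identical for all k
# pi from the conjectured pattern: 1/(2k) at the boundary, 1/k inside
pi = np.array([1 / (2 * k)] + [1 / k] * (k - 1) + [1 / (2 * k)])

# reflecting random walk on {0,...,k}
P = np.zeros((k + 1, k + 1))
P[0, 1] = P[k, k - 1] = 1.0
for i in range(1, k):
    P[i, i - 1] = P[i, i + 1] = 0.5

# detailed balance pi_i p_{ij} = pi_j p_{ji}  <=>  the flow matrix is symmetric
F = pi[:, None] * P
print(np.allclose(F, F.T))  # True
```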
(B) Expected Number of Steps to Return to $P_0$
If you haven't read it yet, the key fact is that for a positive recurrent chain the expected return time to a state is the reciprocal of its stationary probability, so $$E[\text{Return to }P_0] = \dfrac{1}{\pi_0} = \dfrac{1}{\frac{1}{2000}} = 2000$$
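A quick sanity check by simulation, assuming the chain is the reflecting random walk on $\{0,\dots,k\}$ and using a small chain ($k = 10$, so $1/\pi_0 = 2k = 20$), since the identity is the same for every $k$:

```python
import random

def mean_return_time(k, n_returns=20000, seed=0):
    """Average number of steps for the reflecting walk on {0,...,k}
    to come back to state 0, starting from 0."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_returns):
        x, steps = 0, 0
        while True:
            if x == 0:
                x = 1                        # reflecting barrier at 0
            elif x == k:
                x = k - 1                    # reflecting barrier at k
            else:
                x += 1 if rng.random() < 0.5 else -1
            steps += 1
            if x == 0:
                break
        total += steps
    return total / n_returns

# theory: E[return to 0] = 1/pi_0 = 2k, i.e. 20 here (and 2000 when 2k = 2000)
print(mean_return_time(10))
```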
Let me know if that made sense or if you have any questions.
Best Answer
This looks like homework so I'm trying to give a hint, not a solution.
For part (b), you definitely want to use the structure of the graph. Without loss of generality suppose you start at $12$ and your first step is to $1$. Can you say what the probability is that you hit $11$ before you hit $12$?