WLOG, assume $k > 0$.
(By induction)
When $k = 1$, the expectation is $\mu_0(1) + \frac{1}{2}\mu_0(1) + \frac{1}{2^2}\mu_0(1) + \cdots = 2$.
This means that whenever the walk visits $1$, it has probability $1/2$ of doing the cycle again and probability $1/2$ of hitting $0$ and ending.
Suppose that for $k-1$ the expectation is $2(k-1)$.
Since the walk must hit $1$ before it can hit $0$, for $k$ the expectation is $E_{k-1} + \mu_0(k) + \frac{1}{2}\mu_0(k) + \frac{1}{2^2}\mu_0(k) + \cdots$.
That is, after the walk visits $1$, it has probability $1/2$ of going up and probability $1/2$ of hitting $0$ and ending.
Since $\mu_0(k) = 1$, the expectation is $E_{k-1} + 2 = 2(k-1) + 2 = 2k$.
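As a quick numerical check of the induction (a sketch, assuming $\mu_0(k) = 1$ as above, so each level of the induction contributes the geometric series $\sum_{n\ge0} 2^{-n} = 2$):

```python
# Check the recursion E_k = E_{k-1} + sum_{n>=0} mu_0(k) / 2^n,
# with mu_0(k) = 1, against the closed form E_k = 2k.
def expectation(k, terms=60):
    # geometric series 1 + 1/2 + 1/4 + ..., truncated at `terms` terms
    geo = sum(0.5 ** n for n in range(terms))
    E = 0.0
    for _ in range(k):
        E += geo  # each induction step adds (approximately) 2
    return E

for k in range(1, 6):
    assert abs(expectation(k) - 2 * k) < 1e-9
```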
The one-step analysis at the end of the question is the wrong way around. You need a fixed target, and the index on $E$ should index the states from which you're trying to reach that target.
But in the present case there are less cumbersome ways to get the expectation values you want.
For the first question, the stationary distribution is constant by symmetry, so $\mu_1=4$ is immediate. (Note that this counts staying at $1$ as a "return".)
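A simulation sketch of this return time (assuming, as in the one-step analysis further down, a self-loop probability of $\frac14$ and probability $\frac38$ to each neighbour on the cycle $1$–$2$–$4$–$3$):

```python
import random

# Assumed chain: self-loop with prob 1/4, prob 3/8 to each
# neighbour on the cycle 1-2-4-3-1.
NEIGHBOURS = {1: (2, 3), 2: (1, 4), 3: (1, 4), 4: (2, 3)}

def step(state, rng):
    u = rng.random()
    if u < 0.25:
        return state              # self-loop
    a, b = NEIGHBOURS[state]
    return a if u < 0.625 else b  # prob 3/8 each way

def mean_return_time(start=1, trials=200_000, seed=0):
    """Average number of steps to come back to `start`
    (a self-loop counts as a return after one step)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, steps = step(start, rng), 1
        while state != start:
            state, steps = step(state, rng), steps + 1
        total += steps
    return total / trials

print(mean_return_time())  # close to 1/pi_1 = 4
```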
For the second question, get rid of the self-loops and scale the number of steps by $\frac1{1-\frac14}=\frac43$ to compensate. Then you have a simple walk on the given graph, with equal probabilities $\frac12$ to go either way. Combine the steps into pairs. Each pair has probability $\frac12$ to return to $1$ and $\frac12$ to get you to $4$. Thus the expected number of pairs is the expected number of trials until the first success in a Bernoulli trial with success probability $\frac12$. So the expected number of steps from $1$ to $4$ is
$$\frac43\cdot2\cdot\frac1{\frac12}=\frac{16}3\;.$$
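A quick Monte Carlo check of $\frac{16}3$ (a sketch; the transition probabilities, self-loop $\frac14$ and $\frac38$ to each neighbour, are read off the one-step equations given below):

```python
import random

# Assumed chain: self-loop with prob 1/4, prob 3/8 to each
# neighbour on the cycle 1-2-4-3-1; estimate the mean hitting
# time of state 4 starting from state 1.
NEIGHBOURS = {1: (2, 3), 2: (1, 4), 3: (1, 4), 4: (2, 3)}

def mean_hit_time(trials=200_000, seed=1):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, steps = 1, 0
        while state != 4:
            u = rng.random()
            if u >= 0.25:  # leave the self-loop with probability 3/4
                a, b = NEIGHBOURS[state]
                state = a if u < 0.625 else b
            steps += 1
        total += steps
    return total / trials

print(mean_hit_time())  # close to 16/3 = 5.33...
```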
Here's the one-step analysis:
Let $E_n$ be the expected number of steps from state $n$ to state $4$. We're looking for $E_1$. The $E_i$ satisfy
\begin{align}
E_1&=1+\frac14E_1+\frac38E_2+\frac38E_3\;,\\
E_2&=1+\frac14E_2+\frac38E_1+\frac38E_4\;,\\
E_3&=1+\frac14E_3+\frac38E_1+\frac38E_4\;,\\
E_4&=0\;.
\end{align}
(I wrote it out in full since you wanted the general method; in the present case, we could use the fact that $E_2=E_3$ by symmetry to save a variable.) Solving this system of equations yields $E_1=\frac{16}3$, as derived above.
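For completeness, here is a sketch that solves the system exactly (with $E_4 = 0$ substituted in, and using the symmetric equation for $E_3$), via plain Gaussian elimination over Python's `fractions`:

```python
from fractions import Fraction as F

# One-step equations with E_4 = 0 substituted, as A x = b
# for x = (E_1, E_2, E_3).
A = [[F(3, 4), F(-3, 8), F(-3, 8)],
     [F(-3, 8), F(3, 4), F(0)],
     [F(-3, 8), F(0), F(3, 4)]]
b = [F(1), F(1), F(1)]

# Forward elimination over exact rationals...
n = len(A)
for i in range(n):
    for j in range(i + 1, n):
        f = A[j][i] / A[i][i]
        for k in range(i, n):
            A[j][k] -= f * A[i][k]
        b[j] -= f * b[i]

# ...then back-substitution.
x = [F(0)] * n
for i in reversed(range(n)):
    x[i] = (b[i] - sum(A[i][k] * x[k] for k in range(i + 1, n))) / A[i][i]

print(x[0])  # 16/3
```

Exact rationals avoid any floating-point doubt about whether the answer really is $\frac{16}3$ and not something nearby.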
Let $m_i$ be the mean number of visits to state $2$ before returning to state $0$, starting from state $i$. We want to determine $m_{0}$. By using a one-step analysis, these $m_{i}$ satisfy
\begin{align}
m_{0} &= \frac{2}{3} m_{1} + \frac{1}{3} m_{2}, \\
m_{1} &= \frac{1}{2} \cdot 0 + \frac{1}{2} m_{2}, \\
m_{2} &= 1 + \frac{1}{2} \cdot 0 + \frac{1}{2} m_{1}.
\end{align}
The interpretation is as follows. Since this is a Markov chain, if I start in state $0$, then with probability $\frac{2}{3}$ I transition to state $1$ and can restart the process from there, and with probability $\frac{1}{3}$ I transition to state $2$ and can restart from there. The $\frac{1}{2} \cdot 0$ terms indicate that with probability $\frac{1}{2}$ we have returned to state $0$ and the excursion ends. The $1$ term in the $m_2$ equation counts the one time unit the process has just spent in state $2$.
You can probably solve this system of equations.
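For reference, a quick sketch solving it exactly with Python's `fractions` module (the substitutions mirror the equations above):

```python
from fractions import Fraction as F

# Substitute m_1 = m_2 / 2 into m_2 = 1 + m_1 / 2:
# m_2 = 1 + m_2 / 4, so m_2 = 1 / (1 - 1/4).
m2 = F(1) / (1 - F(1, 4))        # m_2 = 4/3
m1 = m2 / 2                      # m_1 = 2/3
m0 = F(2, 3) * m1 + F(1, 3) * m2 # m_0 = 2/3 m_1 + 1/3 m_2

print(m0)  # 8/9
```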