The one-step analysis at the end of the question is the wrong way around. You need a fixed target, and the index on $E$ should index the states from which you're trying to reach that target.
But in the present case there are less cumbersome ways to get the expectation values you want.
For the first question, the stationary distribution is constant by symmetry, so $\mu_1=4$ is immediate. (Note that this counts staying at $1$ as a "return".)
For the second question, get rid of the self-loops and scale the number of steps by $\frac1{1-\frac14}=\frac43$ to compensate. Then you have a simple walk on the given graph, with equal probabilities $\frac12$ to go either way. Combine the steps into pairs. Each pair has probability $\frac12$ to return to $1$ and $\frac12$ to get you to $4$. Thus the expected number of pairs is the expected number of trials until the first success in a Bernoulli trial with success probability $\frac12$. So the expected number of steps from $1$ to $4$ is
$$\frac43\cdot2\cdot\frac1{\frac12}=\frac{16}3\;.$$
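As a sanity check, here is a short Monte Carlo sketch of this hitting time. The cycle layout $1\text{–}2\text{–}4\text{–}3\text{–}1$ is an assumption on my part; it matches the $2\leftrightarrow3$ symmetry used above.

```python
import random

# Four states on a cycle 1-2-4-3-1: stay put with probability 1/4,
# otherwise step to one of the two neighbours with probability 3/8 each.
neighbours = {1: (2, 3), 2: (1, 4), 3: (1, 4), 4: (2, 3)}

def hitting_time(start=1, target=4):
    state, steps = start, 0
    while state != target:
        u = random.random()
        if u < 0.25:
            pass                             # self-loop: stay put
        elif u < 0.625:
            state = neighbours[state][0]     # probability 3/8
        else:
            state = neighbours[state][1]     # probability 3/8
        steps += 1
    return steps

random.seed(0)
n = 200_000
est = sum(hitting_time() for _ in range(n)) / n
print(est)   # fluctuates around 16/3 ≈ 5.333
```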
Here's the one-step analysis:
Let $E_n$ be the expected number of steps from state $n$ to state $4$. We're looking for $E_1$. The $E_i$ satisfy
\begin{align}
E_1&=1+\frac14E_1+\frac38E_2+\frac38E_3\;,\\
E_2&=1+\frac14E_2+\frac38E_1+\frac38E_4\;,\\
E_3&=1+\frac14E_3+\frac38E_1+\frac38E_4\;,\\
E_4&=0\;.
\end{align}
(I wrote it out in full since you wanted the general method; in the present case, we could use the fact that $E_2=E_3$ by symmetry to save a variable.) Solving this system of equations yields $E_1=\frac{16}3$, as derived above.
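The system can also be solved numerically. A minimal sketch with NumPy, writing the transient part of the chain (states $1,2,3$) as $Q$ and solving $(I-Q)E=\mathbf 1$:

```python
import numpy as np

# Transient part Q of the transition matrix (states 1, 2, 3; state 4
# is the absorbing target).  Entry (i, j) = probability of moving i -> j.
Q = np.array([[1/4, 3/8, 3/8],
              [3/8, 1/4, 0  ],
              [3/8, 0,   1/4]])

# E = 1 + Q E   <=>   (I - Q) E = 1
E = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print(E)   # [5.3333..., 4, 4], i.e. E_1 = 16/3 and E_2 = E_3 = 4
```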
Ok, so from how I learned this, the transition matrix $P$ has columns summing to $1$, and the distribution evolves as $x^{k+1} = Px^k$, where $x^k$ is the state distribution after $k$ steps. This means the $P$ that I'm using is the transpose of your $P$, and I'll denote it $P_0$.
To calculate equilibrium solutions, find an eigenvector of $P_0$ with eigenvalue $1$, i.e. a nonzero vector in $\operatorname{Null}(P_0 - I)$. That means solving the linear system $$-0.8x_1 + 0.5x_2 + 0.5x_3 = 0$$ $$0.5x_1 -0.75x_2 + 0.25x_3 = 0$$ $$0.3x_1 + 0.25x_2 -0.75x_3 = 0$$ and then scaling the solution so that its entries sum to $1$.
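A quick numerical version of this, assuming the column-stochastic $P_0$ that produces the system above:

```python
import numpy as np

# Column-stochastic transition matrix P0, reconstructed from the linear
# system above (P0 - I yields exactly those three equations).
P0 = np.array([[0.2 , 0.5 , 0.5 ],
               [0.5 , 0.25, 0.25],
               [0.3 , 0.25, 0.25]])

w, v = np.linalg.eig(P0)
i = int(np.argmin(np.abs(w - 1)))   # eigenvalue closest to 1
pi = np.real(v[:, i])
pi /= pi.sum()                      # scale entries to sum to 1
print(pi)   # ≈ [0.3846, 0.3462, 0.2692], i.e. (10/26, 9/26, 7/26)
```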
Find $x^6 = P_0^6x^0$. Here, $x^0 = \begin{bmatrix} 0 \\ 0 \\ 1\end{bmatrix}$ since the test 5 iterations ago was easy. The first entry of $x^6$ should be the probability that the upcoming exam will be hard.
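The six-step computation, with the same (assumed) $P_0$:

```python
import numpy as np

P0 = np.array([[0.2 , 0.5 , 0.5 ],
               [0.5 , 0.25, 0.25],
               [0.3 , 0.25, 0.25]])

x0 = np.array([0.0, 0.0, 1.0])          # the test 5 iterations ago was easy
x6 = np.linalg.matrix_power(P0, 6) @ x0
print(x6[0])   # probability that the upcoming exam is hard
```

By this point the chain is already very close to its equilibrium, so $x^6$ is nearly the stationary vector.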
The Markov chain converges to a unique equilibrium if there is only one recurrent class and it is aperiodic. In this case the directed graph corresponding to the Markov chain (figure omitted) is bipartite: every edge joins $\{1,2\}$ to $\{3,4\}$.
The Markov chain is irreducible (i.e. every state communicates with every other state), but it is periodic with period $2$, because from states $1$ and $2$ you can only go to $3$ and $4$ and vice versa. Therefore it does not converge to an equilibrium. In fact, the even powers of the matrix converge to $$ \frac{1}{27} \pmatrix{13 & 14 & 0 & 0\cr 13 & 14 & 0 & 0\cr 0 & 0 & 20 & 7\cr 0 & 0 & 20 & 7\cr}$$ and the odd powers converge to $$ \frac{1}{27} \pmatrix{0 & 0 & 20 & 7\cr 0 & 0 & 20 & 7\cr 13 & 14 & 0 & 0\cr 13 & 14 & 0 & 0\cr}$$
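Since the answer's actual transition matrix isn't reproduced here, the non-convergence can be illustrated with a hypothetical period-$2$ chain that has the same block structure (all entries below are made up for illustration):

```python
import numpy as np

# Hypothetical row-stochastic matrix: from {1,2} you can only move to
# {3,4} and vice versa, so the chain has period 2.
P = np.array([[0. , 0. , 0.6, 0.4],
              [0. , 0. , 0.2, 0.8],
              [0.5, 0.5, 0. , 0. ],
              [0.3, 0.7, 0. , 0. ]])

even = np.linalg.matrix_power(P, 1000)
odd  = np.linalg.matrix_power(P, 1001)
# Even and odd powers each converge, but to different matrices with
# complementary zero blocks, so P^n itself has no limit.
print(np.round(even, 3))
print(np.round(odd, 3))
```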