Double-check the definition of the period. First off, a period always refers to a particular state $i$: it is the GCD of all times $k$ at which a return to $i$ is possible. So finding the shortest return time alone is not sufficient.
Next, a state $i$ is aperiodic if its period is 1. A Markov Chain is aperiodic if all states have period 1.
In your example, it's possible to start at $0$ and return to $0$ in either $2$ or $3$ steps; since $\gcd(2,3)=1$, state $0$ has period 1. Similarly, states $1$ and $2$ also have period 1, so the Markov chain is aperiodic.
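This definition is easy to check numerically. Below is a minimal sketch, assuming a hypothetical three-state chain in which state $0$ can return in $2$ or $3$ steps (the transition probabilities are my own illustrative choices, not from the question); the period comes out as the gcd of all return times found.

```python
from math import gcd

import numpy as np

# Hypothetical chain: 0 -> 1; 1 -> 0 or 2 (each w.p. 1/2); 2 -> 0.
# State 0 returns in 2 steps (0-1-0) or 3 steps (0-1-2-0).
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])

def period(P, i, max_n=50):
    """gcd of all n <= max_n with P^n[i, i] > 0."""
    Pn = np.eye(len(P))
    d = 0
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)  # gcd(0, n) == n on the first hit
    return d

print(period(P, 0))  # gcd(2, 3, ...) = 1, so state 0 is aperiodic
```

Since every state here has period 1, the whole chain is aperiodic.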
Yes, it is appropriate to consider the state transition matrix
$$\begin{bmatrix}
\frac{2}{3} & \frac{1}{3} & 0 & 0 & \dots \\
\frac{2}{3} & 0 & \frac{1}{3} & 0 & \dots \\
\frac{2}{3} & 0 & 0 & \frac{1}{3} & \dots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{bmatrix}
$$
since the original chain enters this reduced chain within two steps. So we can forget about the first two states, $1$ and $2$: $P_1$ and $P_2$ will be $0$ in the long run because the chain will never get (back) to these states.
With the transition matrix above, it is easy to calculate the stationary probabilities. Collect them in a row vector $\pi=[P_3\ P_4 \ P_5\cdots]$; the stationarity equation $\pi P=\pi$ then reads
$$[P_3\ P_4 \ P_5\cdots]\begin{bmatrix}
\frac{2}{3} & \frac{1}{3} & 0 & 0 & \dots \\
\frac{2}{3} & 0 & \frac{1}{3} & 0 & \dots \\
\frac{2}{3} & 0 & 0 & \frac{1}{3} & \dots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{bmatrix}=[P_3\ P_4 \ P_5\cdots].$$
Multiplying the row vector by the first column, and using $P_3+P_4+\cdots=1$, we get
$$\frac23(P_3+P_4+\cdots)=\frac23=P_3.$$
Multiplying by the second column then gives
$$P_3\frac13=\frac29=P_4$$
and so on.
So we have for the row vector of the stationary probabilities
$$\left[0 \ 0\ \frac23\ \ \frac29\ \ \frac2{27}\ \cdots\right].$$
Here $P_1=P_2=0$ were included as the first two entries.
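As a sanity check, the geometric pattern $P_{k+2}=2/3^{k}$ can be recovered numerically from a finite truncation of the infinite matrix. This is only a sketch: the truncation size $N$ and the tail self-loop are my own choices, not part of the original problem.

```python
import numpy as np

N = 30                   # truncation size (assumption for this sketch)
P = np.zeros((N, N))
P[:, 0] = 2 / 3          # every state jumps back to the first state w.p. 2/3
for k in range(N - 1):
    P[k, k + 1] = 1 / 3  # otherwise move one state to the right
P[N - 1, N - 1] = 1 / 3  # fold the cut-off tail into a self-loop

# Power iteration: pi <- pi P converges since the chain is aperiodic.
pi = np.ones(N) / N
for _ in range(200):
    pi = pi @ P
print(pi[:3])  # approximately [2/3, 2/9, 2/27]
```

The truncation only distorts the far tail of the vector; the leading entries match $2/3,\ 2/9,\ 2/27,\dots$ essentially exactly.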
If I understood the question correctly, then
$$\lim_{n\to\infty} p^n(4,7)=\frac2{3^5}=\pi(7)$$
because
$$\lim_{n\to\infty} \begin{bmatrix}
0&1&0&0&0&\cdots\\
0&0&1&0&0&\cdots\\
0&0&\frac{2}{3} & \frac{1}{3} & 0 & \dots \\
0&0&\frac{2}{3} & 0 & \frac{1}{3} & \dots \\
0&0&\frac{2}{3} & 0 & 0 & \dots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{bmatrix}^n
=\begin{bmatrix}
0&0&\frac23&\frac2{3^2}&\frac2{3^3}&\frac2{3^4}&\frac2{3^5}\cdots\\
0&0&\frac23&\frac2{3^2}&\frac2{3^3}&\frac2{3^4}&\frac2{3^5}\cdots\\
0&0&\frac23&\frac2{3^2}&\frac2{3^3}&\frac2{3^4}&\frac2{3^5}\cdots\\
0&0&\frac23&\frac2{3^2}&\frac2{3^3}&\frac2{3^4}&\boxed{\color{red}{\frac2{3^5}}}\cdots\\
\vdots & \vdots & \vdots & \vdots & \vdots& \vdots & \ddots
\end{bmatrix}.$$
Best Answer
Some hints for how to proceed:
(a) I don't think induction will be all that necessary or helpful here. Fix a state $x$; the $d$ classes will correspond to the periodicity of the original chain. That is, one class will be $\{y : p^{nd}(x,y) > 0 \text{ for some } n\}$, another will be $\{y : p^{nd+1}(x,y) > 0 \text{ for some } n\}$, then $\{y : p^{nd+2}(x,y) > 0 \text{ for some } n\}$, etc. Your task is now to prove that this does indeed give an equivalence relation and that these classes are actually distinct.
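These classes are easy to visualize on a toy example. The sketch below uses a hypothetical chain with period $d=3$ (my own choice) and sorts states by the residue $n \bmod d$ of the times $n$ at which they are reachable from $x$:

```python
import numpy as np

# Hypothetical period-3 chain: 0 -> {1 or 3} -> {2 or 4} -> 0.
P = np.array([[0.0, 0.5, 0.0, 0.5, 0.0],
              [0.0, 0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0, 0.0, 0.0]])
d = 3
x = 0

# Class of y = residue n mod d of any n with p^n(x, y) > 0.
classes = {r: set() for r in range(d)}
Pn = np.eye(len(P))
for n in range(1, 20):
    Pn = Pn @ P
    for y in range(len(P)):
        if Pn[x, y] > 1e-12:
            # when the period is d, the same y always lands
            # in the same residue class
            classes[n % d].add(y)
print(classes)  # {0: {0}, 1: {1, 3}, 2: {2, 4}}
```

The key fact behind well-definedness is exactly what (a) asks you to prove: if $p^{m}(x,y)>0$ and $p^{n}(x,y)>0$, then $m \equiv n \pmod d$.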
(b) This should come straight from the definition of periodicity (or aperiodicity) of a Markov Chain.
(c) Use the Basic Theorem for Markov Chains on the chain $\{Y_n\}$; then, use the fact that one "step" in the $\{Y_n\}$ chain is equivalent to $d$ steps on the $\{X_n\}$ chain.
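For intuition on (c), here is a hedged sketch with a hypothetical deterministic 3-cycle (again my own example): the chain $\{Y_n\}$ with $Y_n = X_{nd}$ has transition matrix $P^d$, and $P^d$ keeps each cyclic class invariant.

```python
import numpy as np

# Hypothetical period-3 cycle: 0 -> 1 -> 2 -> 0, so d = 3.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
d = 3

# Y_n = X_{nd} is itself a Markov chain with transition matrix P^d.
PY = np.linalg.matrix_power(P, d)
print(PY)  # identity matrix: each cyclic class maps into itself
```

Restricted to a single cyclic class, $P^d$ is aperiodic, which is what lets you apply the Basic Theorem to $\{Y_n\}$.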