To understand what is required to prove in part (c), it is first important to use the same notation as in the body of the question. Let $\pi'=(\pi'(0,0),\,\pi'(0,1),\, \pi'(1,0),\,\pi'(1,1))$ be a stationary distribution for $Y_n$.
(b) You solved $\pi'=P'\pi'$, but this is not the system of equations that determines a stationary distribution: for any stochastic matrix, this system is satisfied by every vector with equal coordinates, since the rows of a stochastic matrix sum to one.
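A quick numerical sketch of this point (the matrix below is an arbitrary stochastic matrix chosen for illustration, not the one from the question): any row-stochastic matrix fixes every constant vector on the right, so $\pi'=P'\pi'$ cannot single out the stationary distribution, while the left equation $\pi = \pi P$ does.

```python
import numpy as np

# Arbitrary stochastic matrix (rows sum to 1), just for illustration.
P = np.array([[0.2, 0.8],
              [0.6, 0.4]])

# A constant vector is always a RIGHT eigenvector for eigenvalue 1,
# because each row of P sums to one.
v = np.array([0.5, 0.5])
print(np.allclose(P @ v, v))      # True for every stochastic P

# The stationary distribution is a LEFT eigenvector: pi = pi P.
# The constant vector generally fails this test.
print(np.allclose(v @ P, v))      # False here: v @ P = [0.4, 0.6]
```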
If $\pi'$ is the initial distribution, the distribution at the next step is $\pi'\cdot P'$, so the equality $\pi'=\pi' P'$ is what defines a stationary distribution. It is
$$
(\pi'(0,0),\,\pi'(0,1),\, \pi'(1,0),\,\pi'(1,1)) = \left(0.1,\,0.3,\,0.3,\,0.3\right).
$$
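As a sanity check, one can verify this vector numerically. The sketch below builds $P'$ from the transition matrix $P$ of the original chain (whose entries $P_{0,0}=\frac14$, $P_{0,1}=\frac34$, $P_{1,0}=P_{1,1}=\frac12$ appear in the question), using the rule that $Y_n$ moves from $(i,j)$ to $(j,k)$ with probability $P_{j,k}$:

```python
import numpy as np

# Transition matrix of the original chain X_n (from the question).
P = np.array([[0.25, 0.75],
              [0.50, 0.50]])

# States of Y_n = (X_n, X_{n+1}) in the order (0,0), (0,1), (1,0), (1,1).
# Y_n moves from (i,j) to (j,k) with probability P[j,k]; all other
# transitions (where the second coordinate does not match) are impossible.
states = [(0, 0), (0, 1), (1, 0), (1, 1)]
Pp = np.zeros((4, 4))
for a, (i, j) in enumerate(states):
    for b, (j2, k) in enumerate(states):
        if j2 == j:
            Pp[a, b] = P[j, k]

pi_p = np.array([0.1, 0.3, 0.3, 0.3])
print(np.allclose(pi_p @ Pp, pi_p))   # True: pi' is stationary for Y_n
```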
Return to (c). First you need to find the stationary distribution $\pi=(\pi(0),\pi(1))$ of the original chain $X_n$ from $\pi=\pi P$, and then check whether $\pi'(i,j)=\pi(i) P_{i,j}$, that is,
$$
0.1=\pi'(0,0) = \pi(0)\cdot P_{0,0}=\pi(0)\cdot\frac14,
$$
$$
0.3=\pi'(0,1) = \pi(0)\cdot P_{0,1}=\pi(0)\cdot \frac34,
$$
$$
0.3=\pi'(1,0) = \pi(1)\cdot P_{1,0}=\pi(1)\cdot \frac12
$$
and
$$
0.3=\pi'(1,1) = \pi(1)\cdot P_{1,1}=\pi(1)\cdot \frac12.
$$
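The steps above can be carried out numerically: solve $\pi = \pi P$ together with $\pi(0)+\pi(1)=1$, then compare $\pi(i)P_{i,j}$ with the four values of $\pi'$:

```python
import numpy as np

P = np.array([[0.25, 0.75],
              [0.50, 0.50]])

# Solve pi = pi P with the normalization pi(0) + pi(1) = 1:
# the first equation of pi (P - I) = 0 plus the normalization row.
A = np.vstack([(P - np.eye(2)).T[0], np.ones(2)])
pi = np.linalg.solve(A, np.array([0.0, 1.0]))
print(pi)                              # approximately [0.4, 0.6]

# Check pi'(i,j) = pi(i) * P[i,j] against (0.1, 0.3, 0.3, 0.3):
pairs = pi[:, None] * P                # entry (i,j) is pi(i) * P[i,j]
print(np.allclose(pairs.ravel(), [0.1, 0.3, 0.3, 0.3]))  # True
```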
There are then two possible readings of what generalization is needed in part (c).
Either you generalize to an arbitrary transition matrix $P=\pmatrix{a & 1-a\\ 1-b & b}$ on the state space $S=\{0,1\}$ and repeat all the steps from the beginning: write down $P'$, find its stationary distribution $\pi'$, find the stationary distribution $\pi$, and check whether $\pi'(i,j)=\pi(i) P_{i,j}$ for all $i,j\in\{0,1\}$;
or (which seems more likely to me) you suppose that $S$ is an arbitrary finite state space, $P$ is a transition matrix for $X_n$ that admits a stationary distribution $\pi$, and $Y_n=(X_n,X_{n+1})$ is a Markov chain on $S\times S$; then the stationary distribution $\pi'$ satisfies $\pi'(i,j)=\pi(i) P_{i,j}$ for all $i,j\in S$.
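The general claim can be spot-checked on a randomly generated chain (a sketch, not a proof: the matrix and seed below are arbitrary). Build $P'$ on $S\times S$, form the candidate $\pi'(i,j)=\pi(i)P_{i,j}$, and verify it is stationary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # size of the finite state space S = {0, ..., n-1}

# Random row-stochastic transition matrix for X_n.
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)

# Stationary distribution of X_n: left eigenvector of P for eigenvalue 1.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Transition matrix of Y_n = (X_n, X_{n+1}) on S x S:
# state (i,j) has index i*n + j, and (i,j) -> (j,k) with probability P[j,k].
Pp = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            Pp[i * n + j, j * n + k] = P[j, k]

# Candidate stationary distribution pi'(i,j) = pi(i) * P[i,j]:
pi_p = (pi[:, None] * P).ravel()
print(np.allclose(pi_p @ Pp, pi_p))   # True: pi' is stationary for Y_n
print(np.isclose(pi_p.sum(), 1.0))    # True: pi' is a distribution
```

The check works because $\sum_i \pi(i)P_{i,j} = \pi(j)$, so $(\pi' P')(j,k) = \sum_i \pi(i)P_{i,j}P_{j,k} = \pi(j)P_{j,k} = \pi'(j,k)$.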
In the latter case there is no need to write down $P'$. A stationary distribution has the defining property that if the chain starts from it, it stays in it at every step. So the only thing you need to check is:
if $\mathbb P(X_0=i,X_1=j)=\pi'(i,j)=\mathbb P(X_1=i,X_2=j)$ for all $i,j$, then $\pi'(i,j)=\pi(i) P_{i,j}$.
And this can easily be checked:
$$
\pi'(i,j)=\mathbb P(X_0=i,X_1=j)= \mathbb P(X_0=i)\mathbb P(X_1=j\mid X_0=i) = \mathbb P(X_0=i) P_{i,j}
$$
$$
\pi'(i,j)=\mathbb P(X_1=i,X_2=j)= \mathbb P(X_1=i)\mathbb P(X_2=j\mid X_1=i)=\mathbb P(X_1=i) P_{i,j}
$$
and these probabilities coincide iff $\mathbb P(X_0=i)=\mathbb P(X_1=i)$ for all $i$, i.e. iff $X_n$ runs in its stationary distribution, $\mathbb P(X_n=i)=\pi(i)$. And then
$$
\pi'(i,j) = \mathbb P(X_0=i) P_{i,j} = \pi(i)P_{i,j}.
$$
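This invariance of the joint law can also be seen numerically on the chain from the question (a sketch using the values computed above, $\pi = (0.4, 0.6)$): starting from $\pi$, the joint distributions of $(X_0,X_1)$ and $(X_1,X_2)$ coincide.

```python
import numpy as np

P = np.array([[0.25, 0.75],
              [0.50, 0.50]])
pi = np.array([0.4, 0.6])             # stationary: pi = pi P

# Start the chain from pi and compare the joints of (X_0,X_1) and (X_1,X_2).
joint01 = pi[:, None] * P             # P(X_0=i, X_1=j) = pi(i) * P[i,j]
law_X1 = pi @ P                       # law of X_1 (equals pi)
joint12 = law_X1[:, None] * P         # P(X_1=i, X_2=j)
print(np.allclose(joint01, joint12))  # True, because pi @ P == pi
```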
Best Answer
First, consider that the limit means something more than just a formalization, so you have a set of equations on it: $$ P_\infty^T T = P_\infty^T $$ and $T$ must be a valid transition probability matrix (in this transposed convention, each column must sum to 1).
These are the required conditions that I know of. Together with the known form, one can solve the problem with some degrees of freedom, or declare that there is no solution.