Obviously if $x=0$ then the chain marches off to infinity (every step moves up with probability $1$), so we'll suppose that $x>0$. All states then have the same recurrence status, since they all communicate.
Let's suppose first that the chain is positive recurrent. To find the stationary distribution $(\pi_i)_{i \in \mathbb{N}}$, note that the probability we are in state $i \not = 1$ at time $n$ is just
$$\left( \frac{i-1}{i} \right)^x \mathbb{P}(\text{we are in state $i-1$ at time $n-1$})$$
That is, $\pi_i = \left(\frac{i-1}{i} \right)^x \pi_{i-1}$, so $\pi_i = \left( \frac{1}{i} \right)^x \pi_1$ for $i > 1$.
Since $\pi$ is a distribution, we have $$\pi_1 \left(1+\sum_{i=2}^{\infty} \left( \frac{1}{i} \right) ^x \right) = 1$$
This sum diverges for every $x \leq 1$ (it dominates the harmonic series), so there can't be a stationary distribution, and therefore the chain can't be positive recurrent when $x \leq 1$. If, on the other hand, $x > 1$, then the sum converges to $\zeta(x)$, and $\pi_i = i^{-x}/\zeta(x)$ is the unique candidate for a stationary distribution.
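As a quick numerical sanity check on this candidate distribution (assuming my reading of the chain, namely $p(i,i+1)=\left(\frac{i}{i+1}\right)^x$ and $p(i,1)=1-\left(\frac{i}{i+1}\right)^x$), the balance equation at state $1$ also holds: the inflow $\sum_i \pi_i\left(1-\left(\frac{i}{i+1}\right)^x\right)$ telescopes to $\pi_1$.

```python
from math import fsum

# Assumed transition rule: p(i, i+1) = (i/(i+1))^x and p(i, 1) = 1 - (i/(i+1))^x.
x = 2.0
N = 100_000  # truncation level for the infinite sums

zeta_x = fsum(i ** -x for i in range(1, N + 1))   # partial sum approximating zeta(x)
pi = [i ** -x / zeta_x for i in range(1, N + 1)]  # candidate pi_i = i^{-x} / zeta(x)

# Balance at state 1: each state i sends mass pi_i * (1 - (i/(i+1))^x) back to 1.
# Each term equals (i^{-x} - (i+1)^{-x})/zeta(x), so the sum telescopes to ~pi_1.
inflow = fsum(pi[i - 1] * (1 - (i / (i + 1)) ** x) for i in range(1, N + 1))
print(abs(inflow - pi[0]))
```

The discrepancy is just the truncation tail $(N+1)^{-x}/\zeta(x)$, which is negligible here.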
Is the chain transient or null recurrent? Examine state $1$: starting there, the chain fails to return only if it moves up forever, which has probability $\left(\frac{1}{2}\right)^x \left(\frac{2}{3}\right)^x \cdots = \lim_{n \to \infty} \left(\frac{1}{n+1}\right)^x = 0$ by telescoping. Therefore the chain is recurrent no matter what $x > 0$ is.
Hence if $x=0$, the chain is transient. If $0 < x \leq 1$, the chain is recurrent but admits no stationary distribution, so it is null-recurrent. If $x > 1$, we have exhibited a stationary distribution, and an irreducible chain that admits a stationary distribution is positive recurrent; with that observation, the result is proved.
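To see positive recurrence concretely, here is a Monte Carlo sketch (again assuming the rule $p(i,i+1)=\left(\frac{i}{i+1}\right)^x$, $p(i,1)=1-\left(\frac{i}{i+1}\right)^x$, which is my reading of the chain): for $x>1$, Kac's formula predicts a mean return time to state $1$ of $1/\pi_1 = \zeta(x)$.

```python
import random

def mean_return_time(x, trials=200_000, seed=0):
    """Monte Carlo estimate of the expected return time to state 1 under the
    (assumed) rule p(i, i+1) = (i/(i+1))^x, p(i, 1) = 1 - (i/(i+1))^x."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, steps = 1, 0
        while True:
            steps += 1
            if rng.random() < (state / (state + 1)) ** x:
                state += 1   # climb one step higher
            else:
                break        # fall back to state 1: this completes a return
        total += steps
    return total / trials

# For x = 3, the prediction is 1/pi_1 = zeta(3) ~ 1.2021.
print(mean_return_time(3.0))
```

The estimate should land close to $\zeta(3) \approx 1.202$, consistent with a finite expected return time.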
In a Markov chain with finitely many states, all recurrent states are positive-recurrent. So in your example, the state $\{ 3 \}$ is positive-recurrent.
A null-recurrent state is a state the Markov chain returns to almost surely, but with infinite expected return time. If $i$ is a recurrent state in a finite Markov chain, there are only finitely many states the chain can bounce between before it returns to $i$. Intuitively, the time spent among those finitely many states before the chain returns to $i$ cannot have infinite expectation.
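Since the example chain from the question isn't reproduced here, a made-up $3$-state chain can illustrate the point: solve for the stationary distribution and read off the finite expected return times via Kac's formula $\mathbb{E}_i[\text{return time}] = 1/\pi_i$.

```python
import numpy as np

# A made-up irreducible 3-state chain (the chain from the question isn't shown,
# so this transition matrix is purely illustrative).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])

# Solve pi P = pi together with sum(pi) = 1 as an overdetermined linear system.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Kac's formula: expected return time to state i is 1/pi_i -- finite, matching
# the claim that recurrent states of a finite chain are positive recurrent.
print("pi =", pi)
print("expected return times =", 1 / pi)
```

Every entry of $1/\pi$ is finite, which is exactly the finite-chain phenomenon described above.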
You can also check some of the related questions. Hope this helps!
Best Answer
So here's one approach: let $N_{n}$ denote the size of your population at time $n$ without the additional condition that $p(0,1)=1$ and let $(X_{i,j})_{1\leq{i,j}<\infty}$ be i.i.d. random variables distributed according to your offspring distribution. That is, $\mathbb{P}(X_{1,1}=k)=p_{k}\hspace{2pt}$ for all nonnegative integers $k$. Then:
$$\mathbb{E}N_{n+1}=\mathbb{E}\left(\sum_{k=1}^{N_{n}}X_{n+1,k}\right)=\mathbb{E}N_{n}\,\mathbb{E}X_{n+1,1}=\mu\,\mathbb{E}N_{n}$$
The second equality is Wald's identity: it applies because the $X_{n+1,k}$'s are i.i.d. and independent of $N_{n}$. From this recurrence, we deduce that $\mathbb{E}N_{n}=N_{0}\mu^{n}$. Thus,
$$\mathbb{E}N_{n}=N_{0}\mu^{n}=\sum_{k=1}^{\infty}k\mathbb{P}(N_{n}=k)\geq{\sum_{k=1}^{\infty}\mathbb{P}(N_{n}=k)}=1-\mathbb{P}(N_{n}=0)$$
Rearranging, this implies that:
$$\mathbb{P}(N_{n}=0)\geq{1-N_{0}\mu^{n}}$$
Let $\tau_{0}$ denote the hitting time of $0$ by $N_{n}$. Then $\mathbb{P}(N_{n}=0)=\mathbb{P}(\tau_{0}\leq{n})$ since once our population dies, it stays dead. Thus, it follows that:
$$\mathbb{P}(\tau_{0}>n)\leq{N_{0}\mu^{n}}$$
From here, since $\tau_{0}$ is a nonnegative integer-valued random variable, the layer-cake representation tells us that:
$$\mathbb{E}\tau_{0}=\sum_{k=1}^{\infty}\mathbb{P}(\tau_{0}\geq{k})\leq{1+\sum_{k=1}^{\infty}\mathbb{P}(\tau_{0}>k)}\leq{1+\sum_{k=1}^{\infty}N_{0}\mu^{k}}=1+\frac{N_{0}\mu}{1-\mu}<\infty$$
The summability of this last geometric series is where we used the fact that $\mu<1$. Notice now that since your Markov chain behaves the same way as the original branching process up until the first visit to $0$, $\tau_{0}$ is unchanged when we add the condition that $p(0,1)=1$. In sum, $\mathbb{E}\tau_{0}<\infty$ as desired, and so the resulting Markov chain is positive recurrent.
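As a numerical sanity check of the bound $\mathbb{E}\tau_{0} \leq 1 + \frac{N_{0}\mu}{1-\mu}$, here is a simulation sketch. The offspring distribution $(p_k)$ isn't specified, so Poisson$(\mu)$ is used purely as an illustrative subcritical choice.

```python
import numpy as np

def mean_extinction_time(mu, n0=1, trials=100_000, seed=0):
    """Monte Carlo estimate of E[tau_0] for a branching process with
    Poisson(mu) offspring (an illustrative choice of subcritical law)."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(trials):
        pop, t = n0, 0
        while pop > 0:
            # The sum of pop i.i.d. Poisson(mu) offspring counts is Poisson(mu * pop).
            pop = rng.poisson(mu * pop)
            t += 1
        total += t
    return total / trials

mu, n0 = 0.5, 1
print("estimate:", mean_extinction_time(mu, n0),
      "bound:", 1 + n0 * mu / (1 - mu))
```

With $\mu = 0.5$ and $N_0 = 1$ the bound is $2$, and the simulated mean extinction time indeed sits below it.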