First of all, I wonder whether $T_j$ refers to the first return time or to the first hitting time, since in the latter case $\mathsf P\{T_j = n-k\mid X_0 = j\} = 1_{\{k=n\}}$.
Regarding the continuous-time case, here are some insights; I only know results for the first hitting time. Let us introduce a CTMC (continuous-time Markov chain) through its finite state space $\mathscr X$, transition matrix $R:\mathscr X\times \mathscr X\to[0,1]$, and exit-rate function $r:\mathscr X\to[0,\infty)$. For any initial state $x\in \mathscr X$ and target state $y\in \mathscr X$, the time-bounded reachability probability (equivalently, the cumulative distribution function of the first passage time), which we denote
$$
F(x,y,t) = \mathsf P_x\{\tau_y \leq t\}
$$
is the least solution of
$$
F(x,y,t) = 1_{\{y=x\}}+1_{\{y\neq x\}}\sum\limits_{z\in \mathscr X}R(x,z)\int\limits_0^t\mathrm e^{-r(x)s}F(z,y,t-s)\mathrm ds.
$$
Assuming that $F$ has a density $f$ and recalling that $F(y,y,t) = 1$ we obtain:
$$
f(x,y,t) = R(x,y)\mathrm e^{-r(x)t}+\sum\limits_{z\neq y}R(x,z)\int\limits_0^t\mathrm e^{-r(x)s}f(z,y,t-s)\mathrm ds
$$
for any $x\neq y$.
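As a numerical sanity check (not from the paper mentioned below), $F(x,y,t)$ can be computed by making the target state absorbing and exponentiating the resulting generator $Q(x,z)=r(x)\,(R(x,z)-1_{\{x=z\}})$; the two-state chain at the end is an illustrative choice, not part of the original question:

```python
import numpy as np

def mat_exp(A, terms=80):
    """Plain Taylor-series matrix exponential; adequate for small
    matrices of modest norm (avoids a scipy dependency)."""
    out = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out += term
    return out

def reachability_cdf(R, r, x, y, t):
    """F(x, y, t) = P_x{tau_y <= t} for a CTMC with jump matrix R and
    exit rates r: make y absorbing, then exponentiate the generator."""
    n = len(r)
    Q = np.diag(r) @ (R - np.eye(n))  # Q[i,j] = r(i) * (R(i,j) - 1{i=j})
    Q[y, :] = 0.0                     # make the target state absorbing
    return mat_exp(Q * t)[x, y]

# Two-state check: from state 0 the chain jumps to state 1 at rate 2,
# so tau_1 ~ Exp(2) and F(0, 1, t) = 1 - exp(-2t).
R = np.array([[0.0, 1.0], [1.0, 0.0]])
r = np.array([2.0, 1.0])
print(reachability_cdf(R, r, 0, 1, 0.7))  # ~ 1 - exp(-1.4)
```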
With regard to references: the only person I have found working on this topic is him. I saw the equation above in one of his papers, but I do not remember which one. A further difficulty is that his papers focus on the verification of general properties, usually much more complex than plain reachability, so it may not be easy to work through the notation. I have not seen anything I could suggest in the classical literature on probability theory; if you find something, please tell me.
Regards,
Well, I assume your second row is $[0, 0.5, 0, 0, 0.5]$.
Actually, the fact that this Markov chain is not irreducible makes the calculation much simpler, and I guess that is intentional.
Here are some hints:
for a), note that states 1 and 4 communicate and, starting from state 1, the chain never leaves $\{1,4\}$, so computing $\lim_{n\to\infty}p_{11}^{(n)}$ for the whole matrix reduces to the same limit for the restricted matrix:
$$
\hat P _{1,4}=\begin{pmatrix}0.2&0.8\\
0.3&0.7\end{pmatrix}
$$
some linear algebra (solving for the stationary distribution of $\hat P_{1,4}$) will bring you to the answer of $3/11$.
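To check that limit numerically, raise the restricted matrix to a high power; each row of $\hat P_{1,4}^{\,n}$ converges to the stationary distribution $(3/11,\,8/11)$. A quick numpy sketch:

```python
import numpy as np

# Restricted chain on the closed class {1, 4}
P = np.array([[0.2, 0.8],
              [0.3, 0.7]])

# Eigenvalues are 1 and -0.1, so P^n converges geometrically fast;
# every row of the limit equals the stationary distribution.
limit = np.linalg.matrix_power(P, 50)
print(limit[0, 0])  # ~ 3/11 = 0.27272...
```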
for b), note that states 2 and 5 also communicate: starting from state 2, we will never escape $\{2,5\}$, and the restricted transition matrix is
$$
\hat P _{2,5}=\begin{pmatrix}0.5&0.5\\
0.4&0.6\end{pmatrix}
$$
define $T_2$ to be the mean recurrence time of state 2, and $T_5$ to be the expected time to reach state 2 starting from state 5
then we have
$$
\begin{align}
T_2&=1+0.5T_5\\
T_5&=1+0.6T_5
\end{align}
$$
the first line means that starting from state 2 we always take one step; with probability 0.5 we are back in state 2 immediately, and with probability 0.5 we land in state 5 and need a further $T_5$ steps on average. The second line is the same first-step argument from state 5: one step is taken, and with probability 0.6 we remain in state 5 and need $T_5$ more steps.
finally, some algebra gives $T_5=2.5$ and hence $T_2=9/4=2.25$, which agrees with $T_2 = 1/\pi_2$ for the stationary distribution $\pi=(4/9,\,5/9)$ of $\hat P_{2,5}$.
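The first-step equations can be cross-checked with Kac's formula: a state's mean recurrence time equals the reciprocal of its stationary probability. A minimal numpy sketch for the $\{2,5\}$ block:

```python
import numpy as np

# Stationary distribution of the {2, 5} block: solve pi P = pi, sum(pi) = 1.
P = np.array([[0.5, 0.5],
              [0.4, 0.6]])
A = np.vstack([(P.T - np.eye(2))[:1],  # one balance equation
               np.ones(2)])            # normalization
pi = np.linalg.solve(A, np.array([0.0, 1.0]))

# Kac's formula: mean recurrence time of state 2 is 1 / pi_2.
print(1 / pi[0])  # 9/4 = 2.25
```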
Best Answer
While I think that studying the simple symmetric random walk is the standard example here, I'll explicitly construct a different one.
Consider
$P:=\begin{pmatrix}p_{1,1} & p_{2,2} & p_{3,3} & p_{4,4} & p_{5,5} & p_{6,6} & \dots \\ 1 & 0 & 0 & 0 & 0 & 0 & \dots\\ 0 & 1 & 0 & 0 & 0 & 0 & \dots\\ 0 & 0 & 1 & 0 & 0 & 0 & \dots\\ 0 & 0 & 0 & 1 & 0 & 0 & \dots\\ \vdots & \vdots & \vdots & \vdots & \ddots & \ddots & \ddots\\ \end{pmatrix}$
This chain describes the Residual Life of an integer valued Renewal Process and sometimes goes by names like Residual Waiting Time chain. (The original post uses the term 'Persistent' -- the standard term I think is Recurrent, but it reminds me that Feller uses the term Persistent instead of Recurrent, so in case that OP uses Feller as a reference, this chain shows up in Feller vol 1, 3rd edition, page 381, and elsewhere subsequently in that chapter.)
Given its simple structure, we can easily construct a case where this chain is null recurrent.
1.) For simplicity we'll require each $p_{i,i}\gt 0$; by inspection there is then a single communicating class -- i.e. state 1 may reach any state with positive probability, and each state $j\geq 2$ may reach state 1 with positive probability (in fact it reaches state 1 deterministically in exactly $j-1$ steps).
2.) To verify recurrence, we need to ensure that, given a start in state one, we return with probability 1.
3.) To be null recurrent, we need to ensure that given a start in state one, the expected time until revisiting state one is $\infty$.
the finish
Set $p_{i,i}:=\frac{1}{i(i+1)}$
noting that
$\sum_{i=1}^{n-1}\frac{1}{i(i+1)}=1-\frac{1}{n}$
if you don't know this result, it's a simple and worthwhile telescoping exercise.
thus
$\sum_{i=1}^{\infty}\frac{1}{i(i+1)} = 1$
so the chain returns to state 1 with probability 1, i.e. it is recurrent. But a first jump from state 1 to state $i$ leads back to state 1 in exactly $i$ total steps, so the expected return time is
$\sum_{i=1}^{\infty}i\cdot\frac{1}{i(i+1)} = \sum_{i=1}^{\infty}\frac{1}{i+1} = \infty$
because the harmonic series is divergent. Thus this is an example of a null recurrent chain.
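Both series can be checked mechanically: exact rational arithmetic confirms the telescoping identity, and the floating-point partial sums of $1/(i+1)$ show the logarithmic (divergent) growth of the harmonic tail. A small sketch using the $p_{i,i}$ just defined:

```python
from fractions import Fraction

# p_{i,i} = 1/(i(i+1)): partial sums telescope to 1 - 1/n,
# so the total return probability from state 1 is exactly 1 (recurrence).
mass = sum(Fraction(1, i * (i + 1)) for i in range(1, 1000))
print(mass)  # 999/1000

# Expected return time: a first jump to state i returns in exactly i steps,
# so partial sums of i * p_{i,i} = 1/(i+1) grow like log N -- divergence,
# hence null recurrence.
tails = {N: sum(1.0 / (i + 1) for i in range(1, N + 1))
         for N in (10**2, 10**4, 10**6)}
print(tails)
```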