You are confusing the interarrival times $(T_n)$ with the sum $Y_t=K_t+L_t$ of the time $K_t=T_{N_t+1}-t$ till the next event and the time $L_t=t-T_{N_t}$ since the last event.
It is a general fact that, for i.i.d. interarrival times $(T_n)$ distributed like $T$, the length $Y_t=K_t+L_t$ of the interarrival interval around $t$ converges in distribution, as $t\to\infty$, to $T^*$, where $T^*$ follows the size-biased distribution of $T$; this is the inspection paradox. Furthermore, in the limit, conditionally on the length of the interval, the location of $t$ inside it is uniform: $(K_t,L_t)$ behaves like $(UY,(1-U)Y)$, where $Y$ is distributed like $T^*$ and $U$ is uniform on $(0,1)$ and independent of $Y$.
Recall that, starting from an almost surely nonnegative integrable random variable $X$, its size-biased distribution is the distribution of a random variable $X^*$ such that, for every bounded measurable function $u$,
$$
E(u(X^*))=\frac{E(Xu(X))}{E(X)}.
$$
If $X$ has density $f$ and mean $m$, $X^*$ has density $f^*$ defined by $f^*(x)=m^{-1}xf(x)$ for every $x\gt0$.
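The defining identity can be checked by simulation. The sketch below (the test function $u$ and all variable names are illustrative choices, not from the text) compares a weighted estimate of $E(Xu(X))/E(X)$ with direct samples of $X^*$, using the fact that for $X$ exponential $\lambda$ the size-biased $X^*$ is gamma $(2,\lambda)$, i.e. a sum of two independent exponential draws:

```python
import math
import random

random.seed(0)
lam = 2.0
N = 200_000

def u(x):
    """An arbitrary bounded test function."""
    return math.exp(-x)

# Sample X* directly: for X ~ Expo(lam) the size-biased X* has density
# lam^2 * x * exp(-lam * x), i.e. gamma(2, lam), a sum of two Expo(lam) draws.
lhs = sum(u(random.expovariate(lam) + random.expovariate(lam))
          for _ in range(N)) / N

# Estimate E[X u(X)] / E[X] from plain Expo(lam) samples.
xs = [random.expovariate(lam) for _ in range(N)]
rhs = sum(x * u(x) for x in xs) / sum(xs)

# Both estimate E[u(X*)] = (lam / (lam + 1))**2 for this choice of u.
```

Both estimates agree up to Monte Carlo error, illustrating $E(u(X^*))=E(Xu(X))/E(X)$.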
When $T$ is exponential $\lambda$, $T^*$ is gamma $(2,\lambda)$ and, if $U$ is uniform on $(0,1)$ and independent of $T^*$, then $UT^*$ is again exponential $\lambda$ (and is independent of $(1-U)T^*$). Thus, in the homogeneous Poisson process, $K_t$ is exponential $\lambda$, $L_t$ is distributed like $\min\{t,L\}$ where $L$ is exponential $\lambda$, and these two are independent; in particular $Y_t=K_t+L_t$ converges in distribution, as $t\to\infty$, to the gamma $(2,\lambda)$ variable $T^*$.
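These Poisson-case facts are easy to check by simulation. The sketch below (parameter values are illustrative) runs a rate-$\lambda$ Poisson process past a fixed time $t$ and estimates $E(K_t)$ and $E(L_t)$, which should match $1/\lambda$ and $E(\min\{t,L\})=(1-e^{-\lambda t})/\lambda$:

```python
import math
import random

random.seed(1)
lam, t, N = 1.5, 2.0, 100_000

mean_K = mean_L = 0.0
for _ in range(N):
    # Run the arrivals of a rate-lam Poisson process until one passes t.
    s = last = 0.0
    while s <= t:
        last = s
        s += random.expovariate(lam)
    mean_K += (s - t) / N     # K_t = T_{N_t + 1} - t
    mean_L += (t - last) / N  # L_t = t - T_{N_t}, with T_0 = 0

theory_K = 1 / lam                         # K_t ~ Expo(lam)
theory_L = (1 - math.exp(-lam * t)) / lam  # L_t ~ min(t, Expo(lam))
```

The empirical means agree with the two closed forms up to Monte Carlo error.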
Let $X_1,\ldots,X_n$ be independent with $\mathrm{Expo}(\lambda)$ distribution. We compute the density of $X_1+X_2$ by convolution:
\begin{align}
f_{X_1+X_2}(t) &= f_{X_1}\star f_{X_2}(t)\\
&= \int_{\mathbb R} f_{X_1}(\tau)f_{X_2}(t-\tau)\ \mathsf d\tau\\
&= \int_0^t \lambda e^{-\lambda \tau}\lambda e^{-\lambda(t-\tau)}\ \mathsf d\tau\\
&= \lambda^2 e^{-\lambda t} \int_0^t\ \mathsf d\tau\\
&= \lambda ( \lambda t)e^{-\lambda t}\mathsf 1_{(0,\infty)}(t).
\end{align}
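As a sanity check on this convolution (a sketch; the parameter values are arbitrary), integrating the derived density over $(0,x)$ gives $P(X_1+X_2\le x)=1-(1+\lambda x)e^{-\lambda x}$, which we can compare with an empirical estimate:

```python
import math
import random

random.seed(2)
lam, x, N = 1.0, 2.0, 200_000

# Empirical P(X1 + X2 <= x) for independent Expo(lam) samples.
emp = sum(1 for _ in range(N)
          if random.expovariate(lam) + random.expovariate(lam) <= x) / N

# Integrating the derived density lam * (lam*s) * exp(-lam*s) over (0, x).
theory = 1 - (1 + lam * x) * math.exp(-lam * x)
```

The two values agree up to sampling error.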
Assuming now that $S_n:=\sum_{k=1}^n X_k$ has density $\frac{\lambda (\lambda t)^{n-1} e^{-\lambda t}}{(n-1)!}$, we compute the density of $S_{n+1} = S_n + X_{n+1}$ again by convolution:
\begin{align}
f_{S_{n+1}}(t) &= f_{S_n}\star f_{X_{n+1}}(t)\\
&= \int_0^t \frac{\lambda (\lambda \tau)^{n-1} e^{-\lambda \tau}}{(n-1)!}\lambda e^{-\lambda (t-\tau)}\ \mathsf d\tau\\
&=\frac{\lambda^2e^{-\lambda t}}{(n-1)!}\int_0^t (\lambda\tau)^{n-1} \ \mathsf d\tau\\
&= \frac{\lambda^2e^{-\lambda t}}{(n-1)!} \frac{(\lambda t)^n}{\lambda n}\\
&= \frac{\lambda(\lambda t)^n e^{-\lambda t}}{n!},
\end{align}
so, by induction, $S_n$ has density $\dfrac{\lambda (\lambda t)^{n-1} e^{-\lambda t}}{(n-1)!}\mathsf 1_{(0,\infty)}(t)$ for every positive integer $n$; that is, $S_n$ is gamma $(n,\lambda)$.
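The induction can be spot-checked numerically (a sketch with arbitrary parameter values): integrating the gamma $(n,\lambda)$ density over $(0,x)$ yields the classical identity $P(S_n\le x)=P(\mathrm{Poisson}(\lambda x)\ge n)$, which we compare against an empirical estimate:

```python
import math
import random

random.seed(3)
lam, n, x, N = 2.0, 4, 2.5, 100_000

# Empirical P(S_n <= x) with S_n a sum of n independent Expo(lam) draws.
emp = sum(1 for _ in range(N)
          if sum(random.expovariate(lam) for _ in range(n)) <= x) / N

# Integrating the gamma(n, lam) density over (0, x) gives the identity
# P(S_n <= x) = P(Poisson(lam * x) >= n).
theory = 1 - math.exp(-lam * x) * sum((lam * x) ** k / math.factorial(k)
                                      for k in range(n))
```

The identity also explains why $N_t\ge n$ exactly when $S_n\le t$ for the associated Poisson counting process.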
Let $\mu$ be the reference measure of the Poisson process. By definition, this means that the numbers of arrivals falling into any $n$ mutually disjoint sets $B_1,\ldots,B_n$ are independent Poisson random variables with means $\mu(B_1),\ldots,\mu(B_n)$.
Specializing to the case of a Poisson process on $[0,\infty)$ (so that the interarrival times are well-defined) we see that if $T$ denotes the first arrival, then $$\mathbb P(T>t)=\mathbb P(0\text{ arrivals in }[0,t])=e^{-\mu([0,t])},$$ where the last equality is the probability that a Poisson variable with mean $\mu([0,t])$ equals zero.
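As a concrete illustration with a hypothetical reference measure (not from the text), take $\mathrm d\mu=2s\,\mathrm ds$, so $\mu([0,t])=t^2$. Then $\mu([0,T])$ is standard exponential, so $T$ can be sampled by inversion as $\sqrt{E}$ with $E$ exponential of mean $1$, and the tail formula above can be verified:

```python
import math
import random

random.seed(4)

# Hypothetical reference measure for illustration: dmu = 2s ds on [0, inf),
# so mu([0, t]) = t**2.  Since mu([0, T]) ~ Expo(1), T = sqrt(E), E ~ Expo(1).
N, t = 200_000, 0.8
emp = sum(1 for _ in range(N)
          if math.sqrt(random.expovariate(1.0)) > t) / N
theory = math.exp(-t * t)  # P(T > t) = exp(-mu([0, t]))
```

Note that $e^{-t^2}$ is not an exponential tail, matching the remark below that $T$ need not be exponential.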
You can see from this that if $\mu$ is not a constant multiple of Lebesgue measure, then $T$ is no longer an exponential random variable.
The interarrival times are no longer independent in general, since if $T'$ denotes the next interarrival time after $T$ then $$ \mathbb P(T'>t\mid T)=e^{-\mu([T,T+t])}. $$ Whenever $\mu$ is not a constant multiple of Lebesgue measure, this conditional probability depends on $T$, and thus $(T,T')$ are not independent.
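This dependence is visible in simulation. Continuing with the same hypothetical measure $\mu([a,b])=b^2-a^2$: given $T$, the quantity $\mu([T,T+T'])$ is standard exponential, so $T'=\sqrt{T^2+E_2}-T$ with $E_2$ exponential of mean $1$. The sketch below shows that the conditional mean of $T'$ depends on $T$:

```python
import math
import random

random.seed(5)

# Same hypothetical measure mu([a, b]) = b**2 - a**2 (intensity 2s).
# Given T, mu([T, T + T']) ~ Expo(1), so T' = sqrt(T**2 + E2) - T.
N = 200_000
small, large = [], []
for _ in range(N):
    T = math.sqrt(random.expovariate(1.0))
    Tp = math.sqrt(T * T + random.expovariate(1.0)) - T
    (small if T < 0.5 else large).append(Tp)

mean_small = sum(small) / len(small)  # estimates E[T' | T < 0.5]
mean_large = sum(large) / len(large)  # estimates E[T' | T >= 0.5]
# mean_small clearly exceeds mean_large: T' is not independent of T.
```

The gap between the two conditional means is exactly the failure of independence described above.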
The joint distribution of $(T,T')$ satisfies $$ \mathbb P(T'>t,T>s)=\mathbb E[e^{-\mu([T,T+t])};T>s], $$ but one cannot go much further without knowing more about the reference measure $\mu$.
Kallenberg's textbook is a good reference for Poisson process questions.