Let $\mu$ be the reference measure of the Poisson process. By definition, this means that the numbers of arrivals falling into any $n$ mutually disjoint sets $B_1,\ldots,B_n$ are independent Poisson random variables with means $\mu(B_1),\ldots,\mu(B_n)$.
Specializing to the case of a Poisson process on $[0,\infty)$ (so that the interarrival times are well-defined), we see that if $T$ denotes the time of the first arrival, then
$$\mathbb P(T>t)=\mathbb P(0\text{ arrivals in }[0,t])=e^{-\mu([0,t])},$$
where the last equality uses the fact that a Poisson random variable with mean $\mu([0,t])$ takes the value $0$ with probability $e^{-\mu([0,t])}$.
You can see from this that if $\mu$ is not a constant multiple of Lebesgue measure, then $T$ is no longer an exponential random variable.
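As a quick sanity check of the tail formula, here is a small simulation sketch. The intensity measure $\mu(dx)=2x\,dx$ (so $\mu([0,t])=t^2$) is my own illustrative choice, not from the question; the first arrival is generated by the standard time-change trick, mapping a rate-$1$ exponential through $\Lambda^{-1}(s)=\sqrt{s}$.

```python
import math
import random

random.seed(0)

def first_arrival(cum_intensity_inv):
    """First arrival of an inhomogeneous Poisson process via time change:
    if S1 ~ Exp(1) is the first arrival of a rate-1 process, then
    T = Lambda^{-1}(S1) is the first arrival under intensity Lambda."""
    s1 = random.expovariate(1.0)
    return cum_intensity_inv(s1)

# Assumed example: mu(dx) = 2x dx, so Lambda(t) = mu([0,t]) = t^2 and
# Lambda^{-1}(s) = sqrt(s); then P(T > t) = exp(-t^2), not exponential.
samples = [first_arrival(math.sqrt) for _ in range(100_000)]
t = 1.0
empirical_tail = sum(x > t for x in samples) / len(samples)
print(empirical_tail, math.exp(-t * t))  # the two should be close
```

The empirical tail at $t=1$ should land near $e^{-1}\approx 0.368$ rather than near any value of the form $e^{-ct}$, illustrating that $T$ is not exponential here.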
The interarrival times are also no longer independent in general: if $T'$ denotes the next interarrival time after $T$, then
$$
\mathbb P(T'>t\mid T)=e^{-\mu([T,T+t])}.
$$
Whenever $\mu$ is not a constant multiple of Lebesgue measure, this conditional probability depends on $T$, and thus $T$ and $T'$ are not independent.
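To see this dependence numerically, here is a sketch with the same illustrative (assumed, not from the question) measure $\mu(dx)=2x\,dx$: then $\mathbb P(T'>t\mid T)=e^{-((T+t)^2-T^2)}=e^{-t^2-2Tt}$, which shrinks as $T$ grows, so $T'$ should be stochastically smaller when $T$ is large.

```python
import math
import random

random.seed(1)

# Assumed example measure: mu(dx) = 2x dx, Lambda(t) = t^2, so the
# rate-1 Poisson arrivals S1, S1 + S2 map to sqrt(S1), sqrt(S1 + S2).
pairs = []
for _ in range(100_000):
    s1 = random.expovariate(1.0)
    s2 = random.expovariate(1.0)
    T = math.sqrt(s1)
    T2 = math.sqrt(s1 + s2) - T  # next interarrival time T'
    pairs.append((T, T2))

# Split at the median of T and compare the mean of T' in the two halves;
# since P(T' > t | T) = exp(-t^2 - 2tT) decreases in T, the "small T"
# group should have the larger mean of T'.
med = sorted(p[0] for p in pairs)[len(pairs) // 2]
low = [t2 for t, t2 in pairs if t <= med]
high = [t2 for t, t2 in pairs if t > med]
mean_low, mean_high = sum(low) / len(low), sum(high) / len(high)
print(mean_low, mean_high)
```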
The joint distribution of $(T,T')$ satisfies
$$
\mathbb P(T'>t,T>s)=\mathbb E[e^{-\mu([T,T+t])};T>s],
$$
but one cannot go much further without knowing more about the reference measure $\mu$.
Kallenberg's textbook is a good reference for Poisson process questions.
Based on @user8675309's comment, for $x>t$ the complementary distribution function of $L_t$ is proportional to the complementary distribution function of $Y_t$, so differentiating gives $f_{L_t}(x)\propto f_{Y_t}(x) = \lambda e^{-\lambda x}$; that is, $f_{L_t}(x) = C\lambda e^{-\lambda x}$ for some constant $C>0$. Integrating the density of $L_t$ over $(0,t)$, where it is given by $\lambda(\lambda x)e^{-\lambda x}$, yields
$$
\int_0^t f_{L_t}(x) \ \mathsf dx = \int_0^t\lambda(\lambda x)e^{-\lambda x}\ \mathsf dx = 1-(1+\lambda t)e^{-\lambda t} ,
$$
so for the density to integrate to $1$ over $(0,\infty)$ we must have
$$
\int_t^\infty C\lambda e^{-\lambda x}\ \mathsf dx = (1+\lambda t)e^{-\lambda t}.
$$
But $$\int_t^\infty C\lambda e^{-\lambda x}\ \mathsf dx = Ce^{-\lambda t}, $$
so it follows that $C=1+\lambda t$. Hence, for $x>t$, the density of $L_t$ is $f_{L_t}(x) = \lambda(1+\lambda t)e^{-\lambda x}$.
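As a numerical check of this piecewise density (with assumed illustrative values $\lambda=2$, $t=1.5$, chosen by me for concreteness), one can verify that $\lambda(\lambda x)e^{-\lambda x}$ on $(0,t)$ together with $\lambda(1+\lambda t)e^{-\lambda x}$ on $(t,\infty)$ integrates to $1$:

```python
import math

lam, t = 2.0, 1.5  # assumed example parameters

def density(x):
    """Piecewise density derived above: gamma(2, lam) density on (0, t],
    lam*(1 + lam*t)*exp(-lam*x) on (t, infinity)."""
    if x <= t:
        return lam * (lam * x) * math.exp(-lam * x)
    return lam * (1 + lam * t) * math.exp(-lam * x)

# Midpoint Riemann sum over (0, 40/lam], far past where the tail matters
n = 400_000
h = (40 / lam) / n
total = sum(density((k + 0.5) * h) for k in range(n)) * h
tail_mass = (1 + lam * t) * math.exp(-lam * t)  # closed-form mass on (t, inf)
print(total, tail_mass)  # total should be very close to 1
```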
Best Answer
It is a general fact that, for i.i.d. interarrival times $(T_n)$ distributed like $T$, the length $Y_t=K_t+L_t$ of the interval around $t$ is distributed like $\min\{T^*,t\}$, where $T^*$ has the size-biased distribution of $T$. Furthermore, conditionally on $Y_t$, $K_t$ and $L_t$ are uniformly distributed in the interval $(0,Y_t)$, that is, $K_t=U_tY_t$ and $L_t=(1-U_t)Y_t$, where $U_t$ is uniform on $(0,1)$ and independent of $Y_t$.
Recall that, starting from an almost surely nonnegative integrable random variable $X$, its size-biased distribution is the distribution of a random variable $X^*$ such that, for every bounded measurable function $u$, $$ E(u(X^*))=\frac{E(Xu(X))}{E(X)}. $$ If $X$ has density $f$ and mean $m$, $X^*$ has density $f^*$ defined by $f^*(x)=m^{-1}xf(x)$ for every $x\gt0$.
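The defining identity can be checked by Monte Carlo. The sketch below (assuming $X\sim\mathrm{Exp}(\lambda)$ with my own choice $\lambda=1.5$ and the bounded test function $u=\cos$) uses the fact that the size-biased density $\lambda\cdot x\cdot\lambda e^{-\lambda x}=\lambda^2xe^{-\lambda x}$ is the gamma$(2,\lambda)$ density, i.e. the law of a sum of two independent exponentials.

```python
import math
import random

random.seed(2)

lam = 1.5       # assumed example rate
N = 200_000

# X ~ Exp(lam); its size-biased version X* has density lam^2 * x * e^{-lam x},
# i.e. gamma(2, lam) -- the sum of two independent Exp(lam) variables.
xs = [random.expovariate(lam) for _ in range(N)]
x_stars = [random.expovariate(lam) + random.expovariate(lam) for _ in range(N)]

u = math.cos  # any bounded measurable test function
lhs = sum(u(x) for x in x_stars) / N           # E[u(X*)]
rhs = sum(x * u(x) for x in xs) / sum(xs)      # E[X u(X)] / E[X]
print(lhs, rhs)  # the two estimates should nearly agree
```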
When $T$ is exponential $\lambda$, $T^*$ is gamma $(2,\lambda)$ and, if $U$ is uniform on $(0,1)$ and independent of $T^*$, then $UT^*$ is again exponential $\lambda$. Thus, in the homogeneous renewal process, $K_t$ is exponential $\lambda$, $L_t$ is distributed like $\min\{t,L\}$ where $L$ is exponential $\lambda$, and $Y_t$ is distributed like $\min\{t,Y\}$, where $Y$ is gamma $(2,\lambda)$.
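The perhaps surprising fact that $UT^*$ is again exponential $\lambda$ can also be confirmed by simulation (the values $\lambda=2$ and the checkpoint $x=0.7$ below are my own illustrative choices):

```python
import math
import random

random.seed(3)

lam = 2.0       # assumed example rate
N = 200_000

# T* ~ gamma(2, lam), sampled as a sum of two Exp(lam); U uniform on (0,1),
# independent of T*. The claim is that U * T* ~ Exp(lam).
samples = [
    random.random() * (random.expovariate(lam) + random.expovariate(lam))
    for _ in range(N)
]

x = 0.7
empirical_tail = sum(s > x for s in samples) / N
print(empirical_tail, math.exp(-lam * x))  # both should be close
```

A one-line verification of the claim: $\mathbb P(UT^*>x)=\mathbb E[(1-x/T^*)^+]=\int_x^\infty \lambda^2(y-x)e^{-\lambda y}\,dy=e^{-\lambda x}$.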