This is not a complete answer to all your questions, but it should provide some insight at least. I'll take a shot at deriving the p.d.f. of the $Y_i$ (the distances between consecutive order statistics of the $X_i$). Once we know that, we might be able to use Bayes' theorem to find the distribution of the $X_i$ given the $Y_i$, but I will not address that here.
Given i.i.d. real-valued $X_i$, $i=1,2,\ldots,n$, with common probability density $f$ (and c.d.f. $F$), one can see that they are equally likely to be in any order, and there are $n!$ possible orders. The probability of a particular ordering is
$$
P(X_{k_1}<X_{k_2}<\cdots < X_{k_n})
=\int_{-\infty}^\infty f(x_{k_1})
\int_{x_{k_1}}^\infty f(x_{k_2})\cdots\int_{x_{k_{n-1}}}^\infty f(x_{k_n}) \ dx_{k_n}\cdots \ d x_{k_2}\ d x_{k_1}.
$$
Switching the order of integration gives
$$
\begin{aligned}
P(X_{k_1}<X_{k_2}<\cdots < X_{k_n})
&=\int_{-\infty}^\infty f(x_{k_n})
\cdots
\int_{-\infty}^{x_{k_3}} f(x_{k_2})
\int_{-\infty}^{x_{k_2}} f(x_{k_1}) \ dx_{k_1}\ d x_{k_2}\cdots \ d x_{k_n}\\
&=\int_{-\infty}^\infty f(x_{k_n})
\cdots
\int_{-\infty}^{x_{k_3}} f(x_{k_2})
F(x_{k_2})\ d x_{k_2}\cdots \ d x_{k_n} \\
&=\int_{-\infty}^\infty f(x_{k_n})
\cdots
\int_{0}^{F(x_{k_3})} u\ d u\cdots \ d x_{k_n} .
\end{aligned}
$$
Note that the substitution $u=F(x)$, $du=F'(x)\,dx=f(x)\,dx$ has been made, and the same substitution can be made at each stage. Continuing in this fashion, the iterated integral reduces to $\displaystyle\int_0^1\frac{u^{n-1}}{(n-1)!}\,du=\frac{1}{n!}$ for the final value of the integral, confirming that each ordering is equally likely.
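As a quick sanity check on the $1/n!$ claim, here is a small simulation sketch (my own addition, not part of the original argument; the exponential choice for $f$ is arbitrary, since any continuous distribution works) tallying how often each ordering occurs:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 3, 200_000

# Each row is one draw of (X_1, ..., X_n); argsort encodes its ordering.
orderings = np.argsort(rng.exponential(size=(trials, n)), axis=1)
perms, counts = np.unique(orderings, axis=0, return_counts=True)
for perm, count in zip(perms, counts):
    print(tuple(perm), count / trials)   # each should be near 1/3! ~ 0.1667
```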
Thus
$$
\begin{aligned}
P(Y_i<x)&=P(X_1<X_2<\cdots<X_i<X_{i+1}<X_i+x\mid X_1<X_2<\cdots<X_n)\\
&=\frac{P(\{X_1<X_2<\cdots<X_i<X_{i+1}<X_i+x\}\cap\{X_1<X_2<\cdots<X_n\})}{P(X_1<X_2<\cdots<X_n)}\\
&=n!\,P(\{X_1<X_2<\cdots<X_i<X_{i+1}<X_i+x\}\cap\{X_1<X_2<\cdots<X_n\}).
\end{aligned}
$$
So we will calculate the probability of the intersection of the events $X_1<X_2<\cdots<X_i<X_{i+1}<X_i+x$ and $X_1<X_2<\cdots<X_n$.
We wish to know $F_{Y_i}(x)=P(Y_i<x)$, which by the above is $n!$ times the probability of this intersection; thus:
$$
\begin{aligned}
F_{Y_i}(x)=
n!\int_{-\infty}^\infty &f(x_1)\int_{x_1}^\infty f(x_2)\\
&\cdots\int_{x_i}^{x_i+x}f(x_{i+1})\int_{x_{i+1}}^\infty f(x_{i+2})\\
&\quad\quad\quad\quad \cdots\int_{x_{n-1}}^\infty f(x_n) \ dx_n\cdots \ dx_1.
\end{aligned}
$$
This is the c.d.f. of $Y_i$. You can get the p.d.f. by taking the derivative w.r.t. $x$.
This is a complicated integral, and in general the distribution of $Y_i$ depends on both $i$ and $n$. Intuitively this makes sense: the more i.i.d. values you draw, the closer consecutive sorted values tend to be.
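Rather than evaluating the nested integral, $F_{Y_i}$ can also be estimated directly by simulation: draw $n$ i.i.d. values, sort them, and record the $i$-th gap. Here is a minimal sketch (the `spacing_cdf` helper and its `sampler` argument are my own inventions, not anything fixed by the derivation):

```python
import numpy as np

def spacing_cdf(sampler, n, i, x, trials=100_000, seed=0):
    """Estimate F_{Y_i}(x) = P(X_(i+1) - X_(i) < x) for 1 <= i <= n-1."""
    rng = np.random.default_rng(seed)
    samples = np.sort(sampler(rng, (trials, n)), axis=1)
    gaps = samples[:, i] - samples[:, i - 1]   # the i-th consecutive gap
    return np.mean(gaps < x)
```

For example, `spacing_cdf(lambda rng, size: rng.exponential(size=size), n=3, i=1, x=0.5)` should come out close to $1-e^{-2\cdot 0.5}\approx 0.632$, matching the exponential result derived below.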
For the $X_i$ exponentially distributed with mean $1/\lambda$, and $n=2$, $F_Y(x)=1-e^{-\lambda x}$. The calculation is
$$
F_Y(x)=
2\int_0^\infty \lambda e^{-\lambda x_1}
\int_{x_1}^{x_1+x} \lambda e^{-\lambda x_2} \ dx_2 \ dx_1.
$$
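For readers who want to check this double integral symbolically rather than by hand, here is a short sketch using sympy (assuming sympy is available; the variable names are mine):

```python
import sympy as sp

x, x1, x2 = sp.symbols('x x1 x2', positive=True)
lam = sp.symbols('lam', positive=True)

# Inner integral over x2, then outer integral over x1, times 2.
inner = sp.integrate(lam * sp.exp(-lam * x2), (x2, x1, x1 + x))
F = 2 * sp.integrate(lam * sp.exp(-lam * x1) * inner, (x1, 0, sp.oo))
print(sp.simplify(F))   # -> 1 - exp(-lam*x)
```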
For $n=3$, $F_{Y_2}(x)=1-e^{-\lambda x}$ and $F_{Y_1}(x)=1-e^{-2\lambda x}$. The calculations are
$$
F_{Y_2}(x)=
3!\int_0^\infty \lambda e^{-\lambda x_1}
\int_{x_1}^{\infty} \lambda e^{-\lambda x_2}
\int_{x_2}^{x_2+x} \lambda e^{-\lambda x_3}
\ dx_3 \ dx_2 \ dx_1.
$$
$$
F_{Y_1}(x)=
3!\int_0^\infty \lambda e^{-\lambda x_1}
\int_{x_1}^{x_1+x} \lambda e^{-\lambda x_2}
\int_{x_2}^{\infty} \lambda e^{-\lambda x_3}
\ dx_3 \ dx_2 \ dx_1.
$$
There seems to be a pattern there which could probably be generalized for any $n$ and $i$. I haven't done the analytical calculation, but for any given $n$, $F_{Y_i}(x)=1-e^{-(n-i)\lambda x}$ appears to work. I've confirmed it numerically.
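Here is a sketch of one way to confirm this numerically (a Monte Carlo comparison; the particular values of $\lambda$, $x$, and the grid of $n$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, x, trials = 1.5, 0.4, 200_000

for n in (2, 3, 5):
    samples = np.sort(rng.exponential(1 / lam, size=(trials, n)), axis=1)
    for i in range(1, n):                      # spacings Y_1, ..., Y_{n-1}
        emp = np.mean(samples[:, i] - samples[:, i - 1] < x)
        conj = 1 - np.exp(-(n - i) * lam * x)  # conjectured F_{Y_i}(x)
        print(f"n={n}, i={i}: empirical {emp:.4f} vs conjectured {conj:.4f}")
```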
If the $X_i$ are uniformly distributed on the interval $[0,1]$, then $F_{Y_i}(x)=1-(1-x)^n$ for $0\le x\le1$, for any $i$ and $n$. I performed the analytical calculation for $n=2,3$ and verified the formula numerically. Maybe an induction argument would work to prove it, or there may be some cleaner argument that requires neither integration nor induction.
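The analogous numerical check for the uniform case (again a sketch; the test point $x$ and the values of $n$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x, trials = 0.3, 200_000

for n in (2, 3, 5):
    samples = np.sort(rng.uniform(size=(trials, n)), axis=1)
    conj = 1 - (1 - x) ** n                    # conjectured F_{Y_i}(x), same for all i
    for i in range(1, n):
        emp = np.mean(samples[:, i] - samples[:, i - 1] < x)
        print(f"n={n}, i={i}: empirical {emp:.4f} vs conjectured {conj:.4f}")
```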
Suppose $X_1,X_2$ are i.i.d. standard normal. The joint density of $(X_1,X_2)$ is $$f_{X_1,X_2}(x_1,x_2)=\frac{1}{2\pi}e^{-\frac{1}{2}(x_1^2+x_2^2)}\quad,(x_1,x_2)\in\mathbb R^2$$
We change variables $(X_1,X_2)\to(Y_1,Y_2)$ such that $Y_1=X_1^2+X_2^2$ and $Y_2=X_2$.
The inverse transformation has two branches:
$$x_1=\begin{cases}\sqrt{y_1-y_2^2}&,\text{ if }x_1\ge0\\-\sqrt{y_1-y_2^2}&,\text{ if }x_1<0\end{cases}\qquad\text{ and }\quad x_2=y_2$$
We have, $(x_1,x_2)\in\mathbb R^2\implies y_1\ge0\,,y_2\in\mathbb R$.
But for $x_1$ to be defined, $y_1-y_2^2\ge0\implies -\sqrt{y_1}\le y_2\le \sqrt{y_1}$.
So the joint support of $(Y_1,Y_2)$ is $$S=\{(y_1,y_2):y_1\ge0\,,\, -\sqrt{y_1}\le y_2\le\sqrt{y_1}\}$$
Clearly, this is not a one-to-one transformation.
The Jacobian of the transformation on each branch is given by
$$J_1=\det\left(\frac{\partial(x_1,x_2)}{\partial(y_1,y_2)}\right)=\frac{1}{2\sqrt{y_1-y_2^2}}\quad\text{ if }x_1\ge0$$
and $$J_2=\frac{-1}{2\sqrt{y_1-y_2^2}}\quad\text{ if }x_1<0$$
Hence joint density of $(Y_1,Y_2)$ is
\begin{align}
f_{Y_1,Y_2}(y_1,y_2)&=\frac{1}{2\pi}e^{-y_1/2}|J_1|\mathbf1_S+\frac{1}{2\pi}e^{-y_1/2}|J_2|\mathbf1_S
\\&=2\times\frac{1}{2\pi}e^{-y_1/2}\frac{1}{2\sqrt{y_1-y_2^2}}\mathbf1_S
\\&=\frac{1}{2\pi}e^{-y_1/2}\frac{1}{\sqrt{y_1-y_2^2}}\mathbf1_S
\end{align}
For $y_1\ge 0$, marginal pdf of $Y_1$ is thus
\begin{align}
f_{Y_1}(y_1)&=\int_{-\sqrt {y_1}}^{\sqrt {y_1}}f_{Y_1,Y_2}(y_1,v)\,dv
\\&=\frac{e^{-y_1/2}}{2\pi}\int_{-\sqrt {y_1}}^{\sqrt {y_1}}\frac{dv}{\sqrt{y_1-v^2}}
\\&=\frac{e^{-y_1/2}}{2\pi}\sin^{-1}\left(\frac{v}{\sqrt{y_1}}\right)\Big|_{-\sqrt {y_1}}^{\sqrt {y_1}}
\\&=\frac{1}{2}e^{-y_1/2}
\end{align}
That is, $$f_{Y_1}(y_1)=\frac{1}{2}e^{-y_1/2}\mathbf1_{y_1\ge 0}$$
So we have shown that $Y_1$ is exponential with mean $2$, or equivalently, $Y_1\sim\chi^2_2$.
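A quick simulation sketch of this conclusion (my own addition): the empirical c.d.f. of $X_1^2+X_2^2$ for standard normal draws should match $1-e^{-y/2}$, the $\text{Exp}(\text{mean }2)=\chi^2_2$ c.d.f.:

```python
import numpy as np

rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal((2, 200_000))
y1 = x1**2 + x2**2                       # Y_1 = X_1^2 + X_2^2

for y in (0.5, 1.0, 2.0, 5.0):
    emp = np.mean(y1 < y)
    print(f"y={y}: empirical {emp:.4f} vs 1 - exp(-y/2) = {1 - np.exp(-y/2):.4f}")
```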
Best Answer
I think an easier-to-follow (and simpler) proof uses a different change of variables.
We have the joint density of the order statistics $(U_1=X_{(1)},\cdots,U_n=X_{(n)})$ of a sample from the shifted exponential density $f(x)=e^{-(x-\theta)}\mathbf1_{x>\theta}$:
$$f_{\mathbf U}(u_1,\cdots,u_n)=n!\exp\left[-\sum_{i=1}^nu_i+n\theta\right]\mathbf1_{\theta<u_1<u_2<\cdots<u_n}$$
Now transform $(U_1,\cdots,U_n)\to(Y_1,\cdots,Y_n)$ such that $Y_i=(n-i+1)(U_i-U_{i-1})$ for all $i=1,2,\cdots,n$, taking $U_0=\theta$.
It follows (the sum telescopes) that $\sum_{i=1}^nu_i=\sum_{i=1}^ny_i+n\theta$. The Jacobian determinant $\det\left(\frac{\partial(u_1,\cdots,u_n)}{\partial(y_1,\cdots,y_n)}\right)$ comes out as $1/n!$, which cancels the $n!$ in front.
So you get the joint density of $(Y_1,\cdots,Y_n)$
$$f_{\mathbf Y}(y_1,\cdots,y_n)=\exp\left[-\sum_{i=1}^ny_i\right]\mathbf1_{y_1,\cdots,y_n>0}$$
Not surprisingly, the spacings of successive order statistics from an exponential sample come out as independent. In fact, the $Y_i$'s are i.i.d. exponential with mean $1$ for all $i=1,2,\cdots,n$.
This implies $2Y_i\stackrel{\text{i.i.d.}}{\sim}\chi^2_2$ for all $i=1,2,\cdots,n$.
So we have two independent variables $2Y_1$ and $\sum_{i=2}^n2Y_i$. Both have chi-square distributions: the former with $2$ degrees of freedom and the latter with $2n-2$ degrees of freedom.
It is now straightforward to check that $2Y_1=2n(X_{(1)}-\theta)$ and $2\sum_{i=2}^nY_i=2\sum_{i=2}^n(X_{(i)}-X_{(1)})$.
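As a final sanity check, here is a simulation sketch of the whole construction (my own addition; the values of $n$ and $\theta$ are arbitrary), verifying that the normalized spacings behave like i.i.d. $\text{Exp}(1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, trials = 5, 1.0, 200_000

# Order statistics of a shifted-exponential sample.
u = np.sort(theta + rng.exponential(size=(trials, n)), axis=1)
u0 = np.full((trials, 1), theta)                 # U_0 = theta
gaps = np.diff(np.hstack([u0, u]), axis=1)       # U_i - U_{i-1}, i = 1..n
y = gaps * (n - np.arange(n))                    # multiply by (n - i + 1)

print("means of Y_i (each should be ~1):", y.mean(axis=0).round(3))
print("corr(Y_1, Y_2) (should be ~0):",
      round(float(np.corrcoef(y[:, 0], y[:, 1])[0, 1]), 3))
```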