I think your idea is good. Using the properties of conditional expectation, it could go like this:
\begin{align*}
&P^x ( X_{t_1} \in B_1, X_{t_2} \in B_2, X_{t_3} \in B_3) = E^x \left[ 1_{B_1}(X_{t_1}) \cdot 1_{B_2}(X_{t_2}) \cdot 1_{B_3}(X_{t_3}) \right] \\
&= E^x \left[ 1_{B_1}(X_{t_1}) \cdot E^x \left[ 1_{B_2}(X_{t_2}) \cdot 1_{B_3}(X_{t_3}) | \mathcal{F}_{t_1} \right] \right] \\
&= E^x \left[ 1_{B_1}(X_{t_1}) \cdot E^x \left[ 1_{B_2}(X_{t_2}) \cdot E^x \left[ 1_{B_3}(X_{t_3}) | \mathcal{F}_{t_2} \right] | \mathcal{F}_{t_1} \right] \right] \\
& = E^x \left[ 1_{B_1}(X_{t_1}) \cdot E^x \left[ 1_{B_2}(X_{t_2}) \cdot E^{X_{t_2}} \left[ 1_{B_3}(X_{t_3-t_2}) \right] | \mathcal{F}_{t_1} \right] \right] \\
& = E^x \left[ 1_{B_1}(X_{t_1}) \cdot E^x \left[ 1_{B_2}(X_{t_2}) \cdot \left( \int_{B_3} p_{t_3-t_2} (X_{t_2}, d x_3) \right) | \mathcal{F}_{t_1} \right] \right] \\
& = E^x \left[ 1_{B_1}(X_{t_1}) \cdot E^{X_{t_1}} \left[ 1_{B_2}(X_{t_2-t_1}) \cdot \left( \int_{B_3} p_{t_3-t_2} (X_{t_2-t_1}, d x_3) \right) \right] \right] \\
& = E^x \left[ 1_{B_1}(X_{t_1}) \cdot \left( \int_{B_2} \left( \int_{B_3} p_{t_3-t_2} (x_2, d x_3) \right) p_{t_2-t_1} (X_{t_1}, d x_2) \right) \right] \\
&= \int_{B_1} \left( \int_{B_2} \left( \int_{B_3} p_{t_3-t_2} (x_2, d x_3) \right) p_{t_2-t_1} (x_1, d x_2) \right) p_{t_1} (x, d x_1)
\end{align*}
The Chapman–Kolmogorov equations give an equivalent way of writing this:
\begin{align*}
& P^x \left( X_{t_1} \in B_1, X_{t_2} \in B_2, X_{t_3} \in B_3 \right) \\
&= \int_{B_1} \left( \int_{B_2} \left( \int_{B_3} p_{t_3-t_2} (x_2, d x_3) \right) p_{t_2-t_1} (x_1, d x_2) \right) p_{t_1} (x, d x_1) \\
& = \int_{B_1 \times B_2 \times B_3} p_{t_1} (x, d x_1) p_{t_2-t_1} (x_1, d x_2) p_{t_3-t_2} (x_2, d x_3)
\end{align*}
Hope it helps!
Remark: the same argument works for $n$ time points, by induction.
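As a numerical sanity check (not part of the proof), here is a Python sketch of the two-time case with $X$ taken to be standard Brownian motion started at $x = 0$, so that $p_t(x, dy)$ has the Gaussian density $e^{-(y-x)^2/2t}/\sqrt{2\pi t}$; the time points $t_1, t_2$ and the sets $B_1 = B_2 = [0,1]$ are illustrative choices. A Monte Carlo estimate of $P^x(X_{t_1} \in B_1, X_{t_2} \in B_2)$ is compared with the iterated integral from the derivation above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Transition density of Brownian motion: p_t(x, y) dy = P(B_{s+t} in dy | B_s = x).
def p(t, x, y):
    return np.exp(-(y - x) ** 2 / (2 * t)) / np.sqrt(2 * np.pi * t)

x0, t1, t2 = 0.0, 1.0, 2.0   # start point and two time points (illustrative)

# Monte Carlo estimate of P^x(X_{t1} in [0,1], X_{t2} in [0,1]).
n = 1_000_000
X1 = x0 + rng.normal(0.0, np.sqrt(t1), n)
X2 = X1 + rng.normal(0.0, np.sqrt(t2 - t1), n)
mc = np.mean((X1 > 0) & (X1 < 1) & (X2 > 0) & (X2 < 1))

# Iterated integral int_{B1} ( int_{B2} p_{t2-t1}(x1, x2) dx2 ) p_{t1}(x, x1) dx1,
# computed with a midpoint rule on [0,1] x [0,1].
m = 400
dx = 1.0 / m
x1s = (np.arange(m) + 0.5) * dx
x2s = (np.arange(m) + 0.5) * dx
inner = p(t2 - t1, x1s[:, None], x2s[None, :]).sum(axis=1) * dx
quadrature = (p(t1, x0, x1s) * inner).sum() * dx

print(mc, quadrature)  # the two numbers agree to Monte Carlo accuracy
```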
The simplest approach is to use the fact that the Brownian bridge $\{W_t,t\in[0,1]\}$ has the same distribution as $\{B_t,t\in[0,1]\}$ conditioned on $B_1 = 0$, which immediately gives the required density. This, however, requires knowing that fact a priori (though it is not too hard to prove).
I will use another approach, which in a sense is the reverse of this one. First note that the Brownian bridge $\{W_t = B_t - t B_1\}$ is independent of $B_1$, which can be checked by computing the covariance: in the Gaussian case, zero correlation implies independence.
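The covariance computation is one line, $\operatorname{Cov}(W_t, B_1) = \operatorname{Cov}(B_t, B_1) - t\operatorname{Var}(B_1) = t - t = 0$, and can be confirmed numerically; the sketch below uses an arbitrary $t = 0.3$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 1_000_000, 0.3  # t is an arbitrary time in (0, 1)

# Sample (B_t, B_1) from a Brownian motion and form the bridge W_t = B_t - t*B_1.
Bt = rng.normal(0.0, np.sqrt(t), n)
B1 = Bt + rng.normal(0.0, np.sqrt(1 - t), n)
Wt = Bt - t * B1

# Cov(W_t, B_1) = Cov(B_t, B_1) - t*Var(B_1) = t - t = 0.
cov = np.cov(Wt, B1)[0, 1]
print(cov)  # close to 0
```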
Therefore,
$$
f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x) = f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p)f_{B_1}(x).
$$
Consequently,
$$
f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p) = \frac{f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x)}{f_{B_1}(x)}.\tag{1}
$$
Denoting $w_0= t_0 = 0$,
$$
f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x) = f_{B_{t_1},\dots, B_{t_p}, B_1}(w_1 + t_1 x,\dots,w_p+t_p x,x)\\
= \prod_{i=1}^{p} \frac{1}{\sqrt{2\pi (t_{i} - t_{i-1})}}\exp \Bigl\{-\frac{\big(w_{i} - w_{i-1}+(t_{i}-t_{i-1}) x\big)^2}{2(t_{i}-t_{i-1}) }\Bigr\} \\\times \frac{1}{\sqrt{2\pi (1-t_p)}}\exp \Bigl\{-\frac{\big(x(1-t_p)-w_p\big)^2}{2(1-t_p)}\Bigr\} \\
= \prod_{i=1}^{p} \frac{1}{\sqrt{2\pi(t_{i}-t_{i-1}) }} \exp \Bigl\{- \frac{(w_i-w_{i-1})^2}{2(t_i-t_{i-1})} - x(w_i-w_{i-1}) - \frac{x^2(t_i-t_{i-1})}{2}\Bigr\}\\
\times \frac{1}{\sqrt{2\pi(1-t_{p}) }} \exp \Bigl\{-\frac{{w_p}^2}{2(1-t_p)} + w_p x - \frac{x^2(1-t_p)}{2}\Bigr\} \\
= \prod_{i=1}^{p}\frac{1}{\sqrt{2\pi(t_{i}-t_{i-1}) }}\exp \Bigl\{-\frac 12 \frac{(w_i-w_{i-1})^2}{t_i-t_{i-1}}\Bigr\}\\ \times
\frac{1}{\sqrt{2\pi(1-t_{p}) }} \exp \Bigl\{-\frac{{w_p}^2}{2(1-t_p)} - \frac{x^2}{2}\Bigr\}\\
= \prod_{i=1}^{p} p_{t_{i}-t_{i-1}}(w_i-w_{i-1}) \cdot p_{1-t_p}(w_p) e^{-x^2/2}.
$$
Plugging into (1),
$$
f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p) = \sqrt{2\pi}\prod_{i=1}^{p} p_{t_{i}-t_{i-1}}(w_i-w_{i-1}) \cdot p_{1-t_p}(w_p),
$$
as required.
One could simplify the computation: since the ratio in (1) is independent of $x$ and since everything is continuous, one can take $x=0$. By not doing so we have actually shown the independence once more.
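For $p = 1$ the final formula can be checked directly: $\sqrt{2\pi}\,p_t(w)\,p_{1-t}(w)$ should collapse to the $N(0, t(1-t))$ density, since $1/t + 1/(1-t) = 1/(t(1-t))$. The Python sketch below (an independent sanity check, with $t = 0.3$ as an arbitrary choice) verifies this identity on a grid, and also confirms $\operatorname{Var}(W_t) = t(1-t)$ by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def p(t, w):
    # Centered Gaussian density with variance t, as in the formula above.
    return np.exp(-w ** 2 / (2 * t)) / np.sqrt(2 * np.pi * t)

t = 0.3
w = np.linspace(-1.0, 1.0, 9)

# For p = 1 the formula reads f_{W_t}(w) = sqrt(2*pi) * p_t(w) * p_{1-t}(w);
# it should equal the N(0, t(1-t)) density.
formula = np.sqrt(2 * np.pi) * p(t, w) * p(1 - t, w)
target = p(t * (1 - t), w)

# Monte Carlo check that Var(W_t) = t(1-t), using W_t = B_t - t*B_1.
n = 1_000_000
Bt = rng.normal(0.0, np.sqrt(t), n)
B1 = Bt + rng.normal(0.0, np.sqrt(1 - t), n)
var = np.var(Bt - t * B1)
print(var)  # close to t*(1-t) = 0.21
```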
Take $X_t=\sqrt{t}\,B_1$. Then at any fixed time $t$, $X_t$ has the same distribution as $B_t$ (normal with mean $0$ and variance $t$), but as processes $X \neq B$ in distribution, since $X$ does not have independent increments.
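A quick numerical illustration of this counterexample (times $t = 1, 2$ chosen for concreteness): the marginal variances match Brownian motion, but $\operatorname{Cov}(X_1, X_2 - X_1) = (\sqrt2 - 1)\operatorname{Var}(B_1) = \sqrt2 - 1 \neq 0$, whereas for Brownian motion it would be $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

B1 = rng.normal(0.0, 1.0, n)
X1 = np.sqrt(1.0) * B1   # X_t = sqrt(t) * B_1 at t = 1
X2 = np.sqrt(2.0) * B1   # ... and at t = 2

# Marginals match Brownian motion: Var(X_t) = t.
print(np.var(X1), np.var(X2))  # ~1, ~2

# But the increment X_2 - X_1 is correlated with X_1:
# Cov(X_1, X_2 - X_1) = (sqrt(2) - 1) * Var(B_1) ~ 0.414, not 0.
cov = np.cov(X1, X2 - X1)[0, 1]
print(cov)
```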