You can't. Here are two possible joint distributions which both yield the correct marginal distributions from your question, but which are different.
$$\begin{array}{l|l|l|}
& \mathbf{s}=s_1 & \mathbf{s}=s_2 \\
\hline
(\mathbf{u},\mathbf{v})=(x_1,x_1) & \frac{2}{12} & \frac{1}{12} \\
(\mathbf{u},\mathbf{v})=(x_1,x_2) & \frac{2}{12} & \frac{1}{12} \\
(\mathbf{u},\mathbf{v})=(x_2,x_1) & \frac{1}{12} & \frac{2}{12} \\
(\mathbf{u},\mathbf{v})=(x_2,x_2) & \frac{1}{12} & \frac{2}{12} \\
\end{array}$$
$$\begin{array}{l|l|l|}
& \mathbf{s}=s_1 & \mathbf{s}=s_2 \\
\hline
(\mathbf{u},\mathbf{v})=(x_1,x_1) & \frac{1}{12} & \frac{2}{12} \\
(\mathbf{u},\mathbf{v})=(x_1,x_2) & \frac{3}{12} & 0 \\
(\mathbf{u},\mathbf{v})=(x_2,x_1) & \frac{2}{12} & \frac{1}{12} \\
(\mathbf{u},\mathbf{v})=(x_2,x_2) & 0 & \frac{3}{12} \\
\end{array}$$
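To see that the two tables really do share their marginals while differing as joints, one can check mechanically. A quick sketch, using the integer numerators from the tables above and normalizing by their total (so the check is independent of the denominator):

```python
from fractions import Fraction

# Joint weights from the two tables above; rows are (u,v) pairs,
# columns are s values. Each table is normalized by its total weight.
rows = [("x1", "x1"), ("x1", "x2"), ("x2", "x1"), ("x2", "x2")]
cols = ["s1", "s2"]
table_A = [[2, 1], [2, 1], [1, 2], [1, 2]]
table_B = [[1, 2], [3, 0], [2, 1], [0, 3]]

def marginals(table):
    """Return the (u,v)-marginal and the s-marginal of a weight table."""
    total = sum(sum(r) for r in table)
    uv = {rows[i]: Fraction(sum(table[i]), total) for i in range(4)}
    s = {cols[j]: Fraction(sum(table[i][j] for i in range(4)), total)
         for j in range(2)}
    return uv, s

uv_A, s_A = marginals(table_A)
uv_B, s_B = marginals(table_B)
assert uv_A == uv_B and s_A == s_B  # identical marginal distributions
assert table_A != table_B           # ...but different joint distributions
```

Both tables give $P[(\mathbf{u},\mathbf{v})=(x_i,x_j)]=\frac14$ for every pair and $P[\mathbf{s}=s_k]=\frac12$ for each $k$, yet the joints differ (e.g. the entry for $((x_1,x_2),s_2)$).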
(Note that I renamed your random variables $\mathbf{x}_1$ and $\mathbf{x}_2$ to $\mathbf{u}$ and $\mathbf{v}$; using the same name for a random variable and for the elements of its image, i.e. $X = (x_1,x_2)$, seemed confusing.)
Originally we have two different spaces
Let $(\Omega_1, F_1, P_1)$ and $(\Omega_2, F_2, P_2)$ be two probability spaces. That is, $\Omega_1$ and $\Omega_2$ are nonempty sets, $F_1$ is a sigma-algebra on $\Omega_1$, $F_2$ is a sigma-algebra on $\Omega_2$, and $P_1$ and $P_2$ are functions
\begin{align*}
P_1: F_1 \rightarrow\mathbb{R}\\
P_2:F_2 \rightarrow \mathbb{R}
\end{align*}
that satisfy the three probability axioms with respect to $(\Omega_1, F_1)$ and $(\Omega_2, F_2)$, respectively. Let
\begin{align}
X_1:\Omega_1 \rightarrow\mathbb{R}\\
X_2:\Omega_2 \rightarrow\mathbb{R}
\end{align}
be functions such that $X_1$ is measurable with respect to $(\Omega_1, F_1)$ and $X_2$ is measurable with respect to $(\Omega_2, F_2)$.
Defining a single new space
Define
$$\Omega = \Omega_1 \times \Omega_2 = \{(\omega_1, \omega_2) : \omega_1 \in \Omega_1, \omega_2 \in \Omega_2\}$$
Also define $F$ as the smallest sigma-algebra on $\Omega$ that contains all sets of the form $A_1 \times A_2$ such that $A_1 \in F_1$, $A_2 \in F_2$. (Note 1: Here we define $\phi \times A_2=A_1\times \phi=\phi$. Note 2: $F \neq F_1 \times F_2$, see example below).
Fundamental question
Recall that $\Omega =\Omega_1 \times \Omega_2$. Does there exist a function $P:F\rightarrow\mathbb{R}$
that satisfies
$$P[A_1 \times A_2] = P_1[A_1]P_2[A_2] \quad \forall A_1 \in F_1, \forall A_2 \in F_2 \quad (*)$$
and that also satisfies the three axioms of probability with respect to $(\Omega, F)$?
This is a deep and hard question; the answer is not obvious. Fortunately, the answer is "yes," and moreover the function is unique. This is due to the Hahn-Kolmogorov theorem:
https://en.wikipedia.org/wiki/Product_measure
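When $\Omega_1$ and $\Omega_2$ are finite, the existence half of the question is easy to see: the product measure can be written down explicitly via point masses $P[\{(\omega_1,\omega_2)\}] = P_1[\{\omega_1\}]\,P_2[\{\omega_2\}]$. A minimal sketch with hypothetical $P_1$ and $P_2$ (the values below are illustrative, not from the post) that verifies property $(*)$ on every rectangle $A_1 \times A_2$:

```python
from fractions import Fraction
from itertools import product

# Hypothetical finite probability spaces, given by their point masses.
P1 = {"H": Fraction(1, 3), "T": Fraction(2, 3)}   # on Omega_1 = {H, T}
P2 = {"a": Fraction(1, 2), "b": Fraction(1, 2)}   # on Omega_2 = {a, b}

# The product measure on Omega = Omega_1 x Omega_2, defined on point masses.
P = {(w1, w2): P1[w1] * P2[w2] for w1, w2 in product(P1, P2)}

def prob(event):
    """P[A] for any A in F (here F is the power set of Omega)."""
    return sum(P[w] for w in event)

def powerset(xs):
    xs = list(xs)
    return [[x for k, x in enumerate(xs) if (mask >> k) & 1]
            for mask in range(2 ** len(xs))]

# Property (*): P[A1 x A2] = P1[A1] * P2[A2] for every rectangle.
for A1 in powerset(P1):
    for A2 in powerset(P2):
        rect = {(w1, w2) for w1 in A1 for w2 in A2}
        assert prob(rect) == sum(P1[w] for w in A1) * sum(P2[w] for w in A2)

assert sum(P.values()) == 1   # P is a probability measure on Omega
```

The hard part of the theorem is the infinite case, where $F$ is much larger than the collection of rectangles and countable additivity must be established; the finite sketch only illustrates why $(*)$ pins $P$ down.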
Consequence of "yes"
Once we have such a function $P:F\rightarrow\mathbb{R}$, we have a legitimate new probability space $(\Omega, F, P)$. We can define new functions $X_1^{new}:\Omega\rightarrow\mathbb{R}$ and $X_2^{new}:\Omega\rightarrow\mathbb{R}$ by
\begin{align}
X_1^{new}(\omega_1, \omega_2) &= X_1(\omega_1) \quad \forall (\omega_1, \omega_2) \in \Omega \\
X_2^{new}(\omega_1, \omega_2) &= X_2(\omega_2)\quad \forall (\omega_1, \omega_2) \in \Omega
\end{align}
It can be shown that $X_1^{new}$ and $X_2^{new}$ are both measurable with respect to $(\Omega, F)$. Thus, they can be called random variables on the probability space $(\Omega, F, P)$.
We can prove that $X_1^{new}$ and $X_2^{new}$ are independent: Fix $x_1, x_2 \in \mathbb{R}$. Define
\begin{align}
A_1 &= \{\omega_1 \in \Omega_1 : X_1(\omega_1) \leq x_1\}\\
A_2 &=\{\omega_2 \in \Omega_2 : X_2(\omega_2) \leq x_2\}
\end{align}
Then
\begin{align}
&P[X_1^{new} \leq x_1, X_2^{new}\leq x_2] \\
&=P\left[\{\omega \in \Omega: X_1^{new}(\omega) \leq x_1\}\cap \{\omega \in \Omega: X_2^{new}(\omega) \leq x_2\}\right]\\
&= P\left[\{(\omega_1, \omega_2)\in \Omega : X_1(\omega_1)\leq x_1, X_2(\omega_2) \leq x_2\} \right] \\
&= P\left[ A_1 \times A_2 \right]\\
&\overset{(a)}{=} P_1[A_1]P_2[A_2]\\
&\overset{(b)}{=} \left(P_1[A_1]P_2[\Omega_2]\right)\left( P_1[\Omega_1]P_2[A_2]\right)\\
&\overset{(c)}{=} P[A_1 \times \Omega_2]P[\Omega_1 \times A_2]\\
&=P[X_1^{new} \leq x_1]P[X_2^{new}\leq x_2]
\end{align}
where (a) and (c) hold by the property (*) of the $P$ function; (b) holds because $P_1[\Omega_1]=1$ and $P_2[\Omega_2]=1$. This holds for all $x_1,x_2 \in \mathbb{R}$. Thus, $X_1^{new}$ and $X_2^{new}$ are independent.
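The chain of equalities above can be checked numerically on a small example. A sketch assuming hypothetical finite spaces (the point masses and values of $X_1$, $X_2$ below are illustrative, not from the question):

```python
from fractions import Fraction
from itertools import product

# Hypothetical finite spaces; take X1(w1) = w1 and X2(w2) = w2.
P1 = {1: Fraction(1, 4), 2: Fraction(3, 4)}     # on Omega_1 = {1, 2}
P2 = {10: Fraction(1, 2), 20: Fraction(1, 2)}   # on Omega_2 = {10, 20}
P = {(w1, w2): P1[w1] * P2[w2] for w1, w2 in product(P1, P2)}

def prob(pred):
    """P of the event {w in Omega : pred(w)}."""
    return sum(p for w, p in P.items() if pred(w))

X1_new = lambda w: w[0]   # depends only on the first coordinate
X2_new = lambda w: w[1]   # depends only on the second coordinate

# Check P[X1_new <= x1, X2_new <= x2] = P[X1_new <= x1] P[X2_new <= x2]
# at several thresholds:
for x1, x2 in product([0, 1, 2], [5, 10, 20]):
    lhs = prob(lambda w: X1_new(w) <= x1 and X2_new(w) <= x2)
    rhs = prob(lambda w: X1_new(w) <= x1) * prob(lambda w: X2_new(w) <= x2)
    assert lhs == rhs
```

Every joint CDF value factors into the product of the marginal CDF values, exactly as the proof predicts.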
Example to show $F\neq F_1 \times F_2$.
Define
\begin{align}
\Omega_1 &= \{1,2,3\}\\
\Omega_2 &= \{a,b,c\} \\
\Omega &= \Omega_1 \times \Omega_2
\end{align}
Define $F_1$ and $F_2$ as the power sets of $\Omega_1$ and $\Omega_2$, respectively
\begin{align}
F_1 &= \{\phi, \{1\}, \{2\}, \{3\}, \{1,2\}, \{1,3\}, \{2,3\}, \{1,2,3\}\}\\
F_2 &= \{\phi, \{a\}, \{b\}, \{c\}, \{a,b\}, \{a,c\}, \{b,c\}, \{a,b,c\}\}
\end{align}
It can be shown that $F$ is the power set of $\Omega$. Since $|\Omega| = 9$, $F$ has $2^9 = 512$ elements, whereas $F_1 \times F_2$ has only $8 \times 8 = 64$ elements. So $F$ has more elements than $F_1 \times F_2$. The structure of the set $F_1 \times F_2$ is also different from that of $F$:
Elements of $F_1 \times F_2$ include $(\phi, \{a\})$ and $(\phi, \{b\})$ and $(\{1\}, \{a\})$ and $(\{2\}, \{b\})$.
Elements of $F$ include $\phi$ and $\{(1,a), (2,b)\}$.
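The counting for this example can be confirmed in a few lines; a quick sketch:

```python
from itertools import product

Omega1 = {1, 2, 3}
Omega2 = {"a", "b", "c"}
Omega = set(product(Omega1, Omega2))   # 9 ordered pairs

# F is the power set of Omega, so |F| = 2**|Omega|.
size_F = 2 ** len(Omega)                          # 2**9 = 512
# F1 x F2 is a set of ORDERED PAIRS of sets, one from each power set.
size_F1xF2 = (2 ** len(Omega1)) * (2 ** len(Omega2))   # 8 * 8 = 64

assert size_F == 512
assert size_F1xF2 == 64
```

Beyond the raw counts, the types differ: an element of $F_1 \times F_2$ is a pair like $(\{1\}, \{a\})$, while an element of $F$ is a subset of $\Omega$ like $\{(1,a),(2,b)\}$, which is not a rectangle at all.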
Caveat 1
The set $F$ is sometimes called $F_1 \otimes F_2$. This is quite different from $F_1 \times F_2$, and also different from $\sigma(F_1 \times F_2)$.
Caveat 2
As in my above comments on the question, usually we do not concern ourselves with this deep extension theory.
If we have a probability experiment that involves random variables $Y$ and $Z$, we implicitly assume there is a single probability space $(\Omega, F, P)$ and $Y:\Omega\rightarrow\mathbb{R}$ and $Z:\Omega\rightarrow\mathbb{R}$ are measurable functions on this space.
Thus, for all $y,z \in \mathbb{R}$ we know that $\{Y \leq y\} \in F$ and $\{Z \leq z\} \in F$. Since $F$ is a sigma-algebra, this implies that $\{Y \leq y\}\cap \{Z \leq z\} \in F$ (for all $y, z\in \mathbb{R}$).
The random variables $Y:\Omega\rightarrow\mathbb{R}$ and $Z:\Omega\rightarrow\mathbb{R}$ are defined to be independent if
$$ P[Y \leq y, Z\leq z] = P[Y\leq y]P[Z\leq z] \quad \forall y, z \in \mathbb{R}$$
Notice that the definition of independent requires $\{Y \leq y\} \cap \{Z \leq z\} \in F$ for all $y, z \in \mathbb{R}$, which of course requires $Y$ and $Z$ to be defined on the same space.
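As a sanity check of the definition, here is a toy pair of variables on one common space that are *not* independent (taking $Z = Y$, the most dependent case; a hypothetical example, not from the question):

```python
from fractions import Fraction

# One probability space: Omega = {0, 1} with the uniform measure.
P = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def prob(pred):
    """P of the event {w in Omega : pred(w)}."""
    return sum(p for w, p in P.items() if pred(w))

Y = lambda w: w
Z = lambda w: w   # Z equals Y, so the pair is maximally dependent

y = z = 0
lhs = prob(lambda w: Y(w) <= y and Z(w) <= z)              # P[{0}] = 1/2
rhs = prob(lambda w: Y(w) <= y) * prob(lambda w: Z(w) <= z)  # 1/2 * 1/2 = 1/4
assert lhs != rhs   # the defining identity fails, so Y, Z are NOT independent
```

Note that both sides of the comparison are well defined precisely because $Y$ and $Z$ live on the same space, so the intersection event $\{Y \leq y\} \cap \{Z \leq z\}$ is in $F$.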
Best Answer
As in the comments, it seems like there's some confusion here over how to set up your notation. In typical usage, you are interested in one specific probability space $(\Omega, \mathcal F, \mathbb P)$, and all random variables are measurable functions of that space. You could for example set $(\Omega, \mathcal F) = (\mathbb R^{k+l}, \mathcal B^{k+l})$, and define $\mathbb P$ as in your display, though this is by no means required.
You would then define random variables $X_1$ and $X_2$ as measurable functions on $(\mathbb R^{k+l}, \mathcal B^{k+l})$; if $X_1$ only depends on the first $k$ coordinates, and $X_2$ on the last $l$, then those are just some properties the functions happen to satisfy. You wouldn't normally start with some other random variables $X_1$ and $X_2$, defined on two other spaces, and try to copy them over.
To answer your questions, the push-forward measure $\mathbb P_{X_1}$ is then straightforwardly $\mathbb P \circ X_1^{-1}$, i.e. $\mathbb P_{X_1}[B] = \mathbb P[X_1^{-1}(B)]$. If random variables $X$ and $Y$ are defined on two spaces $S$ and $T$, then it does not make sense to say they have a joint distribution on some third space $U$; they're defined on $S$ and $T$.
You could, if you like, define some new random variables $X'$ and $Y'$ on $U$, which have the same marginal distributions as $X$ and $Y$, but these would be different variables, and their push-forward measures would be defined in the usual way.
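On a finite space the push-forward $\mathbb P_{X_1} = \mathbb P X_1^{-1}$ can be computed explicitly: each atom's mass is moved to its image point. A sketch on a hypothetical toy space (the outcome names and values are illustrative):

```python
from fractions import Fraction
from collections import defaultdict

# A toy probability space (Omega, F, P) with three atoms.
P = {"w1": Fraction(1, 6), "w2": Fraction(2, 6), "w3": Fraction(3, 6)}
# A measurable function X: Omega -> R (any function is measurable here,
# since F is the power set of the finite Omega).
X = {"w1": 0.0, "w2": 1.0, "w3": 1.0}

# Push-forward: P_X[{x}] = P[X^{-1}({x})], accumulated atom by atom.
P_X = defaultdict(Fraction)
for w, p in P.items():
    P_X[X[w]] += p

assert P_X[1.0] == Fraction(5, 6)   # P[X^{-1}({1})] = P[{w2, w3}]
assert sum(P_X.values()) == 1       # P_X is a probability measure on R
```

The same recipe applies to any random variable on any of the spaces discussed above; it never requires a second variable or a second space.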