No, your calculation for the continuous case is wrong. It should be
$ P(X_1 = X_2) = \displaystyle\iint_D f_{X_1}(x) f_{X_2}(y)\ dx \ dy$, where $D = \{(x,y) \in {\mathbb R}^2: x = y\}$. But $D$ has two-dimensional measure (i.e. area) $0$, so the answer is $0$.
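A quick numerical illustration (assuming, since the post does not fix the distributions, that $X_1$ and $X_2$ are independent Uniform$(0,1)$ variables): because the diagonal $\{x=y\}$ has area $0$, exact coincidences essentially never occur in simulation.

```python
import random

# Hypothetical setup: X1, X2 independent Uniform(0,1) draws
# (an assumption for illustration only).  Count exact
# coincidences X1 == X2 over many trials; since {x = y} has
# two-dimensional measure 0, the empirical frequency is 0.
random.seed(0)
n = 1_000_000
hits = sum(random.random() == random.random() for _ in range(n))
print(hits)  # exact collisions: 0 in practice
```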
I will try to address the question you posed in the comments, namely:
Given 3 independent random variables $U$, $V$ and $W$ uniformly distributed on $(0,1)$, find the joint probability distribution function of $X=U+V$ and $Y=U+W$.
Given $0<x<2$ and $0<y<2$, we first compute $F_{X,Y}(x,y)$:
$$\begin{eqnarray}
F_{X,Y}(x,y) &=& \mathbb{P}\left(X \leqslant x, Y \leqslant y\right) = \mathbb{P}\left(U+V \leqslant x, U+W \leqslant y\right) = \mathbb{E}\left(\mathbb{P}\left(U+V \leqslant x, U+W \leqslant y|U\right)\right)\\
&=& \mathbb{E}\left(F_V(x-U)F_W(y-U)\right) = \mathbb{E}\left(\min(1, \max(x-U,0))\min(1, \max(y-U,0))\right)
\end{eqnarray}
$$
The pdf, being the mixed second partial derivative of the cdf, then reads:
$$
f_{X,Y}(x,y) = \frac{\partial}{\partial x}
\frac{\partial}{\partial y} F_{X,Y}(x,y) = \mathbb{E}\left( [1+U>x>U] [1+U > y>U] \right) = \mathbb{P}\left(U < x < 1+U \land U < y<1+U\right)
$$
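Before evaluating that expectation, the conditioning step itself can be sanity-checked numerically: the following sketch compares a direct Monte Carlo estimate of $\mathbb{P}(U+V \leqslant x,\, U+W \leqslant y)$ with the one-dimensional expectation $\mathbb{E}\big(\min(1,\max(x-U,0))\min(1,\max(y-U,0))\big)$ at an arbitrarily chosen test point.

```python
import random

random.seed(1)
n = 200_000
x, y = 1.2, 0.7  # arbitrary test point in (0,2) x (0,2)

# Direct Monte Carlo estimate of P(U+V <= x, U+W <= y)
# with U, V, W independent Uniform(0,1).
direct = sum(
    (u := random.random()) + random.random() <= x
    and u + random.random() <= y
    for _ in range(n)
) / n

# Estimate of E[min(1, max(x-U, 0)) * min(1, max(y-U, 0))],
# i.e. the expectation obtained after conditioning on U.
cond = sum(
    min(1, max(x - (u := random.random()), 0))
    * min(1, max(y - u, 0))
    for _ in range(n)
) / n

print(direct, cond)  # the two estimates agree to about 1e-2
```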
The evaluation of the latter expectation is straightforward but tedious, so I asked Mathematica for help:
Best Answer
Because $X_2$ is almost surely positive, the cumulative distribution function of $Z=X_1/X_2$ is
$$ F_Z(z) = \mathbb{P}(Z \leqslant z) = \mathbb{P}(X_1 \leqslant z X_2) = \mathbb{E}_{X_2}\left( \mathbb{P}(X_1 \leqslant z X_2 \mid X_2)\right) = \mathbb{E}_{X_2}\left( F_{X_1}(z X_2)\right) $$
Clearly $F_Z(z)=0$ for $z\leqslant 0$, so assume $z > 0$:
$$ F_Z(z) = \int_0^1 \left\{ \begin{array}{cl} z x_2 & 0 < x_2 <1/z \\ 1 & x_2 > 1/z \end{array} \right. \mathrm{d} x_2 = \frac{z}{2} \left(\frac{1}{\max(z,1)}\right)^2 + \left(1 - \frac{1}{\max(z,1)}\right) = \left\{\begin{array}{cl} \frac{z}{2} & 0< z\leqslant 1 \\ 1-\frac{1}{2z} & z > 1 \end{array} \right. $$
The probability density is obtained by differentiation.
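The closed form can be checked by simulation. A minimal sketch, assuming (as the integral over $(0,1)$ suggests) that $X_1$ and $X_2$ are i.i.d. Uniform$(0,1)$:

```python
import random

# Piecewise closed form derived above:
# F_Z(z) = z/2 for 0 < z <= 1, and 1 - 1/(2z) for z > 1.
def F_Z(z):
    if z <= 0:
        return 0.0
    return z / 2 if z <= 1 else 1 - 1 / (2 * z)

# Monte Carlo check with X1, X2 i.i.d. Uniform(0,1)
# (an assumption read off from the integration range).
random.seed(2)
n = 200_000
samples = [random.random() / random.random() for _ in range(n)]
for z in (0.3, 1.0, 2.5):
    emp = sum(s <= z for s in samples) / n
    print(z, emp, F_Z(z))  # empirical cdf matches the formula to ~1e-2
```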