Is the distribution of $(X_1,X_2)$ determined by the distributions of $X_1$, $X_2$ and $X_1+X_2$?
In the discrete case, a simple dimensional analysis shows that the answer is "No", in general.
Assume that $(X_1,X_2)$ takes values in $\{0,1,\ldots,n\}\times\{0,1,\ldots,m\}$; then $X_1+X_2$ takes values in $\{0,1,\ldots,n+m\}$. The distributions of $X_1$, $X_2$, $X_1+X_2$ and $(X_1,X_2)$ depend on $n$, $m$, $n+m$ and $nm+n+m$ free parameters respectively (a distribution on $k+1$ points has $k$ free parameters). Hence the first three distributions cannot determine the last one when $nm+n+m\gt n+m+(n+m)$, that is, when $nm\gt n+m$, or equivalently $(n-1)(m-1)\geqslant2$, for example when $\{n,m\}=\{2,3\}$.
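For a concrete numerical check of this counting argument (a sketch of mine, not part of the original answer), one can take $n=2$, $m=3$, build the linear map sending a joint pmf to the three lower-dimensional distributions, and pick a nonzero perturbation in its null space; this produces two different joint laws with the same distributions of $X_1$, $X_2$ and $X_1+X_2$:

```python
import numpy as np

# Sketch: with n = 2, m = 3, exhibit two different joint pmfs on
# {0,...,n} x {0,...,m} having the same distributions of X1, X2 and X1+X2.
n, m = 2, 3
rows = []
for i in range(n + 1):                      # constraints fixing the distribution of X1
    e = np.zeros((n + 1, m + 1)); e[i, :] = 1; rows.append(e.ravel())
for j in range(m + 1):                      # constraints fixing the distribution of X2
    e = np.zeros((n + 1, m + 1)); e[:, j] = 1; rows.append(e.ravel())
for s in range(n + m + 1):                  # constraints fixing the distribution of X1+X2
    e = np.zeros((n + 1, m + 1))
    for i in range(n + 1):
        if 0 <= s - i <= m:
            e[i, s - i] = 1
    rows.append(e.ravel())
A = np.vstack(rows)

# A perturbation D in the null space of A changes the joint law
# without changing any of the three prescribed distributions.
_, sing, Vt = np.linalg.svd(A)
rank = int(np.sum(sing > 1e-10))
D = Vt[rank].reshape(n + 1, m + 1)          # one null-space direction (rank < 12 here)

P = np.full((n + 1, m + 1), 1.0 / ((n + 1) * (m + 1)))   # independent uniform joint
eps = 0.5 * P.min() / np.abs(D).max()
Q = P + eps * D                              # still a pmf: nonnegative, sums to 1

print(np.allclose(P.sum(axis=1), Q.sum(axis=1)))          # same law of X1
print(np.allclose(P.sum(axis=0), Q.sum(axis=0)))          # same law of X2
diag = lambda M, s: sum(M[i, s - i] for i in range(n + 1) if 0 <= s - i <= m)
print(all(np.isclose(diag(P, s), diag(Q, s)) for s in range(n + m + 1)))  # same law of X1+X2
print(not np.allclose(P, Q))                               # yet P != Q
```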
Likewise, the answer is also "No" when the distributions of $X_1$, $X_2$ and $X_1+X_2$ are absolutely continuous. An example is when $X_1$ and $X_2$ are Cauchy with parameter $a$ and $X_1+X_2$ is Cauchy with parameter $2a$: these three distributions are compatible with $(X_1,X_2)$ independent, but also with $X_1=X_2$ with probability one.
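To see why both scenarios produce the same three distributions, one can use characteristic functions (a standard fact about the Cauchy family, not part of the original argument; the Cauchy variables are taken centered, with scale parameter $a$): such a variable has characteristic function $\varphi(t)=e^{-a|t|}$, so in the independent case
$$\varphi_{X_1+X_2}(t)=\varphi_{X_1}(t)\,\varphi_{X_2}(t)=e^{-2a|t|},$$
while in the case $X_1=X_2$,
$$\varphi_{X_1+X_2}(t)=\varphi_{2X_1}(t)=\varphi_{X_1}(2t)=e^{-2a|t|},$$
and $e^{-2a|t|}$ is the characteristic function of a centered Cauchy distribution with parameter $2a$.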
The probability is zero. Write $Y:=X_1-X_2$. By independence, $Y$ has a gaussian distribution with mean $m_1-m_2$ and variance $\sigma^2_1+\sigma^2_2$. The probability you seek is $P(Y=0)$.
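(For completeness, the distribution of $Y$ can be checked with moment generating functions, as in the answer below: if $X_1\sim N(m_1,\sigma_1^2)$ and $X_2\sim N(m_2,\sigma_2^2)$ are independent, then
$$M_Y(t)=M_{X_1}(t)\,M_{X_2}(-t)=e^{m_1t+\sigma_1^2t^2/2}\cdot e^{-m_2t+\sigma_2^2t^2/2}=e^{(m_1-m_2)t+(\sigma_1^2+\sigma_2^2)t^2/2},$$
which is the MGF of $N(m_1-m_2,\,\sigma_1^2+\sigma_2^2)$.)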
As for the error in your argument, the transition to step (6) is incorrect. The probability that a gaussian variable takes any specific value is $0$, so you cannot simply substitute the density for $P(X_1=x)$ or $P(X_2=x)$.
Also, step (1) [and therefore step (4)] is not quite right. If you want to condition on $X_1$, the proper way to do this is to write:
$$P(X_1=X_2) = \int P(X_1=X_2\mid X_1=y)f_{X_1}(y)\,dy$$
Then the gaussian density occurs in the integral, but the other factor $P(X_1=X_2\mid X_1=y)$ is zero for all $y$.
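Indeed, by independence the conditional distribution of $X_2$ given $X_1=y$ is just the distribution of $X_2$, so $P(X_1=X_2\mid X_1=y)=P(X_2=y)=0$, and therefore
$$P(X_1=X_2)=\int 0\cdot f_{X_1}(y)\,dy=0,$$
in agreement with the direct argument via $Y=X_1-X_2$.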
Best Answer
Another way to show this is to use the moment generating function (MGF) and its properties. For independent $X_1\sim N(0,\sigma_1^2)$ and $X_2\sim N(0,\sigma_2^2)$,
$$M_{X_1}(t)=e^{\sigma_1^2 t^2/2}$$
$$M_{X_2}(t)=e^{\sigma_2^2 t^2/2}$$
and, by independence
$$M_{X_1+X_2}(t)=e^{\sigma_1^2 t^2/2}\cdot e^{\sigma_2^2 t^2/2}=e^{(\sigma_1^2+\sigma_2^2) t^2/2}$$
which is the MGF of a $N(0,\,\sigma_1^2+\sigma_2^2)$ distribution. Since the MGF determines the distribution, $X_1+X_2\sim N(0,\,\sigma_1^2+\sigma_2^2)$.
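The same computation covers nonzero means: if $X_1\sim N(\mu_1,\sigma_1^2)$ and $X_2\sim N(\mu_2,\sigma_2^2)$ are independent, then
$$M_{X_1+X_2}(t)=e^{\mu_1t+\sigma_1^2t^2/2}\cdot e^{\mu_2t+\sigma_2^2t^2/2}=e^{(\mu_1+\mu_2)t+(\sigma_1^2+\sigma_2^2)t^2/2},$$
the MGF of $N(\mu_1+\mu_2,\,\sigma_1^2+\sigma_2^2)$.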