Take $X$ and $Y$ independent with $P(X=1)=P(X=2)=P(Y=1)=P(Y=2)=0.5$. Then $$P(X=Y) = P(X=1, Y=1) + P(X=2,Y=2) = P(X=1)P(Y=1)+P(X=2)P(Y=2)=0.5,$$ meaning that $X=Y$ holds with probability $0.5$, not $1$.
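As a quick sanity check, here is a minimal Python simulation of the example above (the variable names are my own, not part of the original argument): two independent variables uniform on $\{1,2\}$ agree about half the time.

```python
import random

random.seed(0)
n = 100_000
# X and Y are independent, each taking the values 1 and 2 with probability 1/2
matches = sum(
    1 for _ in range(n)
    if random.choice([1, 2]) == random.choice([1, 2])
)
print(matches / n)  # close to 0.5, not 1
```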
On the probability space $[0,1]$ with Lebesgue measure, let $X_{2n}$ be the indicator of $[\frac{1}{2},1]$ and $X_{2n+1}$ the indicator of $[0,\frac{1}{2})$; then
$$P(X_{2n}=1)=P \left(\{\omega:\omega\in[\frac{1}{2},1]\}\right)=\frac{1}{2}=P(X_{2n}=0)$$
$$P(X_{2n+1}=1)=P \left(\{\omega:\omega\in[0,\frac{1}{2})\}\right)=\frac{1}{2}=P(X_{2n+1}=0)$$
Now, let $F_{2n}$ be the distribution function of $X_{2n}$: $F_{2n}(x)=P(X_{2n}\leq x)$ equals $0$ for $x<0$, $\frac{1}{2}$ for $x\in[0,1)$, and $1$ for $x\geq 1$. For $F_{2n+1}$ you get the same distribution function. Therefore $F_n$ does not depend on $n$; a constant sequence of distribution functions trivially converges, so $(X_n)$ converges in distribution.
Now, to prove that it does not converge in probability, assume the contrary, i.e. there exists $X$ such that $P(|X_n-X|>\epsilon)\to 0$ for every $\epsilon>0$. Since convergence in probability implies convergence in distribution, and limits in distribution are unique, the distribution function of $X$ is the one obtained above; therefore $X$ takes only the two values $0$ and $1$, each with probability $\frac{1}{2}$.
Note that convergence in probability of the whole sequence implies that the subsequences $X_{2n}$ and $X_{2n+1}$ both converge in probability to $X$, so $X_{2n}-X_{2n+1}$ converges to $0$ in probability. However, $X_{2n}-X_{2n+1}$ equals $1$ on $[\frac{1}{2},1]$ and $-1$ on $[0,\frac{1}{2})$, so it takes the values $\pm 1$, each with probability $\frac{1}{2}$, and does not converge in distribution to $0$. Since convergence in probability implies convergence in distribution, this is a contradiction.
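The key observation — that the difference of consecutive terms is always $\pm 1$ — can be checked with a short Python sketch (sampling $\omega$ uniformly from $[0,1)$; the variable names are mine):

```python
import random

random.seed(1)
n = 10_000
diffs = []
for _ in range(n):
    omega = random.random()          # omega uniform on [0, 1)
    x_even = 1 if omega >= 0.5 else 0  # X_{2n}: indicator of [1/2, 1]
    x_odd = 1 if omega < 0.5 else 0    # X_{2n+1}: indicator of [0, 1/2)
    diffs.append(x_even - x_odd)

# |X_{2n} - X_{2n+1}| = 1 for every omega, so the difference never gets near 0
print(all(abs(d) == 1 for d in diffs))       # True
print(sum(1 for d in diffs if d == 1) / n)   # about 0.5
```

So $P(|X_{2n}-X_{2n+1}|>\epsilon)=1$ for any $\epsilon<1$, which is exactly why the difference cannot tend to $0$ in probability.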
Best Answer
If rv $U$ is uniformly distributed on $(-1,1)$ then so is $-U$.
They have the same distribution, but $U=-U\iff U=0$ and $P(U=0)=0$, so $P(U=-U)=0$.
If you want this under the extra condition that the random variables are independent, then let $(X,Y)$ be uniformly distributed on $(0,1)^2$. Then $X$ and $Y$ are independent and both uniformly distributed on $(0,1)$ (so they are iid). However, $P(X=Y)=0$, since the diagonal $\{x=y\}$ has Lebesgue measure zero in the unit square.
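A minimal Python sketch of this last example (again, names and sample sizes are my own): two independent uniforms on $(0,1)$ are essentially never exactly equal.

```python
import random

random.seed(2)
n = 100_000
# X, Y iid uniform on (0, 1): the event {X = Y} has probability 0,
# so in a finite sample we expect no exact ties at all
equal = sum(
    1 for _ in range(n)
    if random.random() == random.random()
)
print(equal / n)
```

Contrast this with the discrete example at the top, where the atoms at $1$ and $2$ make $P(X=Y)=0.5$.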