To expand on whuber's comment on the OP's question
and the discussion thereafter,
when $X$ and $Y$ are independent random
variables, so are $X$ and $-Y$. Since $Y$
has a *symmetric* distribution, meaning that the (marginal)
distribution of $-Y$ is
the same as the (marginal) distribution of $Y$, it is also true that
the *joint* distribution of $(X,Y)$ (which, because of independence,
is the product of the marginal
distributions of $X$ and $Y$) is the *same* as the joint distribution of
$(X,-Y)$ (which is the product of the marginal distributions of $X$ and
$-Y$ since $X$ and $-Y$ are also independent). Consequently, the
distribution of $XY$ is the *same* as the
distribution of $X(-Y) = -XY$, that is, $XY$ has a symmetric distribution.
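This is easy to check by simulation. A minimal sketch, with an assumed example not from the question: $X \sim \text{Exponential}(1)$ (asymmetric) independent of $Y \sim N(0,1)$ (symmetric).

```python
import numpy as np

# Monte Carlo check: X ~ Exponential(1) (asymmetric, arbitrary choice)
# independent of Y ~ N(0, 1) (symmetric about 0).
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(1.0, size=n)
y = rng.standard_normal(n)
z = x * y

# If Z = XY and -XY have the same distribution, the two tail
# probabilities below should agree for every threshold t.
for t in (0.5, 1.0, 2.0):
    print(t, np.mean(z > t), np.mean(z < -t))
```

The printed pairs agree to within Monte Carlo error, consistent with $XY$ being symmetric even though $X$ itself is far from symmetric.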

This result need not hold when $X$ and $Y$ are *dependent*
random variables: that the marginal distribution of $Y$ is symmetric
does not guarantee that the joint distribution of $(X,Y)$ is the
same as the joint distribution of $(X,-Y)$. As whuber points out,
in the extreme case of $X = Y$, $XY = X^2$ cannot take on negative values and
so cannot have the same distribution as $-XY=-X^2$ which cannot take
on positive values.
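The extreme case is immediate to verify numerically (taking $X \sim N(0,1)$ as an assumed illustration):

```python
import numpy as np

# The dependent extreme case X = Y from the text: Z = X^2 is never
# negative, so it cannot have the same distribution as -Z.
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
z = x * x
print(np.min(z) >= 0, np.mean(z < 0))  # True, 0.0
```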

For the special case when $X$ and $Y$ are jointly continuous and
thus $XY = Z$ is a continuous random variable (as in Arthur's answer),
note that for $z > 0$,
$$\begin{align}
P\{Z > z\} &= \int_{x=0}^\infty \int_{y=\frac zx}^\infty f_{X,Y}(x,y)
\,\mathrm dy\, \mathrm dx
+ \int_{x=-\infty}^0\int_{y=-\infty}^{\frac zx} f_{X,Y}(x,y)
\,\mathrm dy\, \mathrm dx\\
P\{Z < -z\} &= \int_{x=0}^\infty \int_{y=-\infty}^{\frac{-z}{x}} f_{X,Y}(x,y)
\,\mathrm dy\, \mathrm dx
+ \int_{x=-\infty}^0\int_{y=\frac{-z}{x}}^{\infty} f_{X,Y}(x,y)
\,\mathrm dy\, \mathrm dx
\end{align}$$
which upon differentiating with respect to $z$ leads us to
$$f_Z(z) = \int_{x=-\infty}^\infty \frac{1}{|x|}f_{X,Y}\left(x,\frac zx\right)
\, \mathrm dx, \quad -\infty < z < \infty.$$
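This density formula can be sanity-checked numerically. A sketch under the assumption (not from the question) that $X$ and $Y$ are i.i.d. $N(0,1)$, so that $f_{X,Y}(x,y)=\varphi(x)\varphi(y)$:

```python
import numpy as np

# Check f_Z(z) = integral of |x|^{-1} f_{X,Y}(x, z/x) dx against a
# Monte Carlo density estimate, for the assumed case X, Y i.i.d. N(0,1).
def phi(u):
    return np.exp(-u * u / 2) / np.sqrt(2 * np.pi)

def f_Z(z):
    # The integrand is even in x, so integrate over x > 0 and double.
    # For z != 0 it vanishes rapidly as x -> 0+ and as x -> infinity.
    xg = np.linspace(1e-4, 20.0, 200_000)
    vals = phi(xg) * phi(z / xg) / xg
    return 2 * np.sum(vals) * (xg[1] - xg[0])

rng = np.random.default_rng(5)
zs = rng.standard_normal(2_000_000) * rng.standard_normal(2_000_000)

for z in (0.5, 1.0):
    mc = np.mean(np.abs(zs - z) < 0.01) / 0.02  # density estimate in a thin bin
    print(z, f_Z(z), f_Z(-z), mc)               # f_Z(z) = f_Z(-z), both close to mc
```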
From this, we get that $f_Z(z) = f_Z(-z)$ holds whenever $f_{X,Y}(x,y)$
enjoys the property that $f_{X,Y}(x,y) = f_{X,Y}(x,-y)$ for all
$x,y \in (-\infty, \infty)$; $X$ and $Y$ need not be independent.
For example, this property holds if $(X,Y)$ is uniformly distributed on the
interior of the triangle with vertices at $(0,1)$, $(0,-1)$, and $(1,0)$.
Note that $f_{X,Y}(x,y) = f_{X,Y}(x,-y)$ implies that $f_Y(y)$ is an
even function of $y$, that is, the distribution of $Y$ is symmetric.
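The triangle example above can be checked by simulation; the sketch below samples the triangle by rejection from its bounding box:

```python
import numpy as np

# (X, Y) uniform on the triangle with vertices (0,1), (0,-1), (1,0),
# i.e. 0 <= x <= 1 and |y| <= 1 - x. Here f(x,y) = f(x,-y), yet X and
# Y are dependent.
rng = np.random.default_rng(2)
n = 2_000_000
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(-1.0, 1.0, n)
keep = np.abs(y) <= 1.0 - x       # rejection sampling from the bounding box
x, y = x[keep], y[keep]
z = x * y

# Dependence: the spread of Y shrinks as X grows, so Corr(X, Y^2) < 0.
print(np.corrcoef(x, y ** 2)[0, 1])
# Symmetry of Z = XY: tail probabilities on both sides should match.
for t in (0.05, 0.1, 0.2):
    print(t, np.mean(z > t), np.mean(z < -t))
```

The correlation between $X$ and $Y^2$ is clearly negative (so $X$ and $Y$ are not independent), while the left and right tail probabilities of $XY$ agree.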

For the special case when $X$ and $Y$ are *independent* random
variables, we have that
$f_{X,Y}(x,y) = f_X(x)f_Y(y)$ equals $f_{X,Y}(x,-y)=f_X(x)f_Y(-y)$
whenever $f_Y(y) = f_Y(-y)$ for all $y$.

To say that a random variable $W$ "has a symmetric distribution around zero" is saying that $W$ and $-W$ have the same distribution.

Let $X$ be another random variable and set $Y=WX.$ By the rules of algebra, $-Y = -WX = (-W)X.$ *Provided* $W$ and $X$ are independent, $(-W)X$ has the same distribution as $WX,$ because $-W$ has the same distribution as $W.$ Therefore $Y$ is symmetric around zero.

To see why independence is a necessary assumption, let $(W,X)$ take on the values $(-1,0),$ $(-1,1),$ $(1,0),$ and $(1,1)$ with probabilities $1/3,1/6,1/6,$ and $1/3,$ respectively. Because $W$ has equal chance of being $\pm 1,$ it is symmetric around zero. ($X$ is symmetric about $1/2.$) But $Y=WX$ takes on the value $0$ with probability $1/3+1/6,$ the value $-1$ with probability $1/6,$ and the value $1$ with probability $1/3,$ and therefore is not symmetric at all.

$$\begin{array}{rrrr}
\Pr & W & X & Y \\
\hline
\frac{1}{3} & -1 & 0 & 0\\
\frac{1}{6} & -1 & 1 & -1\\
\frac{1}{6} & 1 & 0 & 0\\
\frac{1}{3} & 1 & 1 & 1
\end{array}$$
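The table lends itself to an exact check with rational arithmetic:

```python
from fractions import Fraction as F

# Exact computation from the joint pmf of (W, X) in the table above.
pmf = {(-1, 0): F(1, 3), (-1, 1): F(1, 6), (1, 0): F(1, 6), (1, 1): F(1, 3)}

p_w, p_y = {}, {}
for (w, x), p in pmf.items():
    p_w[w] = p_w.get(w, 0) + p      # marginal of W
    y = w * x
    p_y[y] = p_y.get(y, 0) + p      # distribution of Y = WX

print(p_w)   # W is -1 or 1, each with probability 1/2: symmetric
print(p_y)   # P(Y = -1) = 1/6 differs from P(Y = 1) = 1/3: not symmetric
```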

## Best Answer

You probably need to add the condition that $\epsilon$ and $X$ are independent; otherwise the conclusion may not hold. For example, let $X, X'$ be i.i.d. $N(0, 1)$ and set $\epsilon = 2I_{\{X - X' > 0\}} - 1$. Since $X - X' \sim N(0, 2)$, on one hand $P(X - X' \leq 1) = \Phi(1/\sqrt{2})$. On the other hand, \begin{align*} P(\epsilon(X - X') \leq 1) &= P(X - X' \leq 1, X - X' > 0) + P(X - X' \geq -1, X - X' \leq 0) \\ &= P(0 < X - X' \leq 1) + P(-1 \leq X - X' \leq 0) \\ &= \Phi(1/\sqrt{2}) - \frac{1}{2} + \frac{1}{2} - \Phi(-1/\sqrt{2}) \\ &= 2\Phi(1/\sqrt{2}) - 1 \neq \Phi(1/\sqrt{2}). \end{align*}
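A quick Monte Carlo confirmation of this counterexample:

```python
import numpy as np
from math import erf, sqrt

# eps = sign(X - X') is a fair coin, but it is NOT independent of X - X'.
rng = np.random.default_rng(3)
n = 2_000_000
d = rng.standard_normal(n) - rng.standard_normal(n)   # X - X' ~ N(0, 2)
eps = np.where(d > 0, 1.0, -1.0)

Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))      # standard normal cdf
print(np.mean(d <= 1), Phi(1 / sqrt(2)))                # both ~ 0.760
print(np.mean(eps * d <= 1), 2 * Phi(1 / sqrt(2)) - 1)  # both ~ 0.520
```

Here $\epsilon(X - X') = |X - X'|$, so its distribution function at $1$ is $2\Phi(1/\sqrt{2}) - 1 \approx 0.52$, not $\Phi(1/\sqrt{2}) \approx 0.76$.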

When $\epsilon$ and $X$ are assumed independent, the conclusion indeed holds: for any $x \in \mathbb{R}$, \begin{align*} P(\epsilon(X - X') \leq x) &= P(X - X' \leq x, \epsilon = 1) + P(X - X' \geq -x, \epsilon = -1) \\ &= P(X - X' \leq x)P(\epsilon = 1) + P(X - X' \geq -x)P(\epsilon = -1) \\ &= \frac{1}{2}P(X - X' \leq x) + \frac{1}{2}P(X - X' \geq -x) \\ &= P(X - X' \leq x). \end{align*} The last equality holds because if $F$ is the common distribution function of $X$ and $X'$, then \begin{align*} & P(X - X' \leq x) = \int_{\mathbb{R}}P(X - t \leq x)~\mathrm dF(t) = \int_{\mathbb{R}}F(x + t)~\mathrm dF(t), \\ & P(X - X' \geq -x) = \int_{\mathbb{R}}P(t - X' \geq -x)~\mathrm dF(t) = \int_{\mathbb{R}}F(x + t)~\mathrm dF(t). \end{align*}
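The independent case can likewise be checked by simulation (taking $X, X'$ i.i.d. $N(0,1)$ as an assumed choice; any common distribution works):

```python
import numpy as np

# With eps an independent fair sign, eps*(X - X') should match X - X'
# in distribution, as derived above.
rng = np.random.default_rng(4)
n = 2_000_000
d = rng.standard_normal(n) - rng.standard_normal(n)
eps = rng.choice([-1.0, 1.0], size=n)   # independent of d

for t in (-1.0, 0.0, 1.0):
    print(t, np.mean(eps * d <= t), np.mean(d <= t))  # pairs agree
```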