The covariance between $X$ and $Y$ is defined as
$$
\mathrm{Cov}(X,Y)={\rm E}[(X-\mu_X)(Y-\mu_Y)],
$$
where $\mu_X={\rm E}[X]$ and $\mu_Y={\rm E}[Y]$ are the two means. By the Cauchy-Schwarz inequality we have
$$
\begin{align}
|\mathrm{Cov}(X,Y)|&=|{\rm E}[(X-\mu_X)(Y-\mu_Y)]|\leq {\rm E}\left[(X-\mu_X)^2\right]^{1/2}{\rm E}\left[(Y-\mu_Y)^2\right]^{1/2}\\
&=\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}.
\end{align}
$$
where
$$
\mathrm{Var}(X)\mathrm{Var}(Y)=\left\{\int (x-\mu_X)^2\, f_X(x)\,\mathrm dx\right\}\cdot\left\{\int (y-\mu_Y)^2f_Y(y)\,\mathrm dy\right\}.
$$
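The bound is easy to probe numerically. A minimal sketch, using an illustrative correlated pair $Y = X + \text{noise}$ (an assumption for concreteness; any joint distribution would do):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# an arbitrary correlated pair for illustration: Y = X + independent noise
x = rng.standard_normal(n)
y = x + 0.5 * rng.standard_normal(n)

cov = np.cov(x, y)[0, 1]                          # sample Cov(X, Y)
bound = np.sqrt(x.var(ddof=1) * y.var(ddof=1))    # sqrt(Var(X) Var(Y))
print(abs(cov), bound)  # |Cov(X,Y)| should not exceed the bound
```

Here $\mathrm{Cov}(X,Y)=\mathrm{Var}(X)=1$, while the bound is $\sqrt{1\cdot 1.25}\approx 1.118$, so the inequality is strict.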
Every value between the lower bound $-\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}$ and the upper bound $\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}$ can be realized. To see this, let $X,X'$ be i.i.d. and let $U$ be a Bernoulli variable independent of $(X,X')$ with $P(U=1)=p$. If we set $Y=UX+(1-U)X'$, then
$$
\mathrm{Cov}(Y,X)=\mathrm{Cov}(UX,X)+\mathrm{Cov}((1-U)X',X)=\mathrm{Cov}(UX,X),
$$
since $(1-U)X'$ is a function of $(U,X')$ and hence independent of $X$, so the second covariance vanishes. Expanding the remaining term, we obtain
$$
\mathrm{Cov}(Y,X)={\rm E}[UX^2]-{\rm E}[UX]\,{\rm E}[X]=p\,\mathrm{Var}(X)=p\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}
$$
since $X\sim Y$, so $\mathrm{Var}(Y)=\mathrm{Var}(X)$. Letting $p$ vary between $0$ and $1$, we can realize every value between $0$ and the upper bound $\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}$.
Finally, let $Y=-(UX+(1-U)X')$. If additionally the distribution of $X$ is symmetric about $0$ (so that $-X\sim X$), we still have $X\sim Y$, but now
$$
\mathrm{Cov}(X,Y)=-p\sqrt{\mathrm{Var}(X)\mathrm{Var}(Y)}.
$$
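The construction above is easy to check by simulation. A minimal sketch, assuming for concreteness a standard normal $X$ (the argument itself does not fix the distribution, and normality also covers the symmetric case):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1_000_000, 0.3

# X, X' i.i.d.; U ~ Bernoulli(p) independent of (X, X')
x  = rng.standard_normal(n)
xp = rng.standard_normal(n)
u  = rng.random(n) < p

y = np.where(u, x, xp)          # Y = U*X + (1-U)*X'

cov = np.cov(x, y)[0, 1]
bound = np.sqrt(x.var(ddof=1) * y.var(ddof=1))
print(cov, p * bound)           # the two numbers should nearly agree
```

Replacing `y` by `-y` flips the sign of the sample covariance, matching the last display.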
If we know the distribution of $X_2$, we can find the distribution of $-X_2$. I tried to do it as straightforwardly as possible. I write $F_\xi(y)$ for the CDF of a random variable $\xi$.
$$F_{-X_2}(y)=\mathbb{P}(-X_2\leq y)=\mathbb{P}(X_2 \geq -y)=1-\mathbb{P}(X_2< -y)=1-F_{X_2}(-y),$$
where the last equality uses that $X_2$ is continuous, so $\mathbb{P}(X_2 < -y)=\mathbb{P}(X_2\leq -y)=F_{X_2}(-y)$.
Now, to find the density of $-X_2$, you can just differentiate $F_{-X_2}(y)$:
$$f_{-X_2}(y)=\frac{d}{dy}\left(F_{-X_2}(y)\right)=\frac{d}{dy}\left(1-F_{X_2}(-y)\right)=f_{X_2}(-y)=\mathbf{1}_{\{-\ln4\leq y\leq 0\}}(y)\,e^{\frac{y}{2}}$$
Here $\mathbf{1}_{A}(y)$ denotes the indicator function of the set $A$: it equals $1$ whenever $y \in A$ and $0$ otherwise. It is simply a compact way to write a piecewise function. Without the indicator, the density reads
$$f_{-X_2}(y)=\mathbf{1}_{\{-\ln4\leq y\leq 0\}}(y) \cdot e^{\frac{y}{2}} = \begin{cases}e^{\frac{y}{2}}, & -\ln4\leq y\leq 0, \\ 0, & \text{otherwise.} \end{cases}$$
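A quick numerical sanity check that this is a valid density, using a simple midpoint-rule quadrature over the support (plain Python, no external libraries):

```python
import math

# density of -X_2 derived above: f(y) = e^{y/2} on [-ln 4, 0], else 0
def f(y):
    return math.exp(y / 2.0) if -math.log(4) <= y <= 0 else 0.0

# midpoint rule on the support; exact value is 2(1 - e^{-ln(4)/2}) = 1
a, b, n = -math.log(4), 0.0, 100_000
h = (b - a) / n
total = h * sum(f(a + (k + 0.5) * h) for k in range(n))
print(total)  # ≈ 1.0
```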
Therefore you have two random variables; let's denote them $X:=X_1$ and $Y := -X_2$, and you need the PDF of $X+Y$. Since you know both densities, you can just use the convolution formula for the sum, as you wanted.
UPDATE.
So that you can check your work, I have added my calculation of $f_{X+Y}(y)$. I will use the indicator function during the integrations to keep the derivations short.
So far we have $f_X(x) = \mathbf{1}_{\{0 \leq x \leq \ln 4\}}(x)\,e^{-\frac{x}{2}}$ and $f_Y(y) = \mathbf{1}_{\{-\ln4 \leq y \leq 0\}}(y)\,e^{\frac{y}{2}}$.
Using the convolution formula, we get:
$$
\begin{align}
f_{X+Y}(y)&=\int\limits_{\mathbb{R}}f_X(x)\,f_Y(y-x)\,dx = \int\limits_{\mathbb{R}}\mathbf{1}_{\{0 \leq x \leq \ln 4\}}(x)\,e^{-\frac{x}{2}}\,\mathbf{1}_{\{-\ln4 \leq y-x \leq 0\}}(y-x)\,e^{\frac{y-x}{2}}\,dx \\
&=e^{\frac{y}{2}} \int\limits_{\mathbb{R}}\mathbf{1}_{\{0 \leq x \leq \ln 4\}}(x)\,\mathbf{1}_{\{y \leq x \leq \ln4+y\}}(x)\,e^{-x}\,dx = e^{\frac{y}{2}}\int\limits_{\max\{0,\,y\}}^{\min\{\ln4,\;\ln4+y\}}e^{-x}\,dx
\end{align}
$$
To evaluate this integral I will consider the different cases.
If $0 \leq y \leq \ln 4$, we have:
$$f_{X+Y}(y)=e^{\frac{y}{2}}\int\limits_{y}^{\ln4}e^{-x}dx=-\frac{1}{4}e^{\frac{y}{2}} + e^{-\frac{y}{2}}$$
If $-\ln4 \leq y < 0$, we have:
$$f_{X+Y}(y)=e^{\frac{y}{2}}\int\limits_{0}^{\ln4+y}e^{-x}dx=e^{\frac{y}{2}}-\frac{1}{4}e^{-\frac{y}{2}}$$
If $y > \ln 4$ or $y < -\ln 4$, then $f_{X+Y}(y)=0$.
Finally,
$$f_{X+Y}(y) = \mathbf{1}_{\{0 \leq y \leq \ln 4 \}}(y) \cdot \left(-\frac{1}{4}e^{\frac{y}{2}} + e^{-\frac{y}{2}}\right) + \mathbf{1}_{\{-\ln 4 \leq y < 0 \}}(y) \cdot\left(e^{\frac{y}{2}}-\frac{1}{4}e^{-\frac{y}{2}}\right)$$
We can check that it integrates to $1$:
$$\int\limits_{\mathbb{R}}\mathbf{1}_{\{0 \leq y \leq \ln 4 \}}(y) \cdot \left(-\frac{1}{4}e^{\frac{y}{2}} + e^{-\frac{y}{2}}\right)dy = \int\limits_{0}^{\ln 4}(-\frac{1}{4}e^{\frac{y}{2}} + e^{-\frac{y}{2}})dy = \frac{1}{2},$$
$$\int\limits_{\mathbb{R}}\mathbf{1}_{\{-\ln 4 \leq y < 0 \}}(y) \cdot\left(e^{\frac{y}{2}}-\frac{1}{4}e^{-\frac{y}{2}}\right)dy = \int\limits_{-\ln 4}^{0}(e^{\frac{y}{2}}-\frac{1}{4}e^{-\frac{y}{2}})dy = \frac{1}{2}$$
Hence, by additivity of the integral,
$$\int\limits_{\mathbb{R}}f_{X+Y}(y)dy=\frac{1}{2}+\frac{1}{2}=1$$
Without the indicator function, we can rewrite it as follows:
$$f_{X+Y}(y)=\begin{cases}e^{\frac{y}{2}}-\frac{1}{4}e^{-\frac{y}{2}}, & -\ln4 \leq y < 0, \\ -\frac{1}{4}e^{\frac{y}{2}} + e^{-\frac{y}{2}}, & 0 \leq y \leq \ln 4, \\ 0, & \text{otherwise.}\end{cases}$$
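The closed-form density can also be checked against simulation. A sketch, sampling $X_1$ and $X_2$ by inverting the CDF $F(x)=2(1-e^{-x/2})$ on $[0,\ln 4]$, and comparing the empirical probability of $\{|X_1-X_2|\leq 1\}$ with the value implied by the density (which is symmetric in $y$):

```python
import math, random

random.seed(0)

def sample_x():
    """Inverse-CDF sample from f(x)=e^{-x/2} on [0, ln 4]; F(x)=2(1-e^{-x/2})."""
    u = random.random()
    return -2.0 * math.log(1.0 - u / 2.0)

n = 200_000
s = [sample_x() - sample_x() for _ in range(n)]   # X1 + Y with Y = -X2

emp = sum(abs(v) <= 1.0 for v in s) / n

# integral of -e^{y/2}/4 + e^{-y/2} over [0, 1] is 2.5 - e^{1/2}/2 - 2e^{-1/2};
# doubling gives P(|X1 - X2| <= 1) by symmetry of the density
exact = 2.0 * (2.5 - 0.5 * math.exp(0.5) - 2.0 * math.exp(-0.5))
print(emp, exact)
```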
Best Answer
You could directly use $Y = 1-X$ (with $X$ uniform on $[0,1]$, so that $E[X]=\frac{1}{2}$ and $E[X^2]=\frac{1}{3}$) to obtain $$E[XY] = E[X(1-X)] = E[X] - E[X^2] = \frac{1}{2} - \frac{1}{3} = \frac{1}{6}.$$
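A short Monte Carlo check of this value (assuming $X\sim\mathrm{Unif}(0,1)$, as the moments above suggest):

```python
import random

random.seed(1)
n = 500_000
est = 0.0
for _ in range(n):
    x = random.random()       # X ~ Uniform(0, 1)
    est += x * (1 - x)        # X * Y with Y = 1 - X
est /= n
print(est)  # ≈ 1/6 ≈ 0.1667
```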