First of all, a bit of intuition. The covariance of two random variables is a statistic that tells you how the two variables vary together. If two random variables are independent, then their covariance is zero (though the converse fails: zero covariance does not imply independence). If their covariance is nonzero, its sign and magnitude give an indication of how, and how strongly, the variables depend on each other.
Now, onto your problem.
I think you might find some use in the general formula for calculating expectations. For any random variable $X$ taking discrete values in $\mathbb{N}$, and any nonnegative measurable function $f$, you have
$$
\mathbb{E}[f(X)]=\sum_{k\in\mathbb{N}}f(k)\mathbb{P}(X=k).
$$
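As a minimal sketch of that formula (the die and the choice of $f$ here are illustrative assumptions, not part of the original problem), take $X$ a fair six-sided die and $f(k)=k^2$:

```python
from fractions import Fraction

# Illustration: X a fair six-sided die, f(k) = k^2, so that
# E[f(X)] = sum over k of f(k) * P(X = k).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def expectation(f, pmf):
    """E[f(X)] for a discrete random variable with the given pmf."""
    return sum(f(k) * p for k, p in pmf.items())

print(expectation(lambda k: k * k, pmf))  # 91/6
```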
In your specific example, $Z=XY$ is a single new random variable, not the pair $(X,Y)$ as you seem to write in your example.
So, $Z$ is a random variable. What values does it take? Let us look at all the possibilities:
$$
\mathbb{P}(Z=1)=\mathbb{P}(X=1,Y=1)=0,
$$
because if $X=1$ then $Y=0$. Similarly,
$$
\mathbb{P}(Z=2)=\mathbb{P}(X=2,Y=1)=\mathbb{P}(X=2)=\frac16,
$$
because if $X=2$ then $Y=1$. Continuing the process, you find
$$
\mathbb{P}(Z=3)=\mathbb{P}(Z=5)=0,\text{ and }\mathbb{P}(Z=4)=\mathbb{P}(Z=6)=\frac16.
$$
It remains to note that $Z$ can also take the value 0, and this happens whenever $Y=0$:
$$
\mathbb{P}(Z=0)=\mathbb{P}(X=1)+\mathbb{P}(X=3)+\mathbb{P}(X=5)=\frac16+\frac16+\frac16=\frac12.
$$
Once you know all the probabilities characterizing the distribution of $Z$, you can compute its expectation according to the previous formula:
$$
\mathbb{E}[Z]=2\cdot\frac16+4\cdot\frac16+6\cdot\frac16+0\cdot\frac12=\frac{12}6=2.
$$
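The whole computation above can be sketched by enumerating the six outcomes, assuming (as the derivation suggests) that $X$ is a fair die roll and $Y$ is the indicator of "$X$ is even":

```python
from fractions import Fraction
from collections import defaultdict

# Sketch of the derivation above: X a fair die roll, Y = 1 if X is
# even and 0 otherwise, Z = XY. Accumulate the pmf of Z outcome by
# outcome, then compute E[Z] from it.
pmf_Z = defaultdict(Fraction)
for x in range(1, 7):
    y = 1 if x % 2 == 0 else 0          # Y is the indicator of "X is even"
    pmf_Z[x * y] += Fraction(1, 6)      # each die face has probability 1/6

E_Z = sum(z * p for z, p in pmf_Z.items())
print(E_Z)  # 2
```

This recovers $\mathbb{P}(Z=0)=\frac12$, $\mathbb{P}(Z=2)=\mathbb{P}(Z=4)=\mathbb{P}(Z=6)=\frac16$, and $\mathbb{E}[Z]=2$.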
Where you went wrong is in the step where you say $X$ and $Y$ are independent (true) but then use that as if $2Y-X$ and $Y$ were independent (false). Indeed, specifying the value of $Y$ gives you some information about the value of $2Y-X$.
So you need to start with the equation you said you would like to proceed with:
$$
\mbox{cov }(Y,2Y-X) = E[ Y(2Y-X)] -E[Y]E[2Y-X] \\
\mbox{cov }(Y,2Y-X) = 2E[ Y^2] - E[YX] - 2E[Y]^2 + E[Y]E[X]
$$
Now with your probability distribution as given:
$$
E[Y] = \frac34(0) + \frac14(1) = \frac14\\
E(Y^2) = \frac34(0^2) + \frac14(1^2) = \frac14\\
E[X] = \frac12(-1)+\frac14(0)+\frac14(1) = -\frac14\\
E[YX]=\frac34\frac12(0\cdot (-1)) + \frac34\frac14(0\cdot (0)) + \frac34\frac14(0\cdot (1)) \\+ \frac14\frac12(1\cdot (-1)) + \frac14\frac14(1\cdot (0)) + \frac14\frac14(1\cdot (1)) = -\frac18+\frac1{16} =
-\frac1{16}
$$
And then
$$
\mbox{cov }(Y,2Y-X) = 2E[ Y^2] - E[YX] - 2E[Y]^2 + E[Y]E[X] \\=
2\cdot \frac14 - \left(-\frac1{16}\right) - 2\cdot\frac1{16} + \frac14\cdot\left(-\frac14\right) = \frac12 - \frac18 = \frac38
$$
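The result can be double-checked by brute force over the joint distribution, using the distributions as given above ($X$ takes $-1,0,1$ with probabilities $\frac12,\frac14,\frac14$; $Y$ takes $0,1$ with probabilities $\frac34,\frac14$; $X$ and $Y$ independent):

```python
from fractions import Fraction

# Brute-force check of cov(Y, 2Y - X) over the independent joint
# distribution pX x pY given in the problem.
pX = {-1: Fraction(1, 2), 0: Fraction(1, 4), 1: Fraction(1, 4)}
pY = {0: Fraction(3, 4), 1: Fraction(1, 4)}

def E(f):
    """E[f(X, Y)] under the product (independent) joint distribution."""
    return sum(f(x, y) * px * py for x, px in pX.items() for y, py in pY.items())

cov = E(lambda x, y: y * (2 * y - x)) - E(lambda x, y: y) * E(lambda x, y: 2 * y - x)
print(cov)  # 3/8
```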
The covariance of two random variables is defined as the expected product of their deviations from their means.
$$\mathsf{Cov}(X,Y)=\mathsf E((X-\mathsf E(X))\,(Y-\mathsf E(Y)))$$
Which gives us that: $$\begin{align}\mathsf{Cov}(X,Y)&=\mathsf E(XY-X\,\mathsf E(Y)-Y\,\mathsf E(X)+\mathsf E(X)\,\mathsf E(Y))\\&=\mathsf E(XY)-\mathsf E(X\,\mathsf E(Y))-\mathsf E(Y\,\mathsf E(X))+\mathsf E(\mathsf E(X)\,\mathsf E(Y))\\&=\mathsf E(XY)-\mathsf E(X)\,\mathsf E(Y)-\mathsf E(Y)\,\mathsf E(X)+\mathsf E(X)\,\mathsf E(Y)\\&=\mathsf E(XY)-\mathsf E(X)\,\mathsf E(Y) \end{align}$$
The covariance of a random variable and itself is called the variance.$$\begin{split}\mathsf{Var}(X)&=\mathsf{Cov}(X,X)\\&=\mathsf E((X-\mathsf E(X))^2)\\&=\mathsf E(X^2)-\mathsf E(X)^2\end{split}$$
The usefulness of covariance is that dividing it by the square root of the product of the variances gives a measure of how linearly dependent the two random variables are. This normalized quantity is the correlation coefficient.
$$\mathsf{Corr}(X,Y)=\dfrac{\mathsf{Cov}(X,Y)}{\sqrt{\mathsf{Var}(X)\,\mathsf{Var}(Y)}}$$
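As a small numeric illustration of these formulas (this joint pmf is an assumption made up for the example, not from the original question):

```python
import math

# Check Cov(X,Y) = E(XY) - E(X)E(Y) on a small joint pmf, then
# normalize by sqrt(Var(X) Var(Y)) to get the correlation coefficient.
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """E[f(X, Y)] under the joint pmf above."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY
corr = cov / math.sqrt((E(lambda x, y: x * x) - EX ** 2)
                       * (E(lambda x, y: y * y) - EY ** 2))
print(cov, corr)
```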