The covariance of two random variables is defined as the expected product of their deviations from their respective means.
$$\mathsf{Cov}(X,Y)=\mathsf E((X-\mathsf E(X))\,(Y-\mathsf E(Y)))$$
Expanding the product and using the linearity of expectation gives:
$$\begin{align}\mathsf{Cov}(X,Y)&=\mathsf E(XY-X\,\mathsf E(Y)-Y\,\mathsf E(X)+\mathsf E(X)\,\mathsf E(Y))\\&=\mathsf E(XY)-\mathsf E(X\,\mathsf E(Y))-\mathsf E(Y\,\mathsf E(X))+\mathsf E(\mathsf E(X)\,\mathsf E(Y))\\&=\mathsf E(XY)-\mathsf E(X)\,\mathsf E(Y)-\mathsf E(Y)\,\mathsf E(X)+\mathsf E(X)\,\mathsf E(Y)\\&=\mathsf E(XY)-\mathsf E(X)\,\mathsf E(Y)
\end{align}$$
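As a quick sanity check, the two forms of the covariance above can be compared numerically on simulated data (a minimal Python sketch; the choice of distribution, seed, and sample size is arbitrary):

```python
import random

random.seed(0)
n = 100_000
# Hypothetical correlated pair: Y = 2X + noise
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + random.gauss(0, 1) for x in xs]

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)

# Definition: E((X - E(X)) (Y - E(Y)))
cov_def = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
# Shortcut: E(XY) - E(X) E(Y)
cov_short = mean([x * y for x, y in zip(xs, ys)]) - mx * my

print(abs(cov_def - cov_short) < 1e-8)  # the two formulas agree
```

For this construction the true covariance is $2\,\mathsf{Var}(X)=2$, and the sample estimate lands close to that.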
The covariance of a random variable with itself is called its variance.$$\begin{split}\mathsf{Var}(X)&=\mathsf{Cov}(X,X)\\&=\mathsf E((X-\mathsf E(X))^2)\\&=\mathsf E(X^2)-\mathsf E(X)^2\end{split}$$
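The variance shortcut can be checked the same way (again a simulation sketch with arbitrary parameters):

```python
import random

random.seed(3)
xs = [random.gauss(5, 2) for _ in range(100_000)]

mean = lambda v: sum(v) / len(v)
m = mean(xs)
var_def = mean([(x - m) ** 2 for x in xs])      # E((X - E(X))^2)
var_short = mean([x * x for x in xs]) - m * m   # E(X^2) - E(X)^2
print(abs(var_def - var_short) < 1e-8)  # the two formulas agree
```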
The covariance is useful because dividing it by the square root of the product of the two variances yields a measure of how linearly dependent the two random variables are. This normalised quantity is the correlation coefficient.
$$\mathsf{Corr}(X,Y)=\dfrac{\mathsf{Cov}(X,Y)}{\sqrt{\mathsf{Var}(X)\,\mathsf{Var}(Y)}}$$
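The correlation coefficient can be estimated the same way; for data with a nearly linear relationship (a hypothetical construction, below) it comes out close to $1$:

```python
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [3 * x - 1 + random.gauss(0, 0.5) for x in xs]  # roughly linear in X

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
vx = mean([(x - mx) ** 2 for x in xs])
vy = mean([(y - my) ** 2 for y in ys])

corr = cov / (vx * vy) ** 0.5
print(corr)  # close to 1 for this nearly linear relationship
```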
1 - Not quite. The covariance (more specifically, the correlation coefficient) measures how much the variables "depend" on each other linearly. A correlation coefficient close to $\pm 1$ shows us that one variable is "almost" a linear function of the other, i.e. there are $a$ and $b$ such that $Y \approx aX+b$. If two random variables are uncorrelated, that does not mean there is no relation of dependence between them - just that this relation cannot be a linear one.
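A standard illustration of that last point: for $X$ standard normal and $Y=X^2$, $Y$ is completely determined by $X$, yet their covariance is zero. A simulation sketch:

```python
import random

random.seed(2)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]  # Y is fully determined by X, but not linearly

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
print(abs(cov) < 0.05)  # covariance ~0 even though Y = X^2 exactly
```

Here $\mathsf{Cov}(X,X^2)=\mathsf E(X^3)-\mathsf E(X)\,\mathsf E(X^2)=0$ by the symmetry of the normal distribution, so uncorrelated does not imply independent.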
2 - Well, you just answered this in your question: use the expectations! We can compute expectations of random variables even when they take an uncountable number of values.
Best Answer
$\newcommand{\E}{\operatorname{\mathbb{E}}}$ $\newcommand{\Cov}{\operatorname{Cov}}$ $\renewcommand{\Pr}{\operatorname{Pr}}$ Just this: write $\mu=\E(X)$ and $\gamma=\E(Y)$. By the law of total expectation, $\E(XY)=\E(\E(XY\mid Y))=\E(Y\,\E(X\mid Y))$, so $$\begin{align} \E(XY) & = \sum_{y} y\,\E(X\mid Y=y)\Pr(Y=y) && \text{($Y$ discrete)} \\ \E(XY) & = \int_{\mathbf Y} y\,\E(X\mid Y=y)\,f_Y(y)\,\mathrm{d}y && \text{($Y$ continuous)} \end{align}$$ and therefore $$\begin{align} \Cov(X,Y) & = \E(XY)-\E(X)\,\E(Y) \\ & = \underbrace{-\mu\gamma + \sum_{y} y\,\E(X\mid Y=y)\Pr(Y=y)}_{\text{discrete random variable}} \\ & = \underbrace{-\mu\gamma + \int_{\mathbf Y} y\,\E(X\mid Y=y)\,f_Y(y)\,\mathrm{d}y}_{\text{continuous random variable}} \end{align}$$
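For a small discrete joint distribution, this decomposition of $\E(XY)$ can be verified directly (the pmf below is a made-up example):

```python
# Hypothetical joint pmf on pairs (x, y); probabilities sum to 1
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Direct computation: E(XY) = sum over (x, y) of x * y * p(x, y)
e_xy = sum(x * y * p for (x, y), p in pmf.items())

# Decomposition: E(XY) = sum_y y * E(X | Y=y) * Pr(Y=y)
total = 0.0
for y0 in {y for (_, y) in pmf}:
    pr_y = sum(p for (x, y), p in pmf.items() if y == y0)
    e_x_given_y = sum(x * p for (x, y), p in pmf.items() if y == y0) / pr_y
    total += y0 * e_x_given_y * pr_y

print(abs(e_xy - total) < 1e-12)  # both routes give the same E(XY)
```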