A joint distribution has domain $(-\infty, \infty) \times (-\infty, \infty)$. If we partition each factor of the Cartesian product in two by selecting some value $x$ and some value $y$, we get $4$ subsets,
$$(-\infty, x] \times (-\infty, y],\;\;(-\infty, x] \times [y,\infty),\\
[x, \infty) \times (-\infty, y],\;\;[x, \infty) \times [y,\infty)$$
made up of intersections of two events,
$$A = \{X\le x\}, \;\; B = \{Y\le y\}$$
and their corresponding complements.
Then (as the OP noted in a comment),
$$\Pr(X\ge x, Y\ge y) = P(A^c\cap B^c) = 1 - P(A\cup B)$$
$$=1-\big[P(A) + P(B) - P(A\cap B)\big]$$
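This inclusion-exclusion identity is easy to check numerically. Below is a minimal Monte Carlo sketch (my own illustration, not from the original post): I draw a correlated pair with the arbitrary construction $Y = 0.6X + 0.8\varepsilon$ and compare the empirical quadrant probability with the right-hand side.

```python
import random

random.seed(0)
n = 200_000
x0, y0 = 0.3, -0.2

# Correlated pair: Y = 0.6*X + 0.8*noise (an arbitrary choice for illustration)
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [0.6 * x + 0.8 * random.gauss(0, 1) for x in xs]

# Empirical Pr(X >= x0, Y >= y0)
p_joint = sum(1 for x, y in zip(xs, ys) if x >= x0 and y >= y0) / n

# Empirical P(A), P(B), P(A ∩ B) with A = {X <= x0}, B = {Y <= y0}
p_a = sum(1 for x in xs if x <= x0) / n
p_b = sum(1 for y in ys if y <= y0) / n
p_ab = sum(1 for x, y in zip(xs, ys) if x <= x0 and y <= y0) / n

# Inclusion-exclusion: 1 - [P(A) + P(B) - P(A ∩ B)]
identity = 1 - (p_a + p_b - p_ab)
assert abs(p_joint - identity) < 0.01
```

Since ties at exactly $x_0$ or $y_0$ have probability zero for continuous variables, the two sides agree up to sampling noise.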
So it appears that by taking the cross-partial derivative of $\Pr(X\ge x, Y\ge y)$ we should again get the joint density. Let's verify that:
$$\Pr(X\ge x, Y\ge y) = \int_x^{\infty}\int_y^{\infty}f(s,t)dtds$$
$$\frac {\partial \Pr(X\ge x, Y\ge y)}{\partial y} = \int_x^{\infty} \left(\frac{\partial}{\partial y}\int_y^{\infty}f(s,t)dt\right)ds $$
$$=\int_x^{\infty}-f(s,y) ds$$
$$\frac {\partial^2 \Pr(X\ge x, Y\ge y)}{\partial y\partial x} = \frac {\partial }{\partial x} \int_x^{\infty}-f(s,y) ds = -\left(-f(x,y)\right) = f(x,y)$$
The above also means that we can obtain the joint pdf from any of the four joint events indicated by this breakdown of the support, but in the two mixed cases we must multiply by $-1$:
$$\begin{align} f(x,y) =& \frac {\partial^2 \Pr(X\le x, Y\le y)}{\partial y\partial x}\\
=&\frac {\partial^2 \Pr(X\ge x, Y\ge y)}{\partial y\partial x}\\
=&-\frac {\partial^2 \Pr(X\le x, Y\ge y)}{\partial y\partial x}\\
=&-\frac {\partial^2 \Pr(X\ge x, Y\le y)}{\partial y\partial x}
\end{align}$$
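These four sign relations can be verified with finite differences. The sketch below (my own check, assuming independent standard normals so that all four quadrant probabilities have closed forms, e.g. $\Pr(X\le x, Y\le y) = \Phi(x)\Phi(y)$) approximates each cross-partial numerically and compares it, with the appropriate sign, to the joint density $\varphi(x)\varphi(y)$.

```python
import math

def Phi(t):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

def phi(t):
    # standard normal PDF
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def cross_partial(g, x, y, h=1e-4):
    # central finite-difference approximation of d^2 g / (dx dy)
    return (g(x + h, y + h) - g(x + h, y - h)
            - g(x - h, y + h) + g(x - h, y - h)) / (4 * h * h)

x, y = 0.4, -0.7
f_true = phi(x) * phi(y)  # joint density under the independence assumption

# The four quadrant probabilities and the sign each cross-partial needs:
events = [
    (lambda s, t: Phi(s) * Phi(t),             +1),  # Pr(X<=x, Y<=y)
    (lambda s, t: (1 - Phi(s)) * (1 - Phi(t)), +1),  # Pr(X>=x, Y>=y)
    (lambda s, t: Phi(s) * (1 - Phi(t)),       -1),  # Pr(X<=x, Y>=y)
    (lambda s, t: (1 - Phi(s)) * Phi(t),       -1),  # Pr(X>=x, Y<=y)
]
for g, sign in events:
    assert abs(sign * cross_partial(g, x, y) - f_true) < 1e-6
```

Independence is only a convenience here so the quadrant probabilities are available in closed form; the sign pattern itself holds for any joint distribution with a continuous density.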
Best Answer
Let $X$, $Y$ be two RVs, and do not assume that they are independent. We now ask questions about the distribution of $X$ given $Y$.
As you stated, the conditional PDF of $X$ given $Y$ is $$f\left(X=x|Y=y\right)=\frac{f\left(X=x,Y=y\right)}{f\left(Y=y\right)}$$
For the ease of understanding, we can define a new continuous variable $Z_{y}$ that is equal in distribution to $X$ for any given $Y=y$, that is:
$$P\left(Z_{y}<z\right)=P\left(X<z|Y=y\right)\,\forall\,z,y$$
and thus we get:
$$f\left(Z_{y}=z\right)=\frac{\partial}{\partial z}P\left(Z_{y}<z\right)=\frac{\partial}{\partial z}P\left(X<z|Y=y\right)=f\left(X=z|Y=y\right)\,\forall\,z,y$$
Note that since we are conditioning on $Y$, when working with $Z_{y}$ we can treat $y$ as a constant. Now we can answer your questions quite easily using our previous knowledge of univariate probability:
$$P\left(X<x|Y=y\right)=P\left(Z_{y}<x\right)=\int_{-\infty}^{x}f\left(Z_{y}=u\right)du=\int_{-\infty}^{x}f\left(X=u|Y=y\right)du=\frac{1}{f\left(Y=y\right)}\cdot\int_{-\infty}^{x}f\left(X=u,Y=y\right)du$$
And the other way around:
$$f\left(X=x|Y=y\right)=f\left(Z_{y}=x\right)=\frac{\partial}{\partial x}P\left(Z_{y}<x\right)\\=\frac{1}{f\left(Y=y\right)}\cdot\frac{\partial}{\partial x}\int_{-\infty}^{x}f\left(X=u,Y=y\right)du=\frac{f\left(X=x,Y=y\right)}{f\left(Y=y\right)}$$
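As a concrete check of this ratio (my own example, not from the original answer), take $(X, Y)$ standard bivariate normal with correlation $\rho$. Then the known conditional distribution is $X \mid Y=y \sim N(\rho y,\, 1-\rho^2)$, and the sketch below confirms that $f(X=x, Y=y)/f(Y=y)$ reproduces exactly that density:

```python
import math

rho = 0.6  # an arbitrary correlation for illustration

def norm_pdf(t, mu=0.0, var=1.0):
    # univariate normal density N(mu, var)
    return math.exp(-(t - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def f_joint(x, y):
    # standard bivariate normal density with correlation rho
    q = (x * x - 2 * rho * x * y + y * y) / (1 - rho ** 2)
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(1 - rho ** 2))

x, y = 0.8, -0.3
f_cond = f_joint(x, y) / norm_pdf(y)                 # f(X=x | Y=y) via the ratio
f_known = norm_pdf(x, mu=rho * y, var=1 - rho ** 2)  # known N(rho*y, 1-rho^2)
assert abs(f_cond - f_known) < 1e-12
```

The agreement is exact up to floating-point precision, since the ratio of the two closed-form densities simplifies algebraically to the conditional normal density.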
The major point I'm trying to make here is that for an RV $X$ and for every given $Y=y$ we can define a new RV $Z_{y}\overset{d}{=}\left\{ X|Y=y\right\}$, which we can treat as a univariate RV with $y$ as a parameter. Everything else works the same as in basic probability.