We can fix $y$ and then consider $f(x,y)$ as a function of $x$ alone (say $h(x)=f(x,y)$ if you want, which makes sense because $y$ is held at a fixed value). Doing this yields the Taylor expansion seen in equation $(1)$. We would say that, by the usual single-variable formula, we have
$$h(x)=h(x_0)+h'(x_0)(x-x_0)+\frac{h''(x_0)}{2!}(x-x_0)^2+\cdots \tag{0}$$
Now $h(x_0)=f(x_0,y)$ and $h'(x_0)=f_x(x_0,y)$ and $h''(x_0)=f_{xx}(x_0,y)$ and so forth. Now notice that this works regardless of what we fixed $y$ at to begin with, so it must hold for all available $y$.
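As a quick sanity check of this "freeze $y$, expand in $x$" step, here is a small SymPy sketch. The particular function $f=e^x\sin y$ and base point $x_0=0$ are my own choices, standing in for any sufficiently nice $f$.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x) * sp.sin(y)   # stand-in for a "sufficiently nice" f
x0 = 0

# Expanding in x with y treated as a constant is exactly the
# one-variable expansion of h(x) = f(x, y) for fixed y.
h_series = sp.series(f, x, x0, 4).removeO()

# The coefficient of (x - x0)^2 should be h''(x0)/2! = f_xx(x0, y)/2!.
diff2 = h_series.coeff(x, 2) - sp.diff(f, x, 2).subs(x, x0) / 2
print(sp.simplify(diff2))   # 0
```

Note that the coefficient is a function of $y$, matching the observation above that the formula holds for every fixed $y$.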
We will switch gears now; since the formula holds for all $y$ we no longer need to consider $y$ fixed.
The Taylor expansion in the variable $x$ seen in $(0)$ involves a number of terms like $f_{xx}(x_0,y)$. Now we know that $x_0$ is fixed but $y$ is not, so this is a function of $y$! Say $g(y)=f_{xx}(x_0,y)$. Then we can speak of its Taylor expansion just as well, $g(y)=g(y_0)+g'(y_0)(y-y_0)+\frac{g''(y_0)}{2!}(y-y_0)^2+\cdots$.
It turns out that $g(y_0)=f_{xx}(x_0,y_0)$ and $g'(y_0)=f_{xxy}(x_0,y_0)$ and $g''(y_0)=f_{xxyy}(x_0,y_0)$ and so on, because taking partial derivatives commutes with "evaluating at $y=y_0$" (or $x=x_0$, as in the last part). This means that differentiation and plugging things in can be done in any order here.
Doing a Taylor series expansion in the variable $y$ for $f_{xx}(x_0,y)$, as seen in $(4)$, can be done with any of the terms in $(1)$, like $f(x_0,y)$ and $f_x(x_0,y)$ seen in $(2)$ and $(3)$.
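The commuting claim is easy to verify symbolically. Below is a SymPy sketch (again with $f=e^x\sin y$ and the base point $(x_0,y_0)=(1,2)$ as my own illustrative choices): evaluating at $x=x_0$ first and then differentiating in $y$ gives the same thing as taking the mixed partial $f_{xxy}$ and then evaluating.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x) * sp.sin(y)          # any "sufficiently nice" f works
x0, y0 = 1, 2

# g(y) = f_xx(x0, y): differentiate twice in x, then plug in x = x0.
g = sp.diff(f, x, 2).subs(x, x0)

# g'(y0) should equal f_xxy(x0, y0): differentiating in y commutes
# with having already evaluated at x = x0.
lhs = sp.diff(g, y).subs(y, y0)
rhs = sp.diff(f, x, 2, y, 1).subs({x: x0, y: y0})
print(sp.simplify(lhs - rhs))      # 0
```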
I have assumed that $f$ is sufficiently nice in this answer.
Hint/Partial Solution:
Recall the definition of the Taylor Series formula at a point $a$:
$$\sum_{k=0}^\infty \frac{f^{(k)}(a)}{k!}(x-a)^k$$
(Note that $f^{(k)}(a)$ is a short-hand for the $k$th derivative of $f(x)$ evaluated at $a$)
Letting $f(x)=\sin x$, we note that the numerator of the fraction is the following cycle:
$$\sin(a), \cos(a), -\sin(a), -\cos(a), \cdots$$
The rest is just expanding the series. It won't condense well if $a$ is not a rational multiple of $\pi$, though. The same logic follows for the cosine expansion. As an example of this, let $a=1$. Then we get the following series:
$$\sum_{k=0}^\infty \frac{\sin^{(k)}(1)}{k!}(x-1)^k$$
Expanded, this becomes
$$\sin(1)+(x-1) \cos(1)-\frac{1}{2}(x-1)^2 \sin(1)-\frac 16 (x-1)^3 \cos(1)+\frac{1}{24} (x-1)^4 \sin(1)+\cdots$$
After seeing this, you might think to yourself that this could be broken up into a sum without any derivative symbols. That hypothetical you is right! We can rewrite this as
$$\sum_{k=0}^{\infty} \left(\frac{\sin(1)(-1)^k}{(2k)!}(x-1)^{2k}+\frac{\cos(1)(-1)^{k}}{(2k+1)!}(x-1)^{2k+1}\right)$$
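Here is a quick numeric sanity check (a sketch, using the derivative cycle $\sin(1), \cos(1), -\sin(1), -\cos(1), \ldots$ split into even- and odd-order terms): the partial sums should approach $\sin x$ for $x$ near $1$.

```python
import math

def sin_series(x, terms=20):
    # Partial sum of the Taylor series of sin about a = 1, with the
    # derivative cycle sin(1), cos(1), -sin(1), -cos(1), ... split
    # into even-order (sin) and odd-order (cos) terms.
    s = 0.0
    for k in range(terms):
        s += (-1)**k * math.sin(1) / math.factorial(2*k) * (x - 1)**(2*k)
        s += (-1)**k * math.cos(1) / math.factorial(2*k + 1) * (x - 1)**(2*k + 1)
    return s

print(abs(sin_series(1.5) - math.sin(1.5)) < 1e-12)   # True
```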
A similar formula emerges for the cosine expansion, but I leave that up to the OP!
Best Answer
You can check the note "Higher-Order Derivatives and Taylor’s Formula in Several Variables" by G. B. Folland: https://sites.math.washington.edu/~folland/Math425/taylor2.pdf