Your reasoning is correct. Since $g$ is Lipschitz and the first equation of the system involves only $x$, there is a unique solution $x(t)$ such that $x(t_0)=x_0$.
The second equation becomes
$$
y'=f(x(t))\,y,\quad y(t_0)=y_0.
$$
It is a linear equation and has a unique solution, given by
$$
y(t)=y_0\,e^{\int_{t_0}^t f(x(s))\,ds}.
$$
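As a quick sanity check, here is a minimal numerical comparison of this formula against a direct integration of the coupled system, with the illustrative choices $g(x)=-x$ (which is Lipschitz) and $f(x)=x$; the horizon $t=4$ and the tolerances are arbitrary:

```python
import numpy as np
from scipy.integrate import solve_ivp

t0, x0, y0 = 0.0, 2.0, 3.0
# coupled system: x' = g(x) = -x,  y' = f(x) * y with f(x) = x
rhs = lambda t, u: [-u[0], u[0] * u[1]]

sol = solve_ivp(rhs, (t0, 4.0), [x0, y0], rtol=1e-10, atol=1e-12)
t_end = sol.t[-1]
# here x(t) = x0 * e^{-(t - t0)}, so the formula gives
# y(t) = y0 * exp(x0 * (1 - e^{-(t - t0)}))
y_formula = y0 * np.exp(x0 * (1.0 - np.exp(-(t_end - t0))))
print(sol.y[1, -1], y_formula)   # the two values should agree closely
```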
Why does ... say that one can check boundedness of the derivative instead of continuity to find out whether the function satisfies the Lipschitz condition?
Because boundedness is enough. But if you have continuity of the $y$-derivative on a closed rectangle, that implies boundedness of the $y$-derivative, and therefore the Lipschitz property of the function with respect to $y$.
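To spell out the step from boundedness to the Lipschitz property: the mean value theorem gives, for some $\xi$ between $y_1$ and $y_2$,
$$
|f(x,y_1)-f(x,y_2)|=\left|\frac{\partial f}{\partial y}(x,\xi)\right|\,|y_1-y_2|\le M\,|y_1-y_2|,
$$
so the bound $M$ itself serves as a Lipschitz constant.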
Be careful with the Picard theorem, because it comes in different versions. The solution to ex. 1.6 provided by Tony Piccolo uses a different version from the one you stated in the question. As you stated the theorem, it does not apply on long intervals.
The crux of the matter is to choose $a$ and $b$ that satisfy the conditions of the theorem. We don't get these numbers from any formula: we have to choose them so that the inequalities $aM<1$ and $aK\le b$ work. Since $M$ and $K$ may themselves depend on $a,b$, we may end up chasing our own tail.
But here, the derivative of $\sin (xy)$ with respect to $y$ is bounded by $|x|$, which is independent of $y$. We can take $M$ to be the supremum of $|x|$ on the interval $[x_0-a,x_0+a]$, which is $|x_0|+a$. Also, the function $\sin (xy)$ itself is bounded by $1$ everywhere, so $K=1$.
So, choose $a$ so that $a(|x_0|+a)<1$. Instead of solving the quadratic equation, you can take $a=\frac12 \min(|x_0|^{-1},1)$. Then you apply the theorem repeatedly to expand the interval of existence... it's possible to cover the real line in this way.
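To see the contraction at work, here is a minimal Picard-iteration sketch for $y'=\sin(xy)$, $y(x_0)=y_0$ on $[x_0,\,x_0+a]$ with the choice of $a$ above; the concrete $x_0,y_0$, grid size, and tolerance are illustrative:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

x0, y0 = 2.0, 1.0                            # illustrative initial data
a = 0.5 * min(1.0 / abs(x0), 1.0)            # guarantees a * (|x0| + a) < 1
x = np.linspace(x0, x0 + a, 1001)

y = np.full_like(x, y0)                      # initial guess y_0(x) = y0
for _ in range(50):
    # Picard map: y_{n+1}(x) = y0 + integral from x0 to x of sin(s * y_n(s)) ds
    y_new = y0 + cumulative_trapezoid(np.sin(x * y), x, initial=0.0)
    if np.max(np.abs(y_new - y)) < 1e-12:    # contraction => geometric convergence
        y = y_new
        break
    y = y_new

print(y[-1])                                 # approximate solution value at x0 + a
```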
But I have to say that this is a laborious way to deal with this ODE. The proper approach is to use the stronger form of the theorem, which does not require $aM<1$. See "optimization of the solution's interval" here.
Best Answer
Consider the first order initial value problem
$$y'=f(x,y),\qquad y(x_0)=y_0. \tag{1}$$
The basic existence–uniqueness theorem states: if $f$ is continuous on a rectangle $R=\{(x,y):|x-x_0|\le a,\ |y-y_0|\le b\}$ and $\left|\frac{\partial f}{\partial y}\right|\le M$ for all $(x,y)\in R$, then $(1)$ has a unique solution in a neighbourhood of $x_0$.
Note: The condition $\left|\frac{\partial f}{\partial y}\right|\le M$ for all $(x,y)\in R$ can be replaced by a weaker one, known as the Lipschitz condition. Thus, instead of continuity of $\frac{\partial f}{\partial y}$, we require
$$|f(x,y_1)-f(x,y_2)|\le L\,|y_1-y_2| \quad \text{for all } (x,y_i)\in R.$$
If $\frac{\partial f}{\partial y}$ exists and is bounded, then $f$ necessarily satisfies the Lipschitz condition. On the other hand, a function $f(x,y)$ may be Lipschitz continuous even though $\frac{\partial f}{\partial y}$ does not exist.
For example, $f(x,y)=x^2|y|$ on $|x|\le 1$, $|y|\le 1$ is Lipschitz continuous in $y$, but $\frac{\partial f}{\partial y}$ does not exist at $(x,0)$ for $x\neq 0$.
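Indeed, for $|x|\le 1$ the reverse triangle inequality gives
$$
|f(x,y_1)-f(x,y_2)|=x^2\,\big||y_1|-|y_2|\big|\le |y_1-y_2|,
$$
so $L=1$ works, while at $y=0$ the one-sided $y$-derivatives are $\pm x^2$, which disagree for $x\neq 0$.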
Consider the second order initial value problem
$$y''(t)+p(t)\,y'(t)+q(t)\,y=g(t),\qquad y(t_0)=y_0,\ \ y'(t_0)=y'_0. \tag{2}$$
The existence and uniqueness theorem for second order linear equations guarantees that the initial value problem $(2)$ has (existence) exactly one (uniqueness) twice-differentiable solution on any interval containing $t_0$, as long as all three functions $p(t)$, $q(t)$, and $g(t)$ are continuous on that interval. In contrast, neither existence nor uniqueness of a solution is guaranteed at a discontinuity of $p(t)$, $q(t)$, or $g(t)$.
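As a hedged illustration of how one typically works with $(2)$ in practice, rewrite it as a first-order system in $(y,y')$ and integrate numerically; the coefficients $p,q,g$ below are illustrative choices, continuous on all of $\mathbb{R}$, so the theorem promises a unique solution on any interval containing $t_0$:

```python
import numpy as np
from scipy.integrate import solve_ivp

# illustrative coefficients, continuous on all of R
p = lambda t: np.cos(t)
q = lambda t: 1.0 + t**2
g = lambda t: np.exp(-t**2)

def rhs(t, u):
    y, yp = u                        # u = (y, y')
    return [yp, g(t) - p(t) * yp - q(t) * y]

t0, y0, yp0 = 0.0, 1.0, 0.0          # initial data y(t0), y'(t0)
sol = solve_ivp(rhs, (t0, 5.0), [y0, yp0], rtol=1e-8, atol=1e-10)
print(sol.y[0, -1])                  # y(5); unique by the theorem above
```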
For more information and examples, see the following:
$1.$ "Differential Equations: Theory, Technique and Practice" by G. F. Simmons & S. G. Krantz (McGraw Hill Higher Education)
$2.$ "Differential Equations" by Shepley L. Ross
$3.$ http://home.iitk.ac.in/~sghorai/TEACHING/MTH203/ode5.pdf
$4.$ http://www.math.ucsd.edu/~y1zhao/2013UCSDFallQuarterMath20D/Files/Section3.2.pdf
$5.$ http://www.ltcconline.net/greenl/courses/204/ConstantCoeff/uniquenessExistence.htm