I assume you are trying to solve
$$
x'=F(x)\qquad x(0)=x_0.
$$
Existence: Let $C$ be a Lipschitz constant for $F$.
The idea is to define recursively a sequence of continuous functions on $I:=\left[-\frac{1}{2C},\frac{1}{2C}\right]$ by
$$
x_0(t):=x_0\qquad x_{n+1}(t):=x_0+\int_0^tF(x_n(s))ds.
$$
Clearly, this defines continuous (in fact $C^1$) functions on $\mathbb{R}$, since each integrand is continuous. But when we restrict to $I$, the Lipschitz property of $F$ gives
$$
|x_{n+1}(t)-x_n(t)|\leq\left|\int_0^t|F(x_n(s))-F(x_{n-1}(s))|ds\right|\leq C|t|\sup_I|x_{n}-x_{n-1}|.
$$
Therefore
$$
\|x_{n+1}-x_n\|_\infty\leq \frac{1}{2}\|x_n-x_{n-1}\|_\infty
$$
with the sup norm over $I$.
It follows that $(x_n)$ is Cauchy in the Banach space $C^0(I)$. So it converges uniformly to a continuous function $x$ on $I$. Passing to the limit in the recurrence formula (legitimate, since the Picard map $y\mapsto x_0+\int_0^{\cdot}F(y(s))\,ds$ is a contraction, hence continuous, on $C^0(I)$), we obtain
$$
x(t)=x_0+\int_0^tF(x(s))ds \qquad\forall t\in I.
$$
Clearly, $x(0)=x_0$, so the initial condition is fulfilled. Moreover, since $x$ is continuous, so is $s\mapsto F(x(s))$; hence by the fundamental theorem of calculus the right-hand side is differentiable and we get
$$
x'(t)=F(x(t))\qquad\forall t\in\left(-\frac{1}{2C},\frac{1}{2C}\right).
$$
So you have a local solution.
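As a purely numerical illustration (not part of the proof), here is a sketch of the Picard iteration above for the hypothetical choice $F(x)=x$, $x_0=1$, so $C=1$ and $I=[-1/2,1/2]$; the exact solution is $e^t$. The helper `int_from_zero` is an assumed discretization of $\int_0^t$ by the trapezoid rule.

```python
import numpy as np

def int_from_zero(y, t):
    """Cumulative trapezoid integral of y over the grid t, normalized to vanish at t = 0."""
    c = np.zeros_like(y)
    c[1:] = np.cumsum((y[1:] + y[:-1]) / 2 * np.diff(t))
    return c - c[np.argmin(np.abs(t))]

F = lambda x: x                     # Lipschitz with C = 1, so I = [-1/2, 1/2]
t = np.linspace(-0.5, 0.5, 2001)    # grid on I; t = 0 is a grid point
x0 = 1.0

x = np.full_like(t, x0)             # x_0(t) = x0
sup_diffs = []
for _ in range(6):
    x_next = x0 + int_from_zero(F(x), t)   # x_{n+1}(t) = x0 + int_0^t F(x_n(s)) ds
    sup_diffs.append(np.max(np.abs(x_next - x)))
    x = x_next

# successive sup-norm differences shrink by a factor of at most 1/2, as predicted
ratios = [b / a for a, b in zip(sup_diffs, sup_diffs[1:])]
print(max(ratios))
print(np.max(np.abs(x - np.exp(t))))   # the iterates approach the exact solution e^t
```

The observed contraction factor is in fact better than $1/2$ (it behaves like $\frac{1}{2(n+1)}$ for this $F$), consistent with the estimate being an upper bound.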
Uniqueness: Any two solutions on this interval satisfy the fixed-point condition
$$
x(t)=x_0+\int_0^tF(x(s))ds.
$$
Subtracting and repeating the estimate above, the difference of two such solutions satisfies $\|x_1-x_2\|_\infty\leq\frac{1}{2}\|x_1-x_2\|_\infty$, so $\|x_1-x_2\|_\infty=0$. Hence $x_1=x_2$.
Maximal extension:
Consider the set of all extensions of the unique solution we have just found which are still solutions of the ODE on the (larger) interval where they are defined. This set is naturally partially ordered by inclusion of the intervals of definition.

The key remark is that if $x_1$ and $x_2$ are two extensions defined on $I_1$ and $I_2$ respectively, then they coincide on $I_1\cap I_2$. This can be shown by a standard connectedness argument (the set where they agree is open, closed, and non-empty in $I_1\cap I_2$, hence all of it), with the help of the local uniqueness we have just shown, applied in the same way to a different initial condition.

If $(x_\alpha,I_\alpha)$ denotes the set of all extensions, then $\bigcup_\alpha I_\alpha$ is an interval (remember that all the $I_\alpha$ contain the interval $I$ above; this suffices), and we can define $x(t):=x_\alpha(t)$ for $t\in I_\alpha$ without ambiguity, thanks to the key remark above. This is clearly a maximal extension. And since it extends every extension, there is no other maximal extension.
Conclusion: there exists a unique maximal solution.
Generalization: The whole thing works the same for $x'=F(t,x)$ when $F$ is locally Lipschitz in the second variable. This covers many more situations, which is why I gave the argument in a form that applies verbatim to this more general case.
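To illustrate that the scheme really does apply verbatim in the non-autonomous case, here is the same Picard iteration sketch for the hypothetical example $x'=2tx$, $x(0)=1$, whose exact solution is $e^{t^2}$ (again `int_from_zero` is an assumed trapezoid-rule helper, and this is an illustration, not a proof):

```python
import numpy as np

def int_from_zero(y, t):
    """Cumulative trapezoid integral of y over the grid t, normalized to vanish at t = 0."""
    c = np.zeros_like(y)
    c[1:] = np.cumsum((y[1:] + y[:-1]) / 2 * np.diff(t))
    return c - c[np.argmin(np.abs(t))]

F = lambda t, x: 2 * t * x          # non-autonomous right-hand side
t = np.linspace(-0.5, 0.5, 2001)
x = np.ones_like(t)                 # x_0(t) = x0 = 1
for _ in range(12):
    x = 1.0 + int_from_zero(F(t, x), t)   # x_{n+1}(t) = x0 + int_0^t F(s, x_n(s)) ds

err = np.max(np.abs(x - np.exp(t**2)))
print(err)   # the iterates approach the exact solution exp(t^2)
```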
Domain of the maximal extension: The big difference appears here. In the locally Lipschitz case, the interval on which the maximal solution is defined is not necessarily the whole interval where $F$ is defined. In the (globally) Lipschitz case like yours, and in particular in the linear case, the domain of the maximal extension is $I$ whenever $F$ is continuous on $I\times \mathbb{R}$ and Lipschitz in the second variable. Indeed, in this case we have a uniform control on the extensions, so whenever a solution is defined on an interval strictly contained in $I$, it can be extended. So the maximal solution must be defined on all of $I$. In our case, it is $\mathbb{R}$.
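The standard example of this phenomenon is $x'=x^2$, $x(0)=1$: here $F(x)=x^2$ is smooth on all of $\mathbb{R}$ but only locally Lipschitz, and the maximal solution $x(t)=\frac{1}{1-t}$ lives only on $(-\infty,1)$. A quick numerical check (an illustration with an assumed classical RK4 stepper, not part of the argument):

```python
f = lambda x: x**2    # locally but not globally Lipschitz

def rk4_step(f, x, h):
    """One classical Runge-Kutta 4 step for the autonomous ODE x' = f(x)."""
    k1 = f(x)
    k2 = f(x + h * k1 / 2)
    k3 = f(x + h * k2 / 2)
    k4 = f(x + h * k3)
    return x + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

h, x = 1e-3, 1.0
for _ in range(900):    # integrate from t = 0 to t = 0.9
    x = rk4_step(f, x, h)
print(x)                # close to 1/(1 - 0.9) = 10; the solution blows up as t -> 1
```

The numerical solution tracks $1/(1-t)$ and grows without bound as $t\to 1^-$, even though $F$ itself is defined everywhere.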
Best Answer
The actual hypothesis required by the Picard-Lindelöf theorem, if I am not mistaken, is: for every pair $(t,y)$, there exist open intervals $I \ni t$ and $J \ni y$ and a constant $K > 0$ such that for all $t' \in I$ and $y',y'' \in J$, $|e^{\sin(t'y')}-e^{\sin(t'y'')}| \leq K|y'-y''|$
(i.e. the function is locally Lipschitz continuous with respect to $y$).
So here, you take $I=\,]t-1,t+1[$ and $J=\mathbb{R}$, and recall that $u \longmapsto e^{\sin(su)}$ is $e|s|$-Lipschitz continuous: its derivative is $s\cos(su)\,e^{\sin(su)}$, bounded in absolute value by $e|s|$. So $K=e(|t|+1)$ works.
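One can sanity-check this Lipschitz bound numerically. The sketch below samples random pairs $u',u''$ for a few hypothetical values of $s$ and verifies $|e^{\sin(su')}-e^{\sin(su'')}|\leq e|s|\,|u'-u''|$ (up to floating-point tolerance); it is a spot check, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)
ok = True
for s in (0.5, 1.0, 3.0):               # assumed sample values of s
    u1 = rng.uniform(-10, 10, 10_000)
    u2 = rng.uniform(-10, 10, 10_000)
    lhs = np.abs(np.exp(np.sin(s * u1)) - np.exp(np.sin(s * u2)))
    # the mean value theorem bound: |f(u1) - f(u2)| <= e*|s|*|u1 - u2|
    ok = ok and bool(np.all(lhs <= np.e * abs(s) * np.abs(u1 - u2) + 1e-12))
print(ok)   # True: the bound holds on every sampled pair
```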