Consider the family of curves $\gamma_C$ given by an equation of the form
$$F(x,y)=C\ ,\qquad(1)$$
where the function $$F:\ {\mathbb R}^2\to {\mathbb R}\ ,\quad (x,y)\mapsto F(x,y)$$
is explicitly given; e.g. $F(x,y):=x^2+y^2$. "Locally" such a $\gamma_C$ is the graph of a function $x\mapsto y(x)$. It follows that the function
$$x\mapsto\phi(x):=F\bigl(x, y(x)\bigr)$$
is identically equal to $C$. Therefore by the chain rule
$$\phi'(x)=F_{.1}\bigl(x,y(x)\bigr)\cdot 1+ F_{.2}\bigl(x,y(x)\bigr)\cdot y'(x)\equiv0\ ,$$
where $F_{.1}$ denotes the partial derivative of $F$ with respect to the first entry, and similarly for $F_{.2}$. Solving the last equation for $y'(x)$ gives
$$y'(x)\equiv -{F_{.1}\bigl(x,y(x)\bigr)\over F_{.2}\bigl(x,y(x)\bigr)}\ .$$
But this is saying that the function $x\mapsto y(x)$ satisfies the differential equation
$$y'=-{F_x(x,y)\over F_y(x,y)}\ .\qquad(2)$$
Here the right side ($=: f(x,y)$, with $F_x=F_{.1}$ and $F_y=F_{.2}$) is a known, or at least easily computable, function of $x$ and $y$ which is independent of $C$. Therefore the equation $(2)$ can be considered as the differential equation characterizing the family of curves $\gamma_C$ defined by $(1)$.
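As a quick sanity check (my own illustration, not in the original text), the recipe $y'=-F_x/F_y$ can be run in SymPy for the simple choice $F(x,y)=x^2+y^2$, whose level sets are concentric circles:

```python
# Sketch: derive the ODE (2) for F(x, y) = x**2 + y**2 (circles x**2 + y**2 = C).
import sympy as sp

x, y = sp.symbols('x y')
F = x**2 + y**2

# Right side of equation (2): f(x, y) = -F_x / F_y
f = sp.simplify(-sp.diff(F, x) / sp.diff(F, y))
print(f)  # -x/y: the circles satisfy y' = -x/y, independent of C
```

Indeed $y(x)=\sqrt{C-x^2}$ satisfies $y'=-x/y$ wherever $y\neq0$.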
Consider the following example: the set $\gamma_C$ of points $(x,y)$ whose distances to the two "foci" $(\pm1,0)$ have a given product $C>0$ is called a Cassini curve. The equation of this curve is given by
$$F(x,y):=\bigl((x-1)^2+y^2\bigr)\bigl((x+1)^2+y^2\bigr)=C^2\ .$$
When $C=1$ this curve is called a lemniscate. In order to obtain the differential equation of the family of curves $\gamma_C$ we compute
$$F_x=2(x-1)\bigl((x+1)^2+y^2\bigr)+2(x+1)\bigl((x-1)^2+y^2\bigr)=\ldots$$
and
$$F_y=2y\bigl((x+1)^2+y^2\bigr)+2y\bigl((x-1)^2+y^2\bigr)=\ldots\quad.$$
After doing the calculations (and perhaps some simplification) we can write down
the differential equation
$$y'=-{F_x(x,y)\over F_y(x,y)}=:f(x,y)\ ,$$
where now $f(x,y)$ is a certain definite expression in $x$ and $y$ not containing the parameter $C$.
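The omitted calculation can be delegated to a computer algebra system. The following SymPy sketch (my addition, not part of the original text) expands $F_x$ and $F_y$ for the Cassini family and confirms that $f=-F_x/F_y$ contains no trace of the parameter $C$:

```python
# Sketch: the Cassini family F(x, y) = ((x-1)^2 + y^2)((x+1)^2 + y^2) = C^2.
import sympy as sp

x, y = sp.symbols('x y')
F = ((x - 1)**2 + y**2) * ((x + 1)**2 + y**2)

Fx = sp.expand(sp.diff(F, x))   # expands to 4*x**3 + 4*x*y**2 - 4*x
Fy = sp.expand(sp.diff(F, y))   # expands to 4*y*(x**2 + y**2 + 1)
f = sp.simplify(-Fx / Fy)       # equals -x*(x**2 + y**2 - 1)/(y*(x**2 + y**2 + 1))
print(f)
```

So the differential equation of the Cassini family is $y'=-\dfrac{x(x^2+y^2-1)}{y(x^2+y^2+1)}$, with no $C$ in sight.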
Note that the above explanations do not take into account "singular points" or points where the tangent to the curve $\gamma_C$ under consideration is vertical; at such points $F_y=0$ and the right side of $(2)$ is undefined.
I have tried integrating factors of the form $P(s,b)$ and $e^{P(s,b)}$, where $P(s,b)$ is an arbitrary polynomial in $s$ and $b$. It is relatively easy to show that neither satisfies the requirements. To shorten the write-up, I will multiply your equation by $s$ and introduce two constants $A$ and $B$. So, ignoring the case $s=0$, the original problem is equivalent to
$$(s-b)ds + (Abs + Bb^2s)db = 0$$
We will look for a function $F(s,b)$ whose total differential reproduces the equation, namely
$$dF(s,b) = \frac{\partial F}{\partial s}ds + \frac{\partial F}{\partial b}db = 0$$
Example 1: $\mu(s,b) = P(s,b)$, a finite degree polynomial
Multiplying the original equation by $\mu$ and comparing it with the total differential, we obtain
$$\frac{\partial F}{\partial s} = \mu(s-b) \qquad\mathrm{and}\qquad \frac{\partial F}{\partial b} = \mu (Abs + Bb^2s)$$
We will integrate both equations to obtain $F$, and then equate them to get an expression for $\mu$. We get
$$\int_s \mu(s,b)(s-b)ds + K_1(b) = \int_b \mu(s,b) (Abs + Bb^2s)db + K_2(s)$$
where the integration constants $K_1$ and $K_2$ may depend on the variable that was not integrated over. We observe that every term on the LHS has strictly lower total degree than the leading terms on the RHS. For example, if the degree of $\mu$ is $10$, the degree of the LHS is $12$ while that of the RHS is $14$. So there will always be a term of the form $s^ib^j$ on the RHS that cannot cancel out.
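The degree count can be verified mechanically. The SymPy sketch below (my illustration; the concrete $\mu=(s+b)^{10}$ is just a representative degree-$10$ choice) reproduces the $12$-versus-$14$ gap:

```python
# Sketch: compare total degrees of the two antiderivatives for a degree-10 mu.
import sympy as sp

s, b, A, B = sp.symbols('s b A B')
mu = (s + b)**10                                 # sample degree-10 integrating factor

lhs = sp.integrate(mu * (s - b), s)              # antiderivative over s of mu*(s - b)
rhs = sp.integrate(mu * (A*b*s + B*b**2*s), b)   # antiderivative over b of mu*(A*b*s + B*b^2*s)

print(sp.Poly(sp.expand(lhs), s, b).total_degree())  # 12
print(sp.Poly(sp.expand(rhs), s, b).total_degree())  # 14
```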
Example 2: $\mu(s,b) = e^{P(s,b)}$
The first step is the same as before. For the second step it is a bit easier to differentiate than to integrate. We will assume that $F$ is well-behaved, i.e. that its mixed partial derivatives commute:
$$\frac{\partial^2 F}{\partial b \partial s} = \frac{\partial^2 F}{\partial s \partial b}$$
Substituting the first order partial derivatives, we get
$$\frac{\partial}{\partial b} [\mu(s-b)] = \frac{\partial}{\partial s} [\mu(Abs + Bb^2s)]$$
Simplifying, and substituting $\mu = e^P$, we get
$$e^P \bigl[\frac{\partial P}{\partial b}(s-b) - 1 \bigr] = e^P \bigl[(Abs + Bb^2s)\frac{\partial P}{\partial s} + (Ab + Bb^2) \bigr]$$
Cancelling out $e^P$ and moving the last term to the LHS, we get
$$\frac{\partial P}{\partial b}(s-b) - 1 - (Ab + Bb^2) = (Abs + Bb^2s)\frac{\partial P}{\partial s} $$
As you can see, the very same problem persists: the polynomial degree of the mixed terms on the RHS is greater than that on the LHS, so equality cannot hold.
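Again this can be checked in SymPy. With a sample degree-$4$ exponent $P=(s+b)^4$ (my choice, purely for illustration), the RHS of the last equation has total degree $6$ while the LHS only reaches degree $4$:

```python
# Sketch: degree mismatch in the exactness condition for mu = e^P.
import sympy as sp

s, b, A, B = sp.symbols('s b A B')
P = (s + b)**4                                      # sample polynomial exponent of degree 4

lhs = sp.diff(P, b)*(s - b) - 1 - (A*b + B*b**2)    # left side after cancelling e^P
rhs = (A*b*s + B*b**2*s)*sp.diff(P, s)              # right side

print(sp.Poly(sp.expand(lhs), s, b).total_degree())  # 4
print(sp.Poly(sp.expand(rhs), s, b).total_degree())  # 6
```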
Conclusion:
I do not think that there is a good strategy for guessing integrating factors if the standard choices do not work. Depending on the application, a series expansion or a numerical solution may be of interest.
Tough luck...
Best Answer
Find the complementary solution by solving \begin{equation*} y''+y=0. \end{equation*} Substitute $y=e^{\lambda t}$ to get \begin{equation*} (\lambda ^2+1)e^{\lambda t}=0. \end{equation*} Therefore the roots are $\lambda=i$ and $\lambda =-i$, and the general solution is \begin{equation*} y=y_1+y_2=c_1e^{it}+c_2e^{-it}. \end{equation*} Apply Euler's identity and regroup the terms to get \begin{equation*} y=(c_1+c_2)\cos(t)+i(c_1-c_2)\sin(t)=c_3\cos(t)+c_4\sin(t). \end{equation*} For the particular solution, use variation of parameters with $y_{b_1}=\cos(t)$ and $y_{b_2}=\sin(t)$. Calculating the Wronskian $W$ gives $1$. Let $f(t)$ be the RHS of the differential equation. Use the two formulae \begin{equation*} v_1=-\int \frac{f(t)\,y_{b_2}}{W}\,dt,\qquad v_2=\int \frac{f(t)\,y_{b_1}}{W}\,dt \end{equation*} to get the particular solution \begin{equation*} y_p=v_1y_{b_1}+v_2y_{b_2}. \end{equation*}
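The answer leaves $f(t)$ unspecified. As a sanity check of the recipe (my own addition, with a hypothetical right-hand side $f(t)=t$ that is not in the original), the SymPy sketch below runs the variation-of-parameters formulae end to end:

```python
# Sketch: variation of parameters for y'' + y = f(t), with the assumed f(t) = t.
import sympy as sp

t = sp.symbols('t')
yb1, yb2 = sp.cos(t), sp.sin(t)                             # homogeneous solutions
W = sp.simplify(yb1*sp.diff(yb2, t) - yb2*sp.diff(yb1, t))  # Wronskian, equals 1

f = t                                                       # assumed RHS (my choice)
v1 = -sp.integrate(f*yb2/W, t)
v2 = sp.integrate(f*yb1/W, t)
yp = sp.simplify(v1*yb1 + v2*yb2)

print(W)   # 1
print(yp)  # t, and indeed y = t solves y'' + y = t
```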