Assume $y_1<y_2$ and $f(x,y_1)>0$, $\ f(x,y_2)<0$ for all $x$. By general principles about ODEs, any solution $x\mapsto \phi_\eta(x)$ starting at a point $(0,\eta)$, $\ y_1\leq\eta\leq y_2$, must eventually leave the rectangle $R:=[0,w]\times[y_1,y_2]$. But it cannot do so through the horizontal edges: on the bottom edge the slope $f(x,y_1)>0$ points into $R$, and on the top edge the slope $f(x,y_2)<0$ points into $R$ as well. It follows that the solution $\phi_\eta$ passes through a point $(w,\eta')$, $\ y_1\leq\eta'\leq y_2$, on the right edge of $R$. In this way a so-called Poincaré map
$$\Phi:\quad [y_1,y_2]\to[y_1,y_2],\qquad \eta\mapsto \eta'=:\Phi(\eta)$$
is defined. Again by general principles (continuous dependence of solutions on initial values) this $\Phi$ is continuous. By Brouwer's fixed-point theorem (or, in this one-dimensional setting, simply the intermediate value theorem applied to $\Phi(\eta)-\eta$) it follows that $\Phi$ has a fixed point $\eta_*\in[y_1,y_2]$. The solution $\phi_{\eta_*}$ starting at $(0,\eta_*)$ is then periodic.
If instead of the initial assumption on $f$ we have $f(x,y_1)<0$, $\ f(x,y_2)>0$ we start the argument at $x=w$ and proceed to the left.
In the example $y'+p(x) y=q(x)$ we have
$$f(x,y)=p(x)\left({q(x)\over p(x)} -y\right)\ .$$
As $p$ and $q$ are periodic and continuous, and $p(x)\ne0$ for all $x$, there is an $M>0$ such that
$$-M<{q(x)\over p(x)}<M\qquad\forall x\ .$$
It follows that by choosing $y_1:=-M$, $\ y_2:=M$ we can fulfill the assumptions of the "theorem".
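The trapping argument can be illustrated numerically. The sketch below (a minimal Python illustration; the choices $p(x)=2+\cos x$, $q(x)=\sin x$, $w=2\pi$ are my own, not from the question) integrates $y'=q(x)-p(x)y$ with RK4 to evaluate the Poincaré map $\Phi$, then locates its fixed point by bisection on $\Phi(\eta)-\eta$, exactly as the intermediate value theorem suggests:

```python
import math

# Hypothetical instance of the "theorem": y' = q(x) - p(x)*y with
# p(x) = 2 + cos x, q(x) = sin x, both 2*pi-periodic.  Here
# |q/p| <= 1/sqrt(3) < 1, so y1 = -1, y2 = 1 satisfy the sign hypotheses.
def f(x, y):
    return math.sin(x) - (2.0 + math.cos(x)) * y

def rk4(eta, w=2 * math.pi, n=2000):
    """Integrate y' = f(x, y) from (0, eta) to x = w with classical RK4.

    The return value is Phi(eta), the Poincare map of the strip."""
    h, x, y = w / n, 0.0, eta
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h / 2 * k1)
        k3 = f(x + h / 2, y + h / 2 * k2)
        k4 = f(x + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
    return y

def poincare_fixed_point(y1=-1.0, y2=1.0, tol=1e-12):
    """Bisect g(eta) = Phi(eta) - eta; g(y1) > 0 > g(y2) by the trapping argument."""
    while y2 - y1 > tol:
        mid = 0.5 * (y1 + y2)
        if rk4(mid) - mid > 0:
            y1 = mid
        else:
            y2 = mid
    return 0.5 * (y1 + y2)

eta_star = poincare_fixed_point()
print(abs(rk4(eta_star) - eta_star))  # ~0: eta_star is a fixed point of Phi
```

The solution through $(0,\eta_*)$ then repeats itself on every interval of length $w$, i.e. it is periodic.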
Suppose
$y(0) = y_0; \tag{1}$
then the unique solution to the equation
$y' = a(x)y + b(x) \tag{2}$
is
$y(x) = \exp \left ( \displaystyle \int_0^x a(s)ds \right ) \left (y_0 + \displaystyle \int_0^x \exp \left (-\displaystyle \int_0^s a(u) du \right )b(s) ds \right ); \tag{3}$
formula (3) is very well known; a derivation may be found here; for $x = T$ we thus have
$y(T) = \exp \left ( \displaystyle \int_0^T a(s)ds \right ) \left (y_0 + \displaystyle \int_0^T \exp \left (-\displaystyle \int_0^s a(u) du \right )b(s) ds \right ). \tag{4}$
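Formula (3) can be sanity-checked in a closed-form case. In the sketch below (Python; the constant coefficients $a \equiv 1$, $b \equiv 1$ are my own illustrative choice), (3) reduces to $y(x) = e^x\bigl(y_0 + \int_0^x e^{-s}\,ds\bigr) = e^x(y_0+1)-1$, and we confirm numerically that this function indeed solves (2):

```python
import math

# Closed-form check of (3) with a = b = 1 (hypothetical choice):
# y(x) = e^x (y0 + int_0^x e^{-s} ds) = e^x (y0 + 1) - 1,
# and indeed y' = e^x (y0 + 1) = y + 1 = a(x) y + b(x).
y0 = 2.0
y = lambda x: math.exp(x) * (y0 + 1) - 1

eps = 1e-6
for x in (0.0, 0.5, 1.3):
    dy = (y(x + eps) - y(x - eps)) / (2 * eps)  # central-difference derivative
    assert abs(dy - (y(x) + 1)) < 1e-4          # y' = a y + b with a = b = 1
print(y(0.0))  # 2.0 — the initial condition y(0) = y0 is recovered
```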
A $T$-periodic solution to (2) satisfies $y(T) = y(0) = y_0$; in this case (4) yields
$y_0 = \exp \left ( \displaystyle \int_0^T a(s)ds \right ) \left (y_0 + \displaystyle \int_0^T \exp \left (-\displaystyle \int_0^s a(u) du \right )b(s) ds \right ), \tag{5}$
which we re-write as
$\left ( 1 - \exp \left ( \displaystyle \int_0^T a(s) ds\right)\right)y_0 = \exp \left ( \displaystyle \int_0^T a(s) ds \right )\displaystyle \int_0^T \exp \left (-\displaystyle \int_0^s a(u) du\right) b(s) ds; \tag{6}$
the hypothesis
$\displaystyle \int_0^T a(s) ds \ne 0 \tag{7}$
guarantees that
$\exp \left ( \displaystyle \int_0^T a(s) ds \right ) \ne 1; \tag{8}$
in this case we may solve (6) for $y_0$:
$y_0 = \dfrac{\exp \left ( \displaystyle \int_0^T a(s) ds \right )}{1 - \exp \left (\displaystyle \int_0^T a(s) ds \right )}\displaystyle \int_0^T \exp \left ( -\displaystyle \int_0^s a(u) du \right ) b(s) ds. \tag{9}$
(9) indicates that there is at most one initial condition $y_0 = y(0)$ for which the solution $y(x)$ of (2) is periodic. Thus any periodic $y(x)$ satisfying (2) under the condition (7) must be unique; it remains to establish the existence of such a $y(x)$.
To establish the existence of a periodic solution, note that we may translate any solution to (2) forward in $x$ by $T$, obtaining a function $y(x + T)$; we have
$y'(x +T) = a(x + T)y(x + T) + b(x + T) = a(x)y(x + T) + b(x), \tag{10}$
by the $T$-periodicity of $a(x)$ and $b(x)$. Furthermore, at $x = 0$ the function $y(x + T)$ takes the value
$y(0 + T) = y(T). \tag{11}$
It follows from (11) that if we can find a solution $y(x)$ such that
$y(T) = y(0) = y_0, \tag{12}$
then both $y(x)$ and $y(x + T)$ will satisfy (2) with the same initial condition $y_0$; uniqueness of solutions then allows us to conclude that
$y(x + T) = y(x) \tag{13}$
for all $x$; i.e., the solution $y(x)$ is $T$-periodic. But if we choose $y_0$ as in (9), then (6), and hence (5), hold, so that (4) yields (12), and hence (13); we have thus demonstrated the existence of a periodic solution to (2), which must then be unique by the arguments given above. QED.
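The whole argument can be checked numerically. In the sketch below (Python; the coefficients $a(x) = -1 + \tfrac12\cos x$, $b(x) = \sin x$ and period $T = 2\pi$ are hypothetical choices satisfying (7), since $\int_0^T a = -2\pi \ne 0$), we compute $y_0$ from (9) by quadrature and then verify with an RK4 integration of (2) that the resulting solution returns to $y_0$ at $x = T$:

```python
import math

# Hypothetical 2*pi-periodic coefficients with int_0^T a = -2*pi != 0:
T = 2 * math.pi
a = lambda x: -1.0 + 0.5 * math.cos(x)
b = lambda x: math.sin(x)
A = lambda s: -s + 0.5 * math.sin(s)  # antiderivative of a with A(0) = 0

def simpson(g, lo, hi, n=2000):
    """Composite Simpson quadrature (n even)."""
    h = (hi - lo) / n
    s = g(lo) + g(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(lo + i * h)
    return s * h / 3

# Formula (9): y0 = e^{A(T)} / (1 - e^{A(T)}) * int_0^T e^{-A(s)} b(s) ds
eAT = math.exp(A(T))
y0 = eAT / (1 - eAT) * simpson(lambda s: math.exp(-A(s)) * b(s), 0, T)

def rk4(y, n=4000):
    """Integrate (2), y' = a(x) y + b(x), from (0, y) to x = T with RK4."""
    f = lambda x, y: a(x) * y + b(x)
    h, x = T / n, 0.0
    for _ in range(n):
        k1 = f(x, y); k2 = f(x + h/2, y + h/2*k1)
        k3 = f(x + h/2, y + h/2*k2); k4 = f(x + h, y + h*k3)
        y += h/6 * (k1 + 2*k2 + 2*k3 + k4); x += h
    return y

print(abs(rk4(y0) - y0))  # ~0: y(T) = y(0), so the solution is T-periodic
```

Any other initial value would drift away from periodicity, in line with the uniqueness part of the argument.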
Best Answer
You have an equation of the form $\phi^\prime = F(x,\phi)$. In your case, $F$ is Lipschitz in $\phi$. Since $a$ is continuous and periodic, we know that it is bounded. Pick $M > 0$ such that $|a(x)| \leq M$ for all $x$. Then, for any fixed $x$, $$|F(x,\phi_1) - F(x,\phi_2)| = |a(x)|\,|\phi_1 - \phi_2| \leq M |\phi_1 - \phi_2|.$$ We already know that $F$ is continuous in $x$ by hypothesis. This gives global existence of a unique solution for each initial value. Since $\phi \equiv 0$ is a solution with the same initial value, uniqueness implies that $\phi \equiv 0$.
The idea is that since $F$ is globally Lipschitz with the same constant, we can glue together local solutions.
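The iteration behind this existence argument can be made concrete with Picard iteration. The following sketch (Python; the choice $a(x) = \cos x$, $y_0 = 1$ and the interval $[0,1]$ are illustrative, not from the question) iterates $\phi_{n+1}(x) = y_0 + \int_0^x a(s)\,\phi_n(s)\,ds$ on a grid and watches the iterates converge to the exact solution $y_0 e^{\sin x}$; the uniform Lipschitz bound $|a| \le M$ is what makes the successive errors shrink factorially:

```python
import math

# Picard iteration for y' = a(x) y, y(0) = y0, on [0, 1].
# a(x) = cos x is a stand-in; |a| <= M = 1, and the exact solution is e^{sin x}.
N = 1000
h = 1.0 / N
xs = [i * h for i in range(N + 1)]
a = [math.cos(x) for x in xs]
y0 = 1.0

phi = [y0] * (N + 1)  # initial guess: the constant function y0
for _ in range(30):
    integrand = [a[i] * phi[i] for i in range(N + 1)]
    acc, nxt = 0.0, [y0]
    for i in range(N):  # cumulative trapezoid: int_0^x a(s) phi(s) ds
        acc += h * (integrand[i] + integrand[i + 1]) / 2
        nxt.append(y0 + acc)
    phi = nxt

err = max(abs(phi[i] - math.exp(math.sin(xs[i]))) for i in range(N + 1))
print(err)  # small: the Picard iterates have converged to the true solution
```

Because the Lipschitz constant is the same on every subinterval, the same estimate applies on $[1,2]$, $[2,3]$, etc., which is exactly the gluing step.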