This is not a satisfactory proof of the OP's question. It's a sketch of a proof of a slightly weaker claim (see assumptions below). However, since this question hasn't gotten attention from an expert, as I hoped for in a comment above, I'll write down the best that I can do at the moment. There may be counterexamples to my argument, but I think they would have to be pathological.
One important point: I'm interpreting that the question requires that every point be in a strictly periodic orbit, in the sense that for every initial condition $x(0)$, there will be some $t > 0$ such that $x(t) = x(0)$. I think that this is the sense in which the OP meant that "all solutions...are periodic." (A physicist like me might informally say that all solutions are periodic if they asymptotically approach one or more periodic attractors, which is a completely different situation.)
I'll make the following assumptions that are stronger than the OP's question. These can probably be relaxed, but I have not spent any time trying to do so.
- $f$ is differentiable (not just Lipschitz).
- The Jacobian of $f$ at $x=0$, which I'll call $J$, is not identically zero.
Proof: I will try to prove that your professor's picture is essentially correct, i.e., that initial conditions near the origin follow orbits that remain near the origin. Sufficiently close to the origin,
$$ \dot{x} = f(0) + J x + \dots \approx J x $$
since $f(0)=0$. This linearization has the solution
$$ x(t) = e^{Jt} x(0) $$
as long as $x(t)$ is sufficiently close to $0$.
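As a quick sanity check of this linearized picture, here is a small numerical sketch (assuming `numpy` and `scipy` are available; the matrix $J$ below is a hypothetical example with purely imaginary eigenvalues $\pm i\omega$, not the Jacobian of any particular $f$):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical linearization: J has purely imaginary eigenvalues +/- i*omega,
# so the linearized flow x(t) = expm(J t) x(0) is periodic with period 2*pi/omega.
omega = 2.0
J = np.array([[0.0, -omega],
              [omega, 0.0]])

x0 = np.array([1.0, 0.5])
T = 2 * np.pi / omega        # candidate period
x_T = expm(J * T) @ x0       # state after one candidate period

print(np.allclose(x_T, x0))  # the orbit closes: x(T) == x(0)
```

Here the linearized orbit returns exactly to its initial condition, which is the behavior the argument below tries to force on all eigenvalues.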
The real parts of the eigenvalues of $J$ are the local Lyapunov exponents. Note that the eigenvalues can be complex, but since $J$ is real, they must come in complex-conjugate pairs; linear combinations of the corresponding solutions are used to build real solutions for $x(t)$. (I will ignore this subtlety below, but it doesn't change anything.) In the next paragraph, I try to show that $\text{Re}(\lambda) = 0$ for every eigenvalue $\lambda$.
Suppose that there is some eigenvector $v$ whose associated eigenvalue $\lambda$ has a negative real part, i.e., $\text{Re}(\lambda) < 0$, and let $x(0) = v$. Then $x(t) = v e^{\lambda t}$, and $x(t)$ will asymptotically approach $0$, a fixed point. In particular, $x(t)$ will never return to $x(0)$, contradicting the assumption that every orbit is periodic. Note that the linearization of the original ODE remains valid along this entire path, since the orbit stays within the neighborhood of the origin; so we can't argue that somewhere along the path, the orbit will get swept off in a different direction and close up into a periodic orbit without reaching $0$.
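This monotone decay along an eigenvector is easy to see numerically. The matrix $J$ below is a hypothetical example chosen so that one eigenvalue is real and negative (assuming `numpy` and `scipy` are available):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical J with a real eigenvalue lam < 0; its eigenvector is e1 = (1, 0).
lam = -0.5
J = np.diag([lam, 0.3])
v = np.array([1.0, 0.0])

# Under the linearized flow, x(t) = e^{lam t} v decays monotonically toward 0,
# so it can never return to x(0) = v: this orbit is not periodic.
norms = [np.linalg.norm(expm(J * t) @ v) for t in (0.0, 1.0, 2.0, 5.0)]
print(norms)  # strictly decreasing toward 0
```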
Dealing with the case that $\text{Re}(\lambda) > 0$ is slightly harder. This is because, after some time, $x(t)$ will leave the neighborhood where the linearization is valid, and after that we cannot say what will happen. But here I will employ a trick: Make time go backwards. Note that a periodic orbit must remain periodic whether time runs forward or backward. Now consider the original ODE,
$$ \dot{x} = f(x) $$
under the substitution $t \to -t$ (running time backwards). We get the ODE
$$ \dot{x} = -f(x) $$
which corresponds to taking $J \to -J$ and $\lambda \to -\lambda$ in the discussion above. So if we have $\text{Re}(\lambda) > 0$ when time is running forward, we can run time backward and apply the same argument as we did for $\text{Re}(\lambda) < 0$.
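The sign flip of the spectrum under time reversal can be checked directly (the matrix $J$ below is an arbitrary example, not derived from any particular $f$; `numpy` is assumed):

```python
import numpy as np

# Arbitrary example J whose eigenvalues have positive real parts.
J = np.array([[0.7, -1.0],
              [1.0,  0.2]])

lams_fwd = np.linalg.eigvals(J)   # spectrum with time running forward
lams_bwd = np.linalg.eigvals(-J)  # spectrum after t -> -t, i.e. J -> -J

# Each backward eigenvalue is the negative of a forward one, so an
# eigenvalue with Re > 0 forward in time has Re < 0 backward in time.
print(sorted(l.real for l in lams_fwd))  # positive real parts
print(sorted(l.real for l in lams_bwd))  # negative real parts
```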
Note that because we can choose the initial condition to lie exactly on an eigenvector, the above discussion applies even if $J$ has eigenvalues with both positive and negative real parts, because we are singling out the effect of a single eigenvalue.
We have shown that the cases where $x(t)$ moves rapidly toward or away from the origin are excluded, so $x(t)$ must remain at approximately the same distance from the origin. In particular, the eigenvalues of $J$ must be purely imaginary, so $x(t)$ will be a sum of oscillating components: if $x(0) = \sum_i c_i v_i$ for eigenvectors $v_i$, then $x(t) = \sum_i c_i e^{\lambda_i t} v_i$. Obviously $|x(t)|$ can change, but it is bounded by the triangle inequality.
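The triangle-inequality bound $|x(t)| \le \sum_i |c_i|\,|v_i|$ can be illustrated with a hypothetical $J$ whose eigenvalues are purely imaginary but whose eigenvectors are not orthogonal, so that $|x(t)|$ genuinely oscillates (assuming `numpy` and `scipy` are available):

```python
import numpy as np
from scipy.linalg import expm

# Example J with purely imaginary eigenvalues +/- 2i but non-orthogonal
# eigenvectors, so |x(t)| oscillates rather than staying constant.
J = np.array([[0.0, -1.0],
              [4.0,  0.0]])

lams, V = np.linalg.eig(J)  # columns of V are (unit-norm) eigenvectors v_i
x0 = np.array([1.0, 1.0])
c = np.linalg.solve(V, x0)  # coefficients in x0 = sum_i c_i v_i

# Triangle inequality: |x(t)| = |sum_i c_i e^{lam_i t} v_i| <= sum_i |c_i| |v_i|
bound = sum(abs(c[i]) * np.linalg.norm(V[:, i]) for i in range(2))
norms = [np.linalg.norm(expm(J * t) @ x0) for t in np.linspace(0.0, 10.0, 401)]
print(min(norms) < max(norms) <= bound + 1e-9)  # oscillates, but stays bounded
```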
The main place where this argument needs more rigor is that one would need to do an $\epsilon$-$\delta$ analysis (or invoke relevant theorems) to validate that we really can ignore the effect of the higher-order terms in the Taylor series. But I think the basic outline is sound.
Best Answer
If we define $\eta = u + i v$, then the set of equations can be rewritten as $$ 0 = - \frac{2}{\omega^2} \ddot{\eta} + 3 e^{2 i \omega t} \bar{\eta} + \eta. $$ If $\eta$ is periodic with period $2 \pi/\omega$, then it will be expressible as a power series of the form $$ \eta = \sum_{m = - \infty}^\infty a_m e^{i m \omega t}. $$ Putting this ansatz into the ODE and manipulating the series appropriately, we find that if $\eta$ is of this form and satisfies the above ODE, then we must have $$ (1 + 2 m^2) a_m + 3 a^*_{2-m} = 0 $$ for all $m$.
This "recursion" relation relates the coefficients $a_m$ in pairs: it relates $a_1$ to itself, $a_0$ to $a_2$, $a_{-1}$ to $a_3$, and so forth. Note that each of these pairs of coefficients is determined independently: the values of $\{a_0, a_2\}$ are unaffected by the values of $\{a_{-1}, a_3\}$ or any other pair. Moreover, if we set $m \to 2-m$ in the above recursion relation and conjugate it, we obtain $$ (1 + 2(2-m)^2) a^*_{2-m} + 3 a_m = 0, $$ and combining the above two equations yields $$ \left[ (1 + 2m^2) (1 + 2(2-m)^2) - 9 \right] a_m = 0 $$ for all $m$. Assuming we want $a_m \neq 0$, this implies that the quantity in brackets must vanish. But the only roots of this polynomial are $m = 0, 1, 2$ (with $m = 1$ a double root). Thus, the only $a_m$ coefficients that can be non-zero are $a_0$, $a_1$, and $a_2$. Taking these cases separately:
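The claim that the bracketed polynomial vanishes only at $m = 0, 1, 2$ can be verified symbolically (a small sketch assuming `sympy` is available):

```python
import sympy as sp

m = sp.symbols('m')

# Bracketed factor obtained by combining the two recursion relations.
p = sp.expand((1 + 2*m**2) * (1 + 2*(2 - m)**2) - 9)

roots = sp.roots(p, m)  # maps each root to its multiplicity
print(roots)            # m = 0 and m = 2 simple, m = 1 double
```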
For $a_1 \neq 0$, the recursion relation is $3 a_1 + 3 a^*_1 = 0$, which implies that $a_1$ is pure imaginary. Thus, the solution is $\eta = A i e^{i \omega t}$ for $A \in \mathbb{R}$, which (taking the real and imaginary parts) yields the solution you found: $u(t) = - A \sin(\omega t)$, $v(t) = A \cos (\omega t)$.
For the pair $\{a_0, a_2\}$, we have $$a_0 = - 3 a_2^*.$$ Thus, we have a general solution of the form $$ \eta(t) = a_0 - \frac{1}{3} a_0^* e^{2 i \omega t} $$ for an arbitrary complex number $a_0$. Expressing this in terms of two real coefficients $a_0 = A + iB$, the solution then becomes $$ u(t) = A\left( 1 - \frac{1}{3} \cos (2 \omega t)\right) - \frac{B}{3} \sin (2 \omega t), \\ v(t) = B\left( 1 + \frac{1}{3} \cos (2 \omega t) \right) - \frac{A}{3} \sin (2 \omega t). \\ $$
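As a consistency check, this solution for the $\{a_0, a_2\}$ pair can be substituted back into the ODE symbolically (a sketch assuming `sympy` is available; `omega`, `A`, `B` are the symbols used above):

```python
import sympy as sp

t, w = sp.symbols('t omega', real=True, positive=True)
A, B = sp.symbols('A B', real=True)
a0 = A + sp.I * B

# General solution of the {a_0, a_2} pair: eta = a_0 - (1/3) conj(a_0) e^{2 i w t}
eta = a0 - sp.Rational(1, 3) * sp.conjugate(a0) * sp.exp(2 * sp.I * w * t)

# Residual of the ODE: 0 = -(2/w^2) eta'' + 3 e^{2 i w t} conj(eta) + eta
residual = (-2 / w**2) * sp.diff(eta, t, 2) \
           + 3 * sp.exp(2 * sp.I * w * t) * sp.conjugate(eta) + eta

print(sp.simplify(residual))  # 0: the ansatz solves the ODE for any a_0
```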
The solutions found by Robert Israel correspond to $B = -3, A = 0$ and $A = -3, B = 0$ respectively.
These three independent solutions, and linear combinations thereof, are the only possible solutions with a period of $2\pi/\omega$.