[Math] Fixed points of a dynamical system

Tags: dynamical systems, ordinary differential equations

My understanding was that a fixed point of a dynamical system $\dot{x} = f(x)$ is a point $x$ such that $f(x) = 0$, and that if an (autonomous) system starts at a fixed point, it stays there.

But, for instance, the system defined by $\dot{x} = 2x^{\frac{1}{2}}$ with initial condition $x(0) = 0$ has the solution $x(t) = t^{2}$, which certainly does not stay at $0$, even though $2x^{\frac{1}{2}} = 0$ when $x = 0$. So where is my thinking going wrong?

Best Answer

Staying at the fixed point is surely one possible solution of the ODE. But to conclude that the system must stay at the fixed point, you need to know that the solution is unique. (The standard sufficient condition for this is that the right-hand side is a Lipschitz continuous function of $x$; see the Picard–Lindelöf theorem.)

Your example is a classical example of an ODE with non-unique solution. (The right-hand side $2 \sqrt{x}$ is not Lipschitz at $x=0$.)
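As a quick illustration (not part of the original answer), a symbolic check with sympy confirms that both the constant solution $x(t) \equiv 0$ and $x(t) = t^2$ satisfy $\dot{x} = 2\sqrt{x}$ with $x(0) = 0$:

```python
import sympy as sp

t = sp.symbols("t", nonnegative=True)

# Two distinct solutions of x' = 2*sqrt(x) with x(0) = 0:
# the constant solution x(t) = 0 and the escaping solution x(t) = t**2.
for x in (sp.Integer(0), t**2):
    residual = sp.simplify(sp.diff(x, t) - 2 * sp.sqrt(x))
    print(f"x(t) = {x}:  x' - 2*sqrt(x) = {residual}")
```

Both residuals simplify to $0$, so the initial value problem really does have two different solutions, which is exactly what the failure of the Lipschitz condition at $x = 0$ permits.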
