[Math] Difference between being “stable” and being “asymptotically stable”

dynamical systems, mathematical modeling

I've learned various definitions concerning the stability of an equilibrium point of a dynamical system.

There are two definitions that I have a hard time distinguishing: what does it mean for an equilibrium point to be "stable" versus "asymptotically stable"?

The simple definitions given for each are the following:

An equilibrium point is said to be stable if, for any initial value sufficiently close to the equilibrium point, the solution stays close to the equilibrium point for all time.


An equilibrium point is said to be asymptotically stable if, for any initial value sufficiently close to the equilibrium point, the solution converges to the equilibrium point.
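(For reference, I believe these are informal versions of the standard $\varepsilon$–$\delta$ definitions, which, as far as I know, read as follows for an equilibrium $x^*$:)

$$\text{stable:}\quad \forall\,\varepsilon > 0\ \exists\,\delta > 0:\ \|x(0) - x^*\| < \delta \implies \|x(t) - x^*\| < \varepsilon \text{ for all } t \ge 0,$$

$$\text{asymptotically stable:}\quad \text{stable, and } \exists\,\delta > 0:\ \|x(0) - x^*\| < \delta \implies \lim_{t \to \infty} x(t) = x^*.$$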

It definitely looks as if being asymptotically stable implies being stable. However, my confusion is whether the converse holds: does being stable imply being asymptotically stable?

"Eventually staying close" seems to be very similar to "converging" in terms of what they mean, but then why do they have different definitions?

Best Answer

No, stable does not imply asymptotically stable. An example of an equilibrium that is stable but not asymptotically stable is the origin for the system

$$\begin{aligned}\dot{x} &= y\\ \dot{y} &= -x\end{aligned}$$

The trajectories are circles centred at the origin. You always keep the same distance to the origin, so if you start close you stay close. But you never converge to the origin (unless you're already there).
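One quick way to verify that the trajectories are circles: differentiate the squared distance from the origin along a solution,

$$\frac{d}{dt}\left(x^2 + y^2\right) = 2x\dot{x} + 2y\dot{y} = 2xy + 2y(-x) = 0,$$

so $x^2 + y^2$ is constant in time and every solution stays on the circle through its initial point. (Explicitly, $x(t) = x_0\cos t + y_0\sin t$, $y(t) = -x_0\sin t + y_0\cos t$.)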

You might enjoy looking at these notes (where "attractor" is used instead of "asymptotically stable").
