Lyapunov stability of autonomous system

control theory · dynamical systems · linear algebra · linear-control

Consider a discrete-time system of the form

$$x(k+1) = Ax(k)$$

where $x \in \mathbb R^{n \times 1}$ and $A \in \mathbb R^{n \times n}$. This system is (globally) asymptotically stable if all eigenvalues of $A$ lie strictly inside the unit circle. In that case the origin is the only equilibrium point, and $x(k) \to x_e = 0$ as $k \to \infty$ (where $x_e$ denotes the equilibrium point) for every initial condition.
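The eigenvalue condition is easy to check and verify numerically. A minimal sketch with a hypothetical $2 \times 2$ matrix $A$ whose eigenvalues ($0.5$ and $-0.3$) lie strictly inside the unit circle:

```python
import numpy as np

# Hypothetical upper-triangular system matrix with eigenvalues
# 0.5 and -0.3, both strictly inside the unit circle.
A = np.array([[0.5, 1.0],
              [0.0, -0.3]])

# Spectral radius < 1, so the origin is globally asymptotically stable.
assert np.all(np.abs(np.linalg.eigvals(A)) < 1)

# Simulate x(k+1) = A x(k) from an arbitrary initial state.
x = np.array([10.0, -7.0])
for _ in range(200):
    x = A @ x

print(np.linalg.norm(x))  # essentially zero after 200 steps
```

Any other initial state gives the same qualitative result: the trajectory decays to the origin.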

What if $x_e \neq 0$? In that case, at least one eigenvalue must lie on the unit circle, and $x \to x_e \neq 0$ as $k \to \infty$. Please correct me if I am wrong.

In this case, the equilibrium point changes with the initial point $x_0 \in \mathbb R^{n \times 1}$: the initial point $x_{0,1}$ leads to $x_{e,1}$, while $x_{0,2} \to x_{e,2}$, and so on (also $x_0 = 0 \to x_e = 0$).

So we can say that a set of initial values converges to a set of equilibrium points. Since every initial point always converges to some particular equilibrium point, shouldn't the system be "asymptotically" stable?

In other words, how can I prove asymptotic stability when there is no fixed equilibrium point (it depends on the initial values) and an eigenvalue lies on the unit circle?

Best Answer

Note that having all but one eigenvalue inside the unit circle and one on it isn't enough to guarantee $x(k)\to x^*\neq0$ as $k\to\infty$. For example, $x(k+1)=-x(k)$ keeps changing sign for any nonzero initial condition. A similar thing can be shown with a complex-conjugate pair of eigenvalues with nonzero imaginary part. Your statement only holds when the eigenvalue on the unit circle is equal to one.
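The sign-flipping behaviour is easy to see numerically. A minimal sketch of the scalar system $x(k+1) = -x(k)$, whose single eigenvalue $-1$ sits on the unit circle:

```python
# Scalar system x(k+1) = -x(k): the eigenvalue is -1, on the unit
# circle, yet no nonzero trajectory converges to any point.
x0 = 3.0
traj = [x0]
for _ in range(5):
    traj.append(-traj[-1])

print(traj)  # [3.0, -3.0, 3.0, -3.0, 3.0, -3.0]
```

The trajectory oscillates forever between $x_0$ and $-x_0$, so there is no limit $x^*$ at all, let alone a nonzero one.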

In order for an equilibrium point to be asymptotically stable, all neighboring solutions should eventually converge to that same equilibrium point, which isn't the case here. For example, for $x(k+1)=x(k)$ only the solution starting at $x(0)=a\in\mathbb{R}$ "converges" to $x^*=a$.
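This can be illustrated by simulating $x(k+1) = x(k)$ (eigenvalue exactly one) from a few different initial conditions:

```python
# Scalar system x(k+1) = x(k): the eigenvalue is exactly 1, so every
# point is an equilibrium. Trajectories starting at different initial
# conditions stay at different points forever; no single equilibrium
# attracts its neighbours.
finals = []
for x0 in [-1.0, 0.0, 2.5]:
    x = x0
    for _ in range(100):
        x = 1.0 * x   # dynamics with A = [1]
    finals.append(x)

print(finals)  # [-1.0, 0.0, 2.5] — each trajectory stays put
```

Neighbouring solutions never approach each other, so no equilibrium is asymptotically stable.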

However, there are many different definitions of stability. One such definition that does apply to the systems you describe is Lyapunov stability.
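Lyapunov stability only asks that trajectories starting close to an equilibrium stay close to it forever, not that they converge to it. A quick numerical check of this weaker property for $x(k+1) = x(k)$, where choosing $\delta = \varepsilon$ works because the state never moves (the specific values below are arbitrary):

```python
# For x(k+1) = x(k), every equilibrium x* is Lyapunov stable:
# ||x(k) - x*|| = ||x(0) - x*|| for all k, so delta = epsilon suffices.
x_star = 2.0        # an arbitrary equilibrium of this system
epsilon = 0.1
delta = epsilon

x = x_star + 0.05   # start within delta of the equilibrium
for _ in range(50):
    x = x           # dynamics: x(k+1) = x(k)
    assert abs(x - x_star) < epsilon  # trajectory never leaves the ball
```

So the systems in the question are Lyapunov stable at each equilibrium, just not asymptotically stable.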
