Stability of autonomous linear systems of ODEs

Let $A$ be an $n\times n$ real matrix, and let's consider the linear system of ODEs $x'=Ax$.

I'm trying to characterize the Lyapunov stability of the origin in terms of the real parts of the eigenvalues of $A$. Gerald Teschl's Ordinary Differential Equations and Dynamical Systems, p. 68, reads:

0 is stable iff every eigenvalue $\lambda$ of $A$ satisfies $Re(\lambda)\leq 0$, and for all eigenvalues such that $Re(\lambda)=0$ the corresponding algebraic and geometric multiplicities are equal.

0 is asymptotically stable iff every eigenvalue $\lambda$ of $A$ satisfies $Re(\lambda)<0$.
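
For concreteness, all three regimes already occur for $2\times 2$ matrices:

$$A_1=\begin{pmatrix}-1&0\\0&-1\end{pmatrix},\qquad A_2=\begin{pmatrix}0&1\\-1&0\end{pmatrix},\qquad A_3=\begin{pmatrix}0&1\\0&0\end{pmatrix}.$$

$A_1$ has the eigenvalue $-1$, so the origin is asymptotically stable; $A_2$ has eigenvalues $\pm i$ with equal algebraic and geometric multiplicities, and $e^{A_2t}$ is a rotation matrix, so the origin is stable but not asymptotically stable; $A_3$ has the eigenvalue $0$ with algebraic multiplicity $2$ but geometric multiplicity $1$, and $e^{A_3t}=\begin{pmatrix}1&t\\0&1\end{pmatrix}$ is unbounded, so the origin is unstable.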

I'm not convinced at all by his proof; I find it somewhat sketchy, and the author is excessively chatty for my taste. So what I ask in this question is

1) a full proof of this statement.

Here is what I have tried so far. I've found the following

Lemma: if $\lambda=a+ib$ is an eigenvalue of $A$, then there exist $u,v\in \mathbb{R}^n$ non-zero such that (1): $e^{At}u=e^{at} (\cos(bt)u+\sin(bt)v)$, $e^{At}v=e^{at}(\cos(bt)v-\sin(bt)u)$.

which I was not able to prove, so I also ask of you

2) a full proof of this lemma.

Using this lemma, it's not hard to prove that if $0$ is stable, then every eigenvalue $\lambda$ of $A$ must satisfy $Re(\lambda)\leq 0$.

Indeed, suppose there exists an eigenvalue $\lambda=a+ib$ of $A$ with $a>0$. Then, by the lemma, there are vectors $u,v$ satisfying (1). By scaling equation (1) we can get vectors $u,v$ of arbitrarily small norm that satisfy (1). So we get solutions arbitrarily close to the origin of the form $e^{at} (\cos(bt)u+\sin(bt)v)$. Since $a>0$, the norm of this solution tends to infinity along a sequence of times tending to infinity, so $0$ would be unstable.
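
To make the divergence explicit, assume $b\neq 0$ (if $b=0$ the solution is just $e^{at}u$): at the times $t_k=2\pi k/|b|$ we have $\cos(bt_k)=1$ and $\sin(bt_k)=0$, so

$$\|e^{At_k}u\|=e^{at_k}\|u\|\to\infty \quad\text{as } k\to\infty,$$

since $a>0$ and $u\neq 0$. Hence solutions starting arbitrarily close to the origin leave every bounded set, contradicting stability.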

Best Answer

Let me give you some hints; I hope they help. For the lemma: if $\lambda$ is an eigenvalue, then there is a complex eigenvector. Your $u$ and $v$ are exactly the real and imaginary parts of such an eigenvector.
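
In more detail, here is a sketch. Since $A$ is real, $\bar\lambda=a-ib$ is an eigenvalue together with $\lambda$; to match the signs in your equation (1), pick an eigenvector $w=u+iv$ of $\bar\lambda$ with $u,v\in\mathbb{R}^n$. Then

$$e^{At}w=e^{\bar\lambda t}w=e^{at}\bigl(\cos(bt)-i\sin(bt)\bigr)(u+iv)=e^{at}\Bigl[\bigl(\cos(bt)\,u+\sin(bt)\,v\bigr)+i\bigl(\cos(bt)\,v-\sin(bt)\,u\bigr)\Bigr].$$

Because $A$ is real, $e^{At}$ is a real matrix, so it maps real parts to real parts and imaginary parts to imaginary parts; comparing the two sides gives exactly (1). Finally, $w\neq 0$, so $u$ and $v$ are not both zero, and if $b\neq 0$ then both are non-zero: if, say, $v=0$, then $Au=(a-ib)u$ with $u$ real, and comparing imaginary parts forces $bu=0$.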

For the theorem: it is easy if you allow a bit more linear algebra. So assume that everything is complex. Then we may assume without loss of generality that the matrix $A$ is in Jordan normal form.
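
Indeed, if $A=SJS^{-1}$ with $J$ in Jordan normal form, then

$$e^{tA}=S\,e^{tJ}\,S^{-1},\qquad \|e^{tA}\|\le\|S\|\,\|S^{-1}\|\,\|e^{tJ}\|,$$

and the same estimate holds with $A$ and $J$ interchanged. Hence $t\mapsto\|e^{tA}\|$ is bounded (respectively, tends to $0$) if and only if $t\mapsto\|e^{tJ}\|$ is, so both notions of stability are invariant under the change of basis.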

The solution of the ODE can be represented in general using the matrix exponential: if the initial value is $x_0\in\mathbb{R}^n$, then the solution will be $$x(t)=e^{At}x_0.$$ The exponential is usually defined by the power series.

Now, if your matrix is in Jordan normal form, then you can evaluate this power series explicitly. More precisely, for a single Jordan block $$A=\begin{pmatrix} \lambda & 1 & 0 & \cdots & 0 \\ 0 & \lambda & 1 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & \cdots & & \lambda & 1 \\ 0 & \cdots & & \cdots & \lambda\end{pmatrix} = \lambda I + N,$$ where $N$ is nilpotent, one has $$e^{tN} = \begin{pmatrix} 1 & t & \frac{t^2}{2} & \cdots & \frac{t^{n-1}}{(n-1)!} \\ 0 & 1 & t & \cdots & \frac{t^{n-2}}{(n-2)!} \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & \cdots & & \cdots & 1 \end{pmatrix},$$ and, since $\lambda I$ and $N$ commute, $$e^{tA} = e^{t\lambda}e^{tN}.$$
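
It may help to record the resulting growth estimate for a single block (for $t\ge 0$):

$$\|e^{tA}\|=e^{(\Re\lambda)t}\,\|e^{tN}\|\le e^{(\Re\lambda)t}\sum_{j=0}^{n-1}\frac{t^j}{j!}\,\|N\|^j,$$

so $\|e^{tA}\|$ is at most a polynomial in $t$ times $e^{(\Re\lambda)t}$. If $\Re\lambda<0$, the exponential decay beats the polynomial growth; if $\Re\lambda=0$, then $|e^{t\lambda}|=1$ and $e^{tA}$ is bounded if and only if $N=0$, i.e. if and only if the block has size $1$.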

You can see now that

  • If $\Re\lambda<0$ for every eigenvalue, then you have asymptotic stability.
  • If $\Re\lambda>0$ for some eigenvalue, then you have no hope of stability.
  • Hence, it is necessary to have $\Re\lambda\leq 0$ for every eigenvalue for stability.
  • You can see immediately from the formulae that if $\Re\lambda=0$, then you cannot allow a Jordan block of size greater than $1$, because polynomial terms appear in the exponential (see the worked example right after this list). Since the algebraic and geometric multiplicities of an eigenvalue agree exactly when all of its Jordan blocks have size $1$, this proves the point about the multiplicities.
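
For instance, for a single block of size $2$ with a purely imaginary eigenvalue $\lambda=ib$,

$$e^{tA}=e^{ibt}\begin{pmatrix}1&t\\0&1\end{pmatrix},\qquad \left\|e^{tA}\begin{pmatrix}0\\1\end{pmatrix}\right\|=\sqrt{1+t^2}\to\infty \quad\text{as } t\to\infty,$$

because $|e^{ibt}|=1$: nothing damps the polynomial factor, so the origin is unstable.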

I hope this helped.