[Math] Notions of stability for differential equations

analysis, dynamical systems, ordinary differential equations

Consider a system of differential equations $$\dot{x} = f(x,u)$$ $$y = h(x,u)$$
where $x(t), u(t)$ are vectors in some $\mathbb{R}^n$. We define the infinity norm of a function in more-or-less the usual way, $$||z||_{\infty} = \sup_{t \geq 0} ||z(t)||_{\infty},$$ where the infinity norm of a vector is the largest absolute value of its entries.
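(For concreteness, here is a tiny numerical sketch of the two nested infinity norms, not part of the original question: the inner norm runs over the vector entries, the outer supremum over time, approximated here by sampling.)

```python
# Sketch: approximate ||z||_inf for a sampled trajectory z(t) in R^2.
import numpy as np

t = np.linspace(0.0, 50.0, 5001)                 # sample grid standing in for t >= 0
z = np.stack([np.exp(-t) * np.cos(t),            # an arbitrary example trajectory
              np.sin(t) / (1.0 + t)])
vec_inf = np.abs(z).max(axis=0)                  # ||z(t)||_inf at each sample time
print(vec_inf.max())                             # ~ sup_{t>=0} ||z(t)||_inf = ||z||_inf
```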

This system of differential equations is called BIBO (bounded-input, bounded-output) stable if every $u$ with bounded infinity norm results in $y$ with bounded infinity norm, regardless of the initial condition $x(0)$. It is called ${\mathcal L}_{\infty}$ stable if there exist (finite-valued) functions $g: \mathbb{R} \rightarrow \mathbb{R}$ and $q: \mathbb{R}^n \rightarrow \mathbb{R}$ such that $$||y||_{\infty} \leq g(||u||_{\infty}) + q(x(0)).$$

My question: is it true that a BIBO stable system is ${\mathcal L}_{\infty}$ stable?

This is really a question about compactness: carefully extracting a convergent subsequence, which is where I'm having trouble. If it were true that BIBO stable systems are ${\mathcal L}_{\infty}$ stable, we would need to rule out the possibility that, while every $u$ with bounded infinity norm results in $y$ with bounded infinity norm, there is no uniform bound on how large these norms get.

Motivation: Khalil's textbook Nonlinear Systems has a confusing sentence about these two notions: on page 198 of the third edition,

The definition of $L_{\infty}$ stability is the familiar notion of bounded-input-bounded-output stability; namely, if the system is ${\mathcal L}_{\infty}$ stable, then for every bounded input $u(t)$, the output…is bounded.

The part before the semicolon seems to suggest the two notions are equivalent, while the part afterwards suggests that the implication was meant only in one direction.

Furthermore: does the answer depend on assumptions on $f$ and $h$? For example, the $f,h$ I'd like to apply this to are differentiable, but not Lipschitz over all of $\mathbb{R}^n$. Does it make a difference if we assume these functions are differentiable infinitely many times as well as Lipschitz over $\mathbb{R}^n$?

Best Answer

$\newcommand{\CC}{{\mathbb{C}}}\newcommand{\RR}{{\mathbb{R}}}\newcommand{\ra}{\rightarrow}\newcommand{\ds}{\displaystyle}$The implication is not true without additional conditions on the functions. It is not even true for $f,h$ that do not depend on the input $u$; so the problem is not (only) one of compactness in the space of bounded functions.
Let me first recall a counterexample from stability theory that impressed me very much as a student.

Consider $r : \RR \times [0,1] \ra \RR$ given by $$r(t,a) = \frac{1+2a t^4}{1+t^2+a^3 t^6} \ \ \mbox{ for }t \in\RR, a \in [0,1].$$ Then $r$ is a positive function of class ${\mathcal C}^\infty$ and we have $\ds\lim_{t \ra +\infty} r(t,a) = 0\mbox{ for }a \in [0,1].$ The convergence is not uniform with respect to $a \in [0,1]$, however, because $\ds\lim_{a \ra 0} a^{1/2} r(a^{-3/4},a) = 1.$ Now define $s : \RR^3 \ra \RR^2$ by $$s(t,x) = \left(\frac{1}{r} \frac{\partial r}{\partial t}\right) \left(t,\frac{x_2^2}{x_1^2 + x_2^2}\right) \cdot x \ \ \mbox{ if } x = (x_1, x_2) \neq 0,$$ whereas $s(t,(0,0))=0$. The function $s$ is continuous and satisfies a local Lipschitz condition with respect to $x$ (in a neighborhood of $x=(0,0)$ this is a bit tricky to verify; it is not needed for uniqueness of solutions to the initial value problems anyway). Then the initial value problem $$z' = s(t,z),\ \ z(t_0) = b\mbox{ with }t_0\in\RR, b \in \RR^2$$
has a unique solution and for $b= (b_1,b_2)\neq 0$ this solution is given by $$z(t) = \frac{r(t,a_0)}{r(t_0, a_0)} b \ \ \mbox{ with } \ \ a_0 = \frac{b_2^2}{b_1^2 + b_2^2}.$$ Therefore all solutions of the differential equation tend to $0$ as $t\to+\infty$, but given $\varepsilon,M>0$, $t_0\in\RR$ there exists $b\in\RR^2$, $|b|<\varepsilon$, such that the solution of $z' = s(t,z),\ \ z(t_0) = b$ satisfies $||z||_\infty>M$. It is sufficient to choose first $a_0>0$ small enough such that $r(a_0^{-3/4},a_0)>2M\,r(t_0,a_0)/\varepsilon$, then $B$ with $|B|=1$ such that $a_0=\frac{B_2^2}{B_1^2 + B_2^2}$, and finally $b=\frac{\varepsilon}{2}B$.
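To see the blow-up concretely, one can evaluate the closed-form solution numerically. The following sketch (my addition, assuming NumPy) checks that $\sup_t ||z(t)||_\infty$ grows like $\frac{\varepsilon}{2}a_0^{-1/2}$ as $a_0 \to 0$, even though each individual solution is bounded and tends to $0$:

```python
# Sketch: numerical check of the counterexample, using the explicit solution
# z(t) = r(t, a0) / r(t0, a0) * b with t0 = 0 and |b| = eps/2.
import numpy as np

def r(t, a):
    return (1 + 2 * a * t**4) / (1 + t**2 + a**3 * t**6)

t0, eps = 0.0, 0.1
t = np.logspace(-2, 7, 2_000_000)    # log-spaced grid; the peak sits near a0**(-3/4)
for a0 in [1e-2, 1e-4, 1e-6]:
    b = (eps / 2) * np.array([np.sqrt(1 - a0), np.sqrt(a0)])  # b2^2/(b1^2+b2^2) = a0
    sup_z = (r(t, a0) / r(t0, a0)).max() * np.abs(b).max()
    print(f"a0 = {a0:.0e}:  sup_t ||z(t)||_inf = {sup_z:.2f}")  # ~ (eps/2)/sqrt(a0)
```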

Now we adapt this example to the given $x,y$-system; more precisely, to one independent of $u$. With the function $s$ defined above, we put essentially $t=x_3$: $$f:\RR^3\to\RR^3,\ f(x_1,x_2,x_3)=(s(x_3,(x_1,x_2)),1)\mbox{ and }h:\RR^3\to\RR,\ h(x_1,x_2,x_3)=x_1^2+x_2^2.$$ The solution of $\dot x=f(x)$, $x(0)=x_0=(x_{01},x_{02},x_{03})$ then has $x_3(t)=x_{03}+t$, and $z(u)=(x_1(u-x_{03}),x_2(u-x_{03}))$ satisfies $$\frac{dz}{du}=s(u,z),\ \ z(x_{03})=(x_{01},x_{02}).$$ As seen above, for every solution $x(t)$ the output $y(t)=x_1^2(t)+x_2^2(t)=||z(t+x_{03})||_2^2$ is bounded, but given $\varepsilon,M>0$, $x_{03}\in\RR$, there exists $(x_{01},x_{02})\in\RR^2$ of norm smaller than $\varepsilon$ such that the solution of $\dot x=f(x),\ x(0)=x_0,\ y=h(x)$ satisfies $||y||_\infty>M$. Since the system does not depend on $u$, the term $g(||u||_\infty)$ is just a constant, so the desired inequality would force $q$ to be unbounded in every neighborhood of the points $(0,0,x_{03})$; hence no bound of the desired form with a reasonable (say, locally bounded) $q$ can exist.
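One can also integrate the adapted $x$-system directly. Here is a sketch (my addition, assuming NumPy and SciPy), which uses $\frac{1}{r}\frac{\partial r}{\partial t} = \frac{N'}{N}-\frac{D'}{D}$ for $r=N/D$ with $N=1+2at^4$, $D=1+t^2+a^3t^6$:

```python
# Sketch: simulate x' = f(x), y = h(x) and watch ||y||_inf blow up for small x(0).
import numpy as np
from scipy.integrate import solve_ivp

def log_deriv_r(t, a):
    # (1/r) dr/dt for r(t, a) = (1 + 2a t^4) / (1 + t^2 + a^3 t^6)
    N = 1 + 2 * a * t**4
    D = 1 + t**2 + a**3 * t**6
    return 8 * a * t**3 / N - (2 * t + 6 * a**3 * t**5) / D

def f(t, x):
    x1, x2, x3 = x
    n2 = x1**2 + x2**2
    c = log_deriv_r(x3, x2**2 / n2) if n2 > 0 else 0.0   # s(x3, (x1, x2)) = c*(x1, x2)
    return [c * x1, c * x2, 1.0]

a0 = 1e-4
x0 = [0.05 * np.sqrt(1 - a0), 0.05 * np.sqrt(a0), 0.0]  # |(x01, x02)| = 0.05
sol = solve_ivp(f, (0.0, 1e4), x0, rtol=1e-8, atol=1e-12)
y = sol.y[0]**2 + sol.y[1]**2
print(y.max())   # roughly |(x01, x02)|^2 / a0 = 25 here, although y(t) -> 0 as t -> +inf
```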
Of course, the function $f$ in this example is not very smooth, but it seems to me that smoother examples could also be constructed.