[Math] Application of Gronwall Inequality to existence of solutions

integral-inequality, ordinary-differential-equations, proof-verification

Consider the $N$-dimensional autonomous system of ODEs
$$\dot{x}= f(x),$$
where $f(x)$ is defined for all $x \in \mathbb{R}^N$ and satisfies $||f(x)|| \leq \alpha||x||$ for a positive scalar constant $\alpha$, and $||x||$ is the usual Euclidean norm (the square root of the sum of the squared components of the vector). Using Gronwall’s inequality, show that the solution emerging from any point $x_0\in\mathbb{R}^N$ exists for any finite time.

Here is my proposed solution.

We can first rewrite the ODE as an integral equation,

$$x(t) = x_0 + \int_{t_0}^{t} f(x(s))\, ds,$$

where the integration constant is chosen such that $x(t_0)=x_0$. WLOG, assume that $t_0=0$. Then,

\begin{equation}
\begin{split}
||x(t)|| & = \Big|\Big|x_0 + \int_{0}^{t} f(x(s))\, ds\Big|\Big| \\
& \leq ||x_0|| + \int_{0}^{t} ||f(x(s))||\, ds \\
& \leq ||x_0|| + \alpha\int_{0}^{t} ||x(s)||\, ds
\end{split}
\end{equation}

Therefore, by the integral form of Gronwall's inequality, we see that

\begin{equation}
\begin{split}
||x(t)|| & \leq ||x_0|| + \alpha\int_{0}^{t} ||x(s)||\, ds \\
& \leq ||x_0||\, e^{\alpha t}
\end{split}
\end{equation}

So, if we let $M = ||x_0||$, then $||x(t)||\leq M e^{\alpha t}$. Therefore, the solution is bounded on $[0,t]$ for every $t>0$.
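For reference, here is the integral form of Gronwall's inequality being invoked (a standard formulation; the letters $a$ and $b$ are my own notation): if $u$ is continuous and nonnegative on $[0,T]$ and, for constants $a, b \geq 0$,

$$u(t) \leq a + b\int_{0}^{t} u(s)\, ds \quad \text{for all } t\in[0,T],$$

then

$$u(t) \leq a\, e^{bt} \quad \text{for all } t\in[0,T].$$

Above it is applied with $u(t) = ||x(t)||$, $a = ||x_0||$, and $b = \alpha$.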

As $t>0$ was arbitrary, the solution is defined for all positive values of $t$.
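As a quick numerical sanity check of the forward bound (a hedged sketch: the one-dimensional choice $f(x) = \alpha\sin(x)$ is my own example, which satisfies $||f(x)|| \leq \alpha||x||$ since $|\sin x| \leq |x|$):

```python
import math

# Hypothetical example f with |f(x)| <= ALPHA * |x| in one dimension.
ALPHA = 0.8

def f(x):
    return ALPHA * math.sin(x)

def euler(x0, t_end, n_steps=100_000):
    """Integrate x' = f(x) on [0, t_end] with the forward Euler method."""
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        x += dt * f(x)
    return x

x0, T = 2.0, 5.0
xT = euler(x0, T)
bound = abs(x0) * math.exp(ALPHA * T)  # Gronwall bound M * e^{alpha * T}
print(xT, bound)
assert abs(xT) <= bound  # the solution stays inside the Gronwall envelope
```

The bound is very loose here (the solution actually saturates near $\pi$), which is expected: Gronwall only rules out faster-than-exponential growth.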

We can then analyze what happens for negative values of $t$ by reversing time and applying the same argument.

Once again assume that $t_0=0$, and now take $t<0$. The integral equation reads

$$x(t) = x_0 - \int_{t}^{0} f(x(s))\, ds.$$

Therefore,

\begin{equation}
\begin{split}
||x(t)|| & = \Big|\Big|x_0 - \int_{t}^{0} f(x(s))\, ds\Big|\Big| \\
& \leq ||x_0|| + \int_{t}^{0} ||f(x(s))||\, ds \\
& \leq ||x_0|| + \alpha\int_{t}^{0} ||x(s)||\, ds \\
& \leq ||x_0||\, e^{\alpha(0-t)} \\
& = M e^{\alpha|t|},
\end{split}
\end{equation}

where the Gronwall step follows by applying the forward argument to $y(\tau) = x(-\tau)$, which satisfies $\dot{y} = -f(y)$ with $||{-f(y)}|| \leq \alpha||y||$.

So, the solution is bounded on $[t,0]$ for every $t<0$.

Combining these two bounds, we see that the solution emerging from any point $x_0\in\mathbb{R}^N$ exists for any finite time.
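The time-reversal step can be illustrated numerically as well: integrating $\dot{y} = -f(y)$ forward in $\tau$ tracks $x(-\tau)$, and the same Gronwall bound should hold (again a sketch, with the hypothetical one-dimensional choice $f(x) = \alpha\sin(x)$):

```python
import math

# Hypothetical example f with |f(x)| <= ALPHA * |x| in one dimension.
ALPHA = 0.8

def f(x):
    return ALPHA * math.sin(x)

def euler_reversed(x0, tau_end, n_steps=100_000):
    """Integrate y' = -f(y) on [0, tau_end]; then y(tau) tracks x(-tau)."""
    dt = tau_end / n_steps
    y = x0
    for _ in range(n_steps):
        y -= dt * f(y)
    return y

x0, T = 2.0, 5.0
x_minus_T = euler_reversed(x0, T)  # approximates x(-T)
bound = abs(x0) * math.exp(ALPHA * T)  # M * e^{alpha * |t|} at t = -T
print(x_minus_T, bound)
assert abs(x_minus_T) <= bound  # backward solution also stays in the envelope
```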

Is this approach correct? Please let me know if there are any better alternatives.

Best Answer

As explained in the comments, the proposed answer only shows that the solution is bounded on any finite interval $[t_1,t_2]$, where $t_1,t_2\in\mathbb{R}$. We also need to show that the solution can actually be extended to any interval of finite length.

To do this, consider Lemma $2.14$ on page $52$ of Teschl's *Ordinary Differential Equations and Dynamical Systems*.

$\textbf{Lemma 2.14:}$ Let $\phi(t)$ be a solution of $(2.10)$ defined on the interval $(t_-,t_+)$. Then there exists an extension to the interval $(t_-,t_+ + \epsilon)$ for some $\epsilon > 0$ if and only if there exists a sequence $t_m\in(t_-,t_+)$ such that

$$\lim_{m\to\infty}(t_m,\phi(t_m))=(t_+,y)\in{U}. $$

The analogous statement holds for an extension to $(t_- - \epsilon,t_+).$

As $||x(t)||\leq M e^{\alpha|t|}$, the solution stays in a compact ball on any finite time interval. Therefore, by Lemma $2.14$ (combined with the Bolzano–Weierstrass theorem, which furnishes the required convergent sequence), we can extend the solution to any interval of finite length.

An alternative argument would be using Corollary $2.16$. I don't like the way Corollary $2.16$ is phrased and have decided to directly apply Lemma $2.14$ instead.
