[Math] Using contraction mapping theorem to prove existence/uniqueness of solutions of Linear first order ODEs

ordinary differential equations

In class we used the contraction mapping theorem to prove the existence and uniqueness of solutions to a first-order (not necessarily linear) ODE on some interval $[0,h]$.

The method we used was this: First convert the ODE into an integral equation of the form $u(t)=f(t)+\int_0^t k(t,s)\,g(s,u(s))\,ds$, and define an operator $(Tu)(t)=f(t)+\int_0^t k(t,s)\,g(s,u(s))\,ds$. Then use the contraction mapping theorem to ensure $T$ has a fixed point (and thus the integral equation is uniquely solved) on some small enough interval $[0,h]$.
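For intuition, the fixed-point convergence can be seen numerically. Below is a minimal sketch (not part of the proof) for the example $u'=u$, $u(0)=1$, whose integral equation is $u(t)=1+\int_0^t u(s)\,ds$ with fixed point $e^t$; the integral is approximated by the trapezoid rule on a grid:

```python
import numpy as np

# Picard iterates for u' = u, u(0) = 1, i.e. (Tu)(t) = 1 + \int_0^t u(s) ds.
# Each application of T brings the iterate closer to the fixed point e^t.
t = np.linspace(0.0, 0.5, 201)
u = np.ones_like(t)                      # initial guess u_0 = 1
errors = []
for _ in range(8):
    # cumulative trapezoid-rule integral from 0 to each grid point
    integral = np.concatenate(([0.0], np.cumsum(
        0.5 * (u[1:] + u[:-1]) * np.diff(t))))
    u = 1.0 + integral                   # u_{n+1} = T u_n
    errors.append(np.max(np.abs(u - np.exp(t))))
print(errors)  # sup-norm errors shrink geometrically: T is a contraction
```

The successive iterates are the Taylor partial sums $1,\ 1+t,\ 1+t+t^2/2,\ \dots$, so the sup-norm error on $[0,0.5]$ decays at least by the factor $h=0.5$ per iteration, as the contraction estimate predicts.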

My question is this:

Suppose I restrict the ODE to be a linear ODE only. That is, the operator is now
$(Tu)(t)=f(t)+\int_0^t k(t,s)\,u(s)\,ds$. The contraction mapping argument above still gives us a unique solution on $[0,h]$. Using this fact, how can I show that there is a unique solution on $[h,2h]$ and, therefore, on every interval $[0,K]$?
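For concreteness, the continuation idea can be sketched numerically (a toy illustration, not a proof): solve on $[0,h]$ by Picard iteration, then restart on $[h,2h]$ using the endpoint value as the new initial value. The example below takes the linear equation $u'=u$, $u(0)=1$, with $h=0.5$:

```python
import numpy as np

def solve_picard(a, t, u0, n_iter=40):
    """Solve u' = a(t) u, u(t[0]) = u0, on the grid t by Picard iteration
    on the integral equation u(t) = u0 + \int_{t[0]}^t a(s) u(s) ds."""
    u = np.full_like(t, u0)
    for _ in range(n_iter):
        integrand = a(t) * u
        # cumulative trapezoid-rule integral from t[0] to each grid point
        integral = np.concatenate(([0.0], np.cumsum(
            0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
        u = u0 + integral
    return u

# u' = u with u(0) = 1; solve on [0, 0.5], then restart on [0.5, 1.0]
h = 0.5
t1 = np.linspace(0.0, h, 201)
u1 = solve_picard(lambda s: np.ones_like(s), t1, 1.0)
t2 = np.linspace(h, 2 * h, 201)
u2 = solve_picard(lambda s: np.ones_like(s), t2, u1[-1])
print(u2[-1])  # endpoint value approximates e^1, the exact solution at t = 1
```

The two pieces glue together to the exact solution $e^t$; the point of the question is why the same step length $h$ keeps working on every subinterval in the linear case.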

Best Answer

Consider the initial value problem for a first-order ODE $$ y'=g(t,y),\quad y(t_0)=y_0. $$ This is transformed into the integral equation $$ y(t)=y_0+\int_{t_0}^t g(s,y(s))\,ds. $$ If $g$ is continuous and Lipschitz in the $y$ variable, the contraction principle shows that there is a unique solution on an interval $[t_0,t_1]$, $t_1>t_0$. The length of the interval, $t_1-t_0$, depends on $g$ and on the initial value $y_0$.

One can apply the same procedure with initial time $t_1$ (and initial value $y(t_1)$) and extend the solution to a larger interval $[t_0,t_2]$. In general $t_2-t_1$ may be smaller than $t_1-t_0$, and the step lengths could in principle shrink so fast that the solution never reaches a given time. But if $g$ satisfies a global bound of the form $|g(t,y_1)-g(t,y_2)|\le M\,|y_1-y_2|$, then the length of the interval on which the contraction argument works is independent of the initial value, so the solution extends by steps of a fixed length to any bounded interval. This happens in particular for linear equations, where $g(t,y)=a(t)\,y$ and one can take $M=\sup|a|$ on the interval of interest.
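To make the last step explicit (a standard estimate, in the notation above): for two continuous functions $u,v$ on $[t_0,t_0+h]$, the Picard operator $(Ty)(t)=y_0+\int_{t_0}^t g(s,y(s))\,ds$ satisfies
$$ |(Tu)(t)-(Tv)(t)| = \Big|\int_{t_0}^t \big(g(s,u(s))-g(s,v(s))\big)\,ds\Big| \le h\,M\,\sup_{s\in[t_0,t_0+h]}|u(s)-v(s)|, $$
so $T$ is a contraction in the sup norm whenever $h<1/M$. Since this condition involves only $M$ and $h$, and not $y_0$, the continuation can always use steps of the same length $h$, and finitely many steps cover any interval $[0,K]$.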