Linear first-order systems of ODEs – Proving linear dependence of a set of vector-valued solutions on an entire interval from a single point

linear algebra, ordinary differential equations, systems of equations

In my Analysis course at university, the professor's textbook describes linear first-order systems of ordinary differential equations, and more precisely, the solutions to the homogeneous equation of the form

$$\vec{Y}\,'(t)=A(t)\vec{Y}(t)$$

where $\vec{Y}(t)$ is a vector of $n$ components that are each a function of $t$. Now, the book states that given $n$ solutions to this equation, $\vec{Y}_1(t),\dots,\vec{Y}_n(t)$, one can prove that these solutions are linearly dependent or linearly independent on an entire interval $I$ based on just a single $t_0\in I$ at which the solutions are one or the other. I have spent quite some time trying to wrap my head around the proof, and I can't seem to do it, so I was wondering if anyone could reformulate it, or perhaps point me to appropriate search terms so that I can do my own further reading on the internet.

The proof treats the case of linear dependence, and goes as follows:

Suppose $\vec{Y}_1(t),\dots,\vec{Y}_n(t)$ are solutions to the homogeneous differential equation above, and for a certain $t_0 \in I$, it holds that

$$\sum_{i=1}^nc_i\;\vec{Y}_i(t_0)=\vec0$$

with not all $c_i=0$. Then one can define a vector function $\vec{Y}_{total}(t)$ equal to this linear combination, which – by linearity – also satisfies the differential equation above. This same $\vec{Y}_{total}(t)$ can then be seen as the solution to the initial value problem $\vec{Y}\,'(t)=A(t)\vec{Y}(t)$ with $\vec{Y}_{total}(t_0)=\vec0$. *Thus, because of the uniqueness of a solution, it must be true that this $\vec{Y}_{total}(t)$ is the trivial solution*, proving the linear dependence of $\vec{Y}_1(t),\dots,\vec{Y}_n(t)$ on the entire interval $I$.
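If I spell it out for myself, the linearity step is just the computation

$$\vec{Y}_{total}\,'(t)=\sum_{i=1}^n c_i\,\vec{Y}_i\,'(t)=\sum_{i=1}^n c_i\,A(t)\vec{Y}_i(t)=A(t)\sum_{i=1}^n c_i\,\vec{Y}_i(t)=A(t)\,\vec{Y}_{total}(t),$$

which I can follow without trouble.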

It is this final step in italics that I am just not getting. How does "uniqueness of a solution" come into play here? It baffles me every time I read it.

Best Answer

Let's assume $A$ is a continuous matrix function defined on some interval $[a,b]$ and you are interested in solutions to the linear first-order system $\vec{Y}'(t) = A(t) \vec{Y}(t)$. The basic uniqueness result tells you that if you fix a certain point $t_0 \in [a,b]$ and you have two (continuously differentiable) solutions $\vec{Y}_1,\vec{Y}_2$ to the ODE which satisfy $\vec{Y}_1(t_0) = \vec{Y}_2(t_0)$ (that is, they both satisfy the same initial conditions at $t = t_0$), then they must agree on all points of the interval $[a,b]$.
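(In case it helps, here is a sketch of one standard way to prove this uniqueness result for linear systems, via Grönwall's inequality: if $\vec{Y}_1(t_0)=\vec{Y}_2(t_0)$, set $\vec{W}(t)=\vec{Y}_1(t)-\vec{Y}_2(t)$, so that $\vec{W}'(t)=A(t)\vec{W}(t)$ and $\vec{W}(t_0)=\vec{0}$. Integrating, for $t\ge t_0$,

$$\|\vec{W}(t)\|=\left\|\int_{t_0}^{t}A(s)\vec{W}(s)\,ds\right\|\le\int_{t_0}^{t}\|A(s)\|\,\|\vec{W}(s)\|\,ds,$$

and Grönwall's inequality then forces $\|\vec{W}(t)\|=0$; the case $t\le t_0$ is symmetric. Hence $\vec{Y}_1\equiv\vec{Y}_2$ on $[a,b]$.)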

Given this uniqueness result, the logic of the proof is as follows:

  1. Assume you have $n$ solutions $\vec{Y}_1,\dots,\vec{Y}_n$ to the equation which are linearly dependent at $t = t_0$ so that $$ \sum_{i=1}^n c_i \vec{Y}_i(t_0) = \vec{0} $$ for some $c_1,\dots,c_n$, not all zero.
  2. Define $\vec{Y}(t) = \sum_{i=1}^n c_i \vec{Y}_i(t)$ and $\vec{Z}(t) \equiv \vec{0}$ (the trivial solution). By the linearity of the ODE, both $\vec{Y}$ and $\vec{Z}$ satisfy it; that is, $$ \vec{Y}'(t) = A(t)\vec{Y}(t), \qquad \vec{Z}'(t) = A(t)\vec{Z}(t). $$
  3. In addition, $\vec{Z}(t_0) = \vec{Y}(t_0) = \vec{0}$. Hence, by uniqueness we must have $\vec{Y}(t) = \vec{Z}(t) = \vec{0}$ for all $t \in [a,b]$, so in fact for all $t_1 \in [a,b]$ we have $$ \vec{Y}(t_1) = \sum_{i=1}^n c_i \vec{Y}_i(t_1) = \vec{0} $$ which shows that $(\vec{Y}_1(t_1), \dots, \vec{Y}_n(t_1))$ are also linearly dependent for all $t_1 \in [a,b]$ (see the numerical sketch below).
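If you want to see this conclusion numerically, here is a minimal sketch using scipy; the matrix $A(t)$, the interval, and the coefficients are illustrative choices of mine, not anything from the original problem. It builds two solutions of a $2\times 2$ system whose initial values are linearly dependent at $t_0$, and checks that the same linear combination vanishes along the whole interval.

```python
# Minimal numerical sketch (illustrative example, not from the original post):
# take a 2x2 system Y' = A(t) Y, build two solutions that are linearly
# dependent at t0, and check that the same combination vanishes on [a, b].
import numpy as np
from scipy.integrate import solve_ivp

def A(t):
    # an arbitrary continuous matrix function on [0, 5]
    return np.array([[0.0, 1.0],
                     [-np.sin(t), 0.1]])

def rhs(t, y):
    return A(t) @ y

# for simplicity take t0 = a; for an interior t0 you would integrate
# from t0 in both directions
t0, a, b = 0.0, 0.0, 5.0
ts = np.linspace(a, b, 201)

# initial values that are linearly dependent at t0: c1*Y1(t0) + c2*Y2(t0) = 0
y1_0 = np.array([1.0, 0.5])
c1, c2 = 2.0, 1.0
y2_0 = -(c1 / c2) * y1_0

sol1 = solve_ivp(rhs, (a, b), y1_0, t_eval=ts, rtol=1e-10, atol=1e-12)
sol2 = solve_ivp(rhs, (a, b), y2_0, t_eval=ts, rtol=1e-10, atol=1e-12)

combo = c1 * sol1.y + c2 * sol2.y  # should be ~0 for every t, by the theorem
print("max |c1*Y1(t) + c2*Y2(t)| on [a, b]:", np.abs(combo).max())
```

The printed maximum should come out near zero (limited only by the integrator's tolerance), exactly as the uniqueness argument predicts.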