[Math] Linear independence of a set of solutions and the Wronskian

Tags: linear-algebra, ordinary-differential-equations, real-analysis

Consider a general $n$th order linear equation
$$x^{(n)}(t)+a_{n-1}(t)x^{(n-1)}(t)+ \dots + a_{1}(t)x'(t) + a_{0}(t)x(t)=0.\tag{$*$}$$
Let $x_1, x_2 , \dots , x_n$ be a fundamental set of solutions of the equation above, and set $W(t)=W(x_1, x_2 , \dots , x_n ; t).$

Question. Show that a set of solutions $x_1 , x_2 , \dots , x_k$ of $(*)$ is linearly independent over $(-\infty, \infty)$ if and only if its Wronskian $W(x_1 , x_2 , \dots , x_k; t_0) \neq 0$ for some $t_0 \in (-\infty, \infty).$ Also show that the set of all solutions forms a vector space of dimension $n$.

My approach: Write the equivalent first-order system by setting
$$y_1=x ,~y_2=x' ,~\dots~,y_n=x^{(n-1)},$$
from which we get
$$y_1'=y_2,~~y_2'=y_3,~~\dots~~,y_{n-1}'=y_n,~~y_n'=-a_{n-1}(t) y_n- \cdots - a_{1}(t) y_2-a_{0}(t) y_1.$$
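For constant coefficients, the system above is $y' = Ay$ with $A$ the companion matrix of $(*)$. A minimal sketch of building that matrix (the function name `companion_matrix` and the list-of-rows representation are illustrative choices, not part of the problem):

```python
def companion_matrix(coeffs):
    """coeffs = [a_0, a_1, ..., a_{n-1}]; returns the n x n companion
    matrix A (as a list of rows) so that y' = A y reproduces the
    first-order system above."""
    n = len(coeffs)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        A[i][i + 1] = 1.0          # row i encodes y_{i+1}' = y_{i+2}
    for j in range(n):
        A[n - 1][j] = -coeffs[j]   # last row: y_n' = -a_0 y_1 - ... - a_{n-1} y_n
    return A

# Example: x'' + x = 0 has a_0 = 1, a_1 = 0, giving (mathematically)
# the matrix [[0, 1], [-1, 0]].
print(companion_matrix([1.0, 0.0]))
```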

For the contrapositive statement: if $W(x_1 , x_2 , \dots , x_k; t_0) = 0$ for some $t_0 \in (-\infty, \infty),$ doesn't that clearly imply that the set of vectors $\{ x_1 , x_2 , \dots , x_k \}$ is linearly dependent?

I'm stuck and cannot progress any further. Any help in proving this is much appreciated.

Best Answer

I'll assume all of the coefficients $a_k$ are continuous on the interval of interest $I$.

Theorem: [Existence and Uniqueness] Let $b_0,b_1,\cdots,b_{n-1}$ be fixed constants, and let $t_0$ be a point in the interval of interest $I$ for the ordinary differential equation stated in your problem. Then there exists a unique solution $x(t)$ defined on $I$ such that $$ x^{(k)}(t_0)=b_k,\;\;\; k=0,1,2,\cdots,n-1. $$ Proof: Use Picard iteration to establish existence and uniqueness.
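To make the Picard step concrete, here is a hedged sketch for the model problem $x'=x$, $x(0)=1$, whose unique solution is $e^t$. The iterates $y_{k+1}(t)=1+\int_0^t y_k(s)\,ds$ turn out to be the Taylor partial sums of $e^t$. Polynomials are stored as coefficient lists `[c_0, c_1, ...]` meaning $c_0+c_1t+\cdots$; all names are illustrative:

```python
import math

def picard_step(poly):
    """One Picard iterate: y_{k+1}(t) = 1 + integral_0^t y_k(s) ds."""
    integral = [0.0] + [c / (i + 1) for i, c in enumerate(poly)]
    integral[0] = 1.0              # impose the initial condition x(0) = 1
    return integral

y = [1.0]                          # y_0(t) = 1
for _ in range(10):
    y = picard_step(y)

# y is now the degree-10 Taylor polynomial of e^t; compare at t = 1.
approx = sum(c * 1.0 ** i for i, c in enumerate(y))
print(abs(approx - math.e))        # small: the iterates converge to e^t
```

The contraction argument behind the theorem guarantees this convergence on any compact subinterval; the sketch only illustrates the mechanism, not the general proof.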

Theorem: [Wronskii] Let $\{ x_1,x_2,\cdots,x_n \}$ be a set of solutions of the ODE. The Wronskian $W(x_1,x_2,\cdots,x_n)$ vanishes at some point $t_0$ of the interval of interest $I$ iff $\{ x_1,x_2,\cdots,x_n \}$ is a linearly dependent set of functions on $I$.

Proof: If the set of functions $\{ x_1,x_2,\cdots,x_n \}$ is a linearly dependent set of functions on $I$, then there are constants $\alpha_1,\alpha_2,\cdots,\alpha_n$, not all zero, such that $$ \alpha_1 x_1(t)+\alpha_2 x_2(t)+\cdots +\alpha_n x_n(t) = 0,\;\;\; t\in I. $$ By differentiating this equation $n-1$ times, one obtains the matrix equation $$ \left[\begin{array}{cccc} x_1(t) & x_2(t) & \cdots & x_n(t) \\ x_1'(t) & x_2'(t) & \cdots & x_n'(t) \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{(n-1)}(t) & x_2^{(n-1)}(t) & \cdots & x_n^{(n-1)}(t) \end{array}\right] \left[\begin{array}{c} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n\end{array}\right] = 0. $$ The existence of a non-trivial solution $[\alpha_j]$ forces the determinant of the coefficient matrix--which is the Wronskian--to vanish for all $t$. Therefore the Wronskian of the solutions $x_1,x_2,\cdots,x_n$ vanishes identically if the solutions are linearly dependent.

Conversely, suppose that the Wronskian $W(x_1,x_2,\cdots,x_n)$ vanishes at some $t_0 \in I$. Then the coefficient matrix above is singular at $t_0$, so there are constants $\alpha_1,\alpha_2,\cdots,\alpha_n$, not all zero, such that the matrix equation of the previous paragraph holds at $t_0$. Hence, $$ x(t) = \alpha_1 x_1(t)+\alpha_2 x_2(t) + \cdots + \alpha_n x_n(t) $$ is a solution of your ODE which satisfies $x(t_0)=x'(t_0)=\cdots=x^{(n-1)}(t_0)=0$. By uniqueness of solutions, $x\equiv 0$ on $I$; since the $\alpha_j$ are not all zero, this proves that the set of functions $\{x_1,x_2,\cdots,x_n\}$ is a linearly dependent set of functions. $\blacksquare$
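A quick numerical check of the theorem for $x''+x=0$, whose solutions $\cos t$ and $\sin t$ are linearly independent: their Wronskian equals $\cos^2 t + \sin^2 t = 1$, nonzero at every $t$. The derivatives are hard-coded and the helper name `wronskian_2` is an illustrative choice:

```python
import math

def wronskian_2(x1, dx1, x2, dx2, t):
    """2x2 Wronskian determinant W(x1, x2; t) = x1 x2' - x2 x1'."""
    return x1(t) * dx2(t) - x2(t) * dx1(t)

# W(cos, sin; t) should be exactly 1 for every t.
for t in (0.0, 1.0, 2.5, -3.0):
    W = wronskian_2(math.cos, lambda s: -math.sin(s), math.sin, math.cos, t)
    print(W)
```

Note that the theorem is specific to solutions of the ODE: for arbitrary smooth functions, a vanishing Wronskian does not imply linear dependence.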

To see that the set of solutions is an $n$-dimensional vector space, let $S$ be the set of solutions on the interval $I$, and let $t \in I$. Then the map $$ M : S \rightarrow \mathbb{R}^{n} $$ defined by $$ Mx = \left[\begin{array}{c}x(t) \\ x'(t) \\ \vdots \\ x^{(n-1)}(t)\end{array}\right] $$ is linear. This map is injective because $Mx=0$ iff $x\equiv 0$, by uniqueness of solutions. The map is surjective because, by the existence theorem, every vector of initial values is attained by some solution. So $S$ is $n$-dimensional because $M$ is a linear isomorphism.
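For $x''+x=0$ at $t_0=0$ this isomorphism is easy to see explicitly: the basis solutions $\cos$ and $\sin$ map to the standard basis of $\mathbb{R}^2$. A minimal sketch (the function name `M` mirrors the map above; derivatives are hard-coded):

```python
import math

def M(x, dx, t0=0.0):
    """Evaluate (x(t0), x'(t0)), the n = 2 case of the map M above."""
    return (x(t0), dx(t0))

# cos maps to (1, 0) and sin maps to (0, 1), so M carries the basis
# {cos, sin} of the solution space onto the standard basis of R^2.
print(M(math.cos, lambda s: -math.sin(s)))
print(M(math.sin, math.cos))
```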
