Linear ODEs and Maximal Solutions Forming a Linear Space – Differential Equations


Let $I$ be a non-trivial interval of $\mathbb R$, let $f : I \times \mathbb R^n \to \mathbb R^n$, and consider the following ordinary differential equation (ODE):
\begin{equation}\tag{$\mathscr E$}\label{ode}
y'(t) = f\big(t,y(t)\big)
\end{equation}

Suppose that:

  1. all the maximal solutions of \eqref{ode} are global (i.e. defined on all of $I$);
  2. and the set $S$ of such global solutions is a linear space.

Is it true that \eqref{ode} is linear? That is, does there exist $A : I \to \mathrm M_n(\mathbb R)$ (the space of real $n \times n$ matrices) such that every differentiable function $y: I \to \mathbb R^n$ is a solution of \eqref{ode} if and only if it is a solution of the linear ODE
\begin{equation}\tag{$\mathscr L$}\label{ode2}
y'(t) = A(t)\,y(t)
\end{equation}

Remark. I did not make any assumptions on $f$ or on the dimension of $S$, but if needed, we can assume that the hypotheses of the Picard–Lindelöf theorem hold for $f$ ($f$ continuous in $t$ and Lipschitz continuous in $y$, or even $f$ of class $\mathscr C^1$), and we can assume that $\dim S = n$; but perhaps conditions 1 and/or 2 already imply these assumptions.

Best Answer

This is a partial solution, assuming that $f$ is analytic on $I\times\mathbb R^n$ (so that the solutions are analytic as well), and that the space $E$ of solutions has the same dimension $n$ as the vectors $y$ and $f(t,y)$.

We represent the solutions $y$ and the right-hand side $f$ as column vectors, as usual. Let $y_1,\ldots,y_n$ be a basis of $E$. Then every $y\in E$ can be written as a linear combination
$$y=c_1y_1+\ldots+c_ny_n.\tag{1}$$
Solving this system by Cramer's rule, we obtain each $c_j$ as a ratio of two determinants built from the coordinates of $y$ and of the $y_k$. Since the coordinates of $y$ occupy a single column of the numerator, this ratio is a linear function of the coordinates of $y$, so we can write
$$c_j=a_j(t)\,y\tag{2}$$
for some row vector $a_j$ depending on $t$.

Now, since all elements of $E$ satisfy your non-linear equation, every $y\in E$ of the form (1) satisfies
$$y'=\sum_{j=1}^n c_jy_j'=\sum_{j=1}^n c_jf(t,y_j).$$
Substituting formula (2) for $c_j$, we obtain
$$y'=A(t)\,y,\quad\text{where}\quad A(t)=\sum_{j=1}^n f\big(t,y_j(t)\big)\,a_j(t)$$
(a column vector $f(t,y_j(t))$ times a row vector $a_j(t)$ gives an $n\times n$ matrix).

There is a small problem with the denominator in Cramer's rule: one has to deal with the points where it vanishes. Since all the functions $f,y_j$ are analytic, this can happen only at isolated points $t_k$. Away from these isolated points, the two equations $y'=f(t,y)$ and $y'=A(t)y$ have the same set of solutions, therefore $f(t,y)=A(t)y$ on some open set of $(t,y)$; and since all the functions involved are analytic, they must coincide everywhere.
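For concreteness, here is a small numerical sketch (not part of the answer above) of this reconstruction. Note that the $a_j(t)$ obtained from Cramer's rule are exactly the rows of $Y(t)^{-1}$, where $Y(t)$ is the matrix with columns $y_j(t)$; so $A(t)=\sum_j f(t,y_j(t))\,a_j(t)=F(t)\,Y(t)^{-1}$, where $F(t)$ has columns $f(t,y_j(t))$. The sketch picks a hypothetical $2\times 2$ system that happens to be linear, integrates a basis of solutions, and checks that the formula recovers the coefficient matrix; the specific system and tolerances are illustrative choices only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical 2x2 example: a time-dependent coefficient matrix A(t).
def A_true(t):
    return np.array([[0.0, 1.0],
                     [-1.0, np.cos(t)]])

def f(t, y):
    # Right-hand side of the ODE; here it happens to be linear in y.
    return A_true(t) @ y

# Build a basis y_1, y_2 of solutions from the initial vectors e_1, e_2.
t_eval = np.linspace(0.0, 2.0, 21)
basis = [solve_ivp(f, (0.0, 2.0), e, t_eval=t_eval, rtol=1e-10, atol=1e-12).y
         for e in (np.array([1.0, 0.0]), np.array([0.0, 1.0]))]

for k, t in enumerate(t_eval):
    Y = np.column_stack([s[:, k] for s in basis])        # columns y_j(t)
    F = np.column_stack([f(t, s[:, k]) for s in basis])  # columns f(t, y_j(t))
    A_rec = F @ np.linalg.inv(Y)                         # = sum_j f(t, y_j(t)) a_j(t)
    assert np.allclose(A_rec, A_true(t), atol=1e-6)

print("recovered A(t) matches the true coefficient matrix on the sample grid")
```

Here $Y(t)$ stays invertible on the whole interval because the example is genuinely linear; in the general setting of the answer, analyticity only guarantees that $\det Y(t)$ vanishes at isolated points, which is where the last paragraph of the argument is needed.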
