I'm given a homogeneous set of differential equations: $$x^\prime = ax + by$$ $$y^\prime = cx + dy$$ I already know that two solutions of this system are $x_1, y_1$ and $x_2, y_2$, and I am asked to prove that $x_h = c_1 x_1 + c_2 x_2, y_h = c_1 y_1 + c_2 y_2$ is also a solution. Obviously we can prove this in a general case, but I was curious if we could solve this by direct substitution. I've already tried a few approaches but I'm just left with a jumbled mess of equations and a few relations that don't prove very useful. Could someone give me a hint as to how to approach this (this is in the first section of the chapter, so the solution to this problem does not use elimination, any matrix approach, etc.)?
[Math] Proving the linear combination of two solutions of a system of differential equations is also a solution
ordinary differential equations
Related Solutions
The space of solutions of a non-homogeneous linear differential equation is an affine space whose direction is the vector space of solutions of the associated homogeneous equation.
Indeed, if $y_1$ and $y_2$ are two solutions of the inhomogeneous equation $a(x)y''(x)+b(x)y'(x)+c(x)y(x)=f(x)$, then $y_1-y_2$ is a solution of the associated homogeneous equation, since $$a(x)(y_1-y_2)''+b(x)(y_1-y_2)'+c(x)(y_1-y_2)=f(x)-f(x)=0.$$ Conversely, if $y_2(x)$ is a solution of the inhomogeneous equation and $z(x)$ a solution of the associated homogeneous equation, it is easy to check that $y_1(x)=y_2(x)+z(x)$ is also a solution of the inhomogeneous equation.
Thus, to completely solve a linear inhomogeneous differential equation you have to: 1) completely solve the associated linear homogeneous equation; 2) find one particular solution of the inhomogeneous equation; 3) add every solution from 1) to the particular solution from 2).
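The three steps above can be checked symbolically. A minimal sketch with sympy, using the hypothetical example $y''+y=x$ (not an equation from the question): the homogeneous solutions are $C_1\cos x + C_2\sin x$, one particular solution is $y_p=x$, and their sum still satisfies the inhomogeneous equation.

```python
import sympy as sp

x = sp.symbols('x')
C1, C2 = sp.symbols('C1 C2')

# The linear operator on the left-hand side of y'' + y = x
L = lambda y: sp.diff(y, x, 2) + y

y_p = x                               # step 2): one particular solution, L(y_p) = x
z = C1*sp.cos(x) + C2*sp.sin(x)       # step 1): general homogeneous solution, L(z) = 0

# Step 3): y_p + z is the general solution of the inhomogeneous equation
assert sp.simplify(L(y_p + z) - x) == 0
```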
The solution to the vector differential equation $\mathbf y'(x)=A\mathbf y(x)$ is, not surprisingly, $e^{xA}\mathbf C$, where $\mathbf C$ is a vector of constants determined by boundary conditions. The exponential of a matrix is defined via a power series, but in practice one doesn’t use that to compute it.
If $A$ is diagonalizable, it can be decomposed as $B\Lambda B^{-1}$, where $B=\begin{bmatrix}\mathbf b_1,\cdots,\mathbf b_n\end{bmatrix}$ with eigenvectors of $A$ as its columns and $\Lambda=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$ with $\lambda_j$ the eigenvalue corresponding to $\mathbf b_j$. Just as $A^k=B\Lambda^kB^{-1}=B\operatorname{diag}(\lambda_1^k,\dots,\lambda_n^k)B^{-1}$, so too $e^{xA}=Be^{x\Lambda}B^{-1}=B\operatorname{diag}(e^{\lambda_1x},\dots,e^{\lambda_nx})B^{-1}$. So, the solution to the equation is $$e^{xA}\mathbf C=Be^{x\Lambda}B^{-1}\mathbf C.$$ Since $B$ has full rank, $B^{-1}\mathbf C$ is also a vector of arbitrary constants, so we can expand the above expression as $$C_1e^{\lambda_1x}\mathbf b_1+\cdots+C_ne^{\lambda_nx}\mathbf b_n.\tag1$$
In this problem, $$A=\begin{bmatrix}-2&1&-2\\1&-2&2\\3&-3&5\end{bmatrix}.$$ Its eigenvalues can be found to be $3$, $-1$ and $-1$. We compute the kernel of $3I-A$ via row-reduction: $$\begin{bmatrix}5&-1&2\\-1&5&-2\\-3&3&-2\end{bmatrix}\to\begin{bmatrix}1&0&\frac13\\0&1&-\frac13\\0&0&0\end{bmatrix}$$ so an eigenvector of $3$ is $(-1,1,3)^T$. Moving to $A+I$, $$\begin{bmatrix}-1&1&-2\\1&-1&2\\3&-3&6\end{bmatrix}\to\begin{bmatrix}1&-1&2\\0&0&0\\0&0&0\end{bmatrix}$$ from which we get $(1,1,0)^T$ and $(-2,0,1)^T$ as linearly independent eigenvectors. Plugging these values into (1), the solution to the equation is therefore $$C_1e^{3x}\begin{bmatrix}-1\\1\\3\end{bmatrix}+C_2e^{-x}\begin{bmatrix}1\\1\\0\end{bmatrix}+C_3e^{-x}\begin{bmatrix}-2\\0\\1\end{bmatrix}=\begin{bmatrix}-C_1e^{3x}+(C_2-2C_3)e^{-x}\\C_1e^{3x}+C_2e^{-x}\\3C_1e^{3x}+C_3e^{-x}\end{bmatrix}.$$
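The eigenvalues and eigenvectors computed above by hand can be verified numerically; a short check with numpy (the matrix and the claimed eigenpairs are exactly those from the text):

```python
import numpy as np

A = np.array([[-2, 1, -2],
              [ 1, -2, 2],
              [ 3, -3, 5]], dtype=float)

# Eigenvalues should be 3, -1, -1 as claimed
lam = np.sort(np.linalg.eigvals(A).real)
assert np.allclose(lam, [-1, -1, 3])

# Each stated eigenvector v satisfies A v = lambda v
pairs = [((-1, 1, 3), 3), ((1, 1, 0), -1), ((-2, 0, 1), -1)]
for v, l in pairs:
    v = np.array(v, dtype=float)
    assert np.allclose(A @ v, l * v)
```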
Best Answer
Differentiate $x_h$ and $y_h$ directly and use the fact that each pair $(x_i,y_i)$ solves the system, i.e. $x_i'=ax_i+by_i$ and $y_i'=cx_i+dy_i$ for $i=1,2$: $$x_h'=(c_1x_1+c_2x_2)'=c_1x_1'+c_2x_2'=c_1(ax_1+by_1)+c_2(ax_2+by_2)$$ $$=a(c_1x_1+c_2x_2)+b(c_1y_1+c_2y_2)=ax_h+by_h$$ $$y_h'=(c_1y_1+c_2y_2)'=c_1y_1'+c_2y_2'=c_1(cx_1+dy_1)+c_2(cx_2+dy_2)$$ $$=c(c_1x_1+c_2x_2)+d(c_1y_1+c_2y_2)=cx_h+dy_h$$ Both equations of the system are satisfied for arbitrary values of $c_1$ and $c_2$, so $(x_h,y_h)$ is a solution.
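The substitution can also be carried out symbolically. A sketch with sympy: treat $x_1,y_1,x_2,y_2$ as unknown functions, impose only the assumption that each pair solves the system, and check that both residuals for $(x_h,y_h)$ vanish identically:

```python
import sympy as sp

t = sp.symbols('t')
a, b, c, d, c1, c2 = sp.symbols('a b c d c1 c2')
x1, y1, x2, y2 = (sp.Function(n)(t) for n in ('x1', 'y1', 'x2', 'y2'))

# Assumption: (x1, y1) and (x2, y2) each solve x' = ax + by, y' = cx + dy
solves = {x1.diff(t): a*x1 + b*y1, y1.diff(t): c*x1 + d*y1,
          x2.diff(t): a*x2 + b*y2, y2.diff(t): c*x2 + d*y2}

xh = c1*x1 + c2*x2
yh = c1*y1 + c2*y2

# Both residuals reduce to zero for arbitrary symbolic c1, c2
res_x = sp.expand(xh.diff(t).subs(solves) - (a*xh + b*yh))
res_y = sp.expand(yh.diff(t).subs(solves) - (c*xh + d*yh))
assert res_x == 0 and res_y == 0
```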