Prove linear combination solves homogeneous equation

Tags: homogeneous equation, linear algebra, nomography, ordinary differential equations, wronskian

  • I have a third-order homogeneous linear differential equation:
    $$A_3(u) f^{\prime\prime\prime} + A_2(u) f^{\prime\prime} + A_1(u) f^\prime + A_0(u) f = 0,$$

    with three linearly-independent solutions $\phi_1, \phi_2, \phi_3$.

  • I have a function $F$ which is a linear combination of the $\phi_i$ and hence solves the differential equation. It happens that $F$ is also a linear combination $F = c_1 g_1(u) + c_2 g_2(u) + c_3 g_3(u) $ of an unrelated set of three functions, which are themselves linearly-independent.

  • I am trying to establish that each $g_i$ must itself solve the differential equation (or under what conditions this holds).

So far, I've tried using matrix notation, writing the Wronskian matrices:

$$\Phi \equiv \begin{bmatrix}\phi_1 & \phi_2 & \phi_3\\ \phi_1^\prime & \phi_2^\prime & \phi_3^\prime \\ \phi_1^{\prime\prime} & \phi_2^{\prime\prime} & \phi_3^{\prime\prime}\end{bmatrix}$$

$$G \equiv \begin{bmatrix}g_1 & g_2 & g_3\\ g_1^\prime & g_2^\prime & g_3^\prime \\ g_1^{\prime\prime} & g_2^{\prime\prime} & g_3^{\prime\prime}\end{bmatrix}$$

Since the $\phi_i$ are linearly-independent solutions, $\Phi$ is nonsingular/invertible; I will assume the same holds for the $g_i$ and $G$. So we can write

$$G = (\Phi\Phi^{-1})G = \Phi(\Phi^{-1}G)$$

Then the columns of $(\Phi^{-1}G)$ should be coefficients which show that each $g_i$ is a linear combination of $\phi_i$ and hence a solution to the differential equation.

My problem is that I don't see why the entries of $\Phi^{-1}G$ should be constants, as I'm hoping, rather than functions of $u$ in their own right.
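To see the concern concretely, here is a minimal SymPy sketch with hypothetical toy functions (not my actual $\phi_i$ and $g_i$): take the equation $y''' = 0$, whose independent solutions $1, u, u^2$ play the role of the $\phi_i$, and an unrelated independent set $e^u, e^{2u}, e^{3u}$ for the $g_i$. The entries of $\Phi^{-1}G$ then really do depend on $u$.

```python
# Minimal SymPy sketch with toy functions (not the actual phi_i and g_i):
# phi_i = 1, u, u^2 solve y''' = 0; g_i = e^u, e^{2u}, e^{3u} are unrelated to that equation.
import sympy as sp

u = sp.symbols('u')
phi = [sp.Integer(1), u, u**2]                 # independent solutions of y''' = 0
g = [sp.exp(u), sp.exp(2*u), sp.exp(3*u)]      # unrelated independent functions

def wronskian_matrix(fs):
    return sp.Matrix([[sp.diff(f, u, k) for f in fs] for k in range(3)])

Phi, G = wronskian_matrix(phi), wronskian_matrix(g)
C = sp.simplify(Phi.inv() * G)

print(C)                                             # entries are functions of u
print(all(sp.diff(entry, u) == 0 for entry in C))    # False: Phi^{-1} G is not a constant matrix
```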


If it helps, my third-order homogeneous linear differential equation is not arbitrary; it comes from the definition of $F$ as the linear combination $F = c_1 g_1 + c_2 g_2 + c_3 g_3$. This equation shows that $F$ and the $g_i$ are linearly dependent. Differentiating it up to three times shows that $F^\prime$ depends linearly, with the same coefficients, on the $g_i^\prime$, and likewise for the higher derivatives.

Hence this Wronskian vanishes because its columns are linearly-dependent:
$$\det\begin{bmatrix}F & g_1 & g_2 & g_3 \\ F^{\prime} & g_1^{\prime} & g_2^{\prime} & g_3^{\prime} \\ F^{\prime\prime} & g_1^{\prime\prime} & g_2^{\prime\prime} & g_3^{\prime\prime} \\ F^{\prime\prime\prime} & g_1^{\prime\prime\prime} & g_2^{\prime\prime\prime} & g_3^{\prime\prime\prime}\end{bmatrix} = 0$$

Expanding the determinant along the first column yields a third-order homogeneous linear differential equation satisfied by $F$. (The coefficient on $F^{\prime\prime\prime}$ is, up to sign, the Wronskian of the $g_i$, which I take to be nonzero since the three $g_i$ are linearly-independent.) The $\phi_i$ are then three linearly-independent solutions to this equation.
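As a sanity check, here is a small SymPy sketch of that construction with a hypothetical independent set $g_1 = 1$, $g_2 = u$, $g_3 = e^u$ (chosen so their Wronskian, $e^u$, never vanishes): it builds the $4\times 4$ determinant, reads off the coefficient of $y'''$, and confirms that $F = c_1 g_1 + c_2 g_2 + c_3 g_3$ satisfies the resulting equation.

```python
# SymPy sketch with toy g_i (not my actual functions): build the determinant equation
# and check that F = c1*g1 + c2*g2 + c3*g3 satisfies it.
import sympy as sp

u, c1, c2, c3 = sp.symbols('u c1 c2 c3')
g = [sp.Integer(1), u, sp.exp(u)]              # Wronskian is e^u, never zero
F = c1*g[0] + c2*g[1] + c3*g[2]

def column(f):
    return [sp.diff(f, u, k) for k in range(4)]

y = sp.Function('y')(u)
M = sp.Matrix([column(y)] + [column(gi) for gi in g]).T   # columns: y, g1, g2, g3 with derivatives
ode = sp.expand(M.det())

print(sp.simplify(ode.coeff(sp.diff(y, u, 3))))   # +/- the Wronskian of the g_i, nonzero
print(sp.simplify(ode.subs(y, F).doit()))         # 0: F solves the equation
```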

I can imagine you might plug each solution $\phi_i$ in for $F$ in this determinant equation to obtain three equations, each of which proves that $\phi_i$ is a linear combination of the $g_j$. So you have $\Phi = G A$, where $A$ is a matrix of only constants. But in this case, I am not sure why $A$ is invertible so that we can obtain $G = \Phi A^{-1}$.

Is it enough to say that $\det{\Phi} \neq 0$, and $\det{\Phi} = \det{G}\det{A}$ so $\det{A} \neq 0$? Can you find the coefficients?
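For what it's worth, here is a quick SymPy check of the determinant algebra only, with a hypothetical constant matrix $A$ and the same toy $g_i$ as above: if $\Phi = GA$, then $\det\Phi = \det G\,\det A$ and $G^{-1}\Phi$ recovers the constant matrix $A$. This is just an illustration of that algebraic step; it does not by itself settle why $A$ must be invertible.

```python
# SymPy check of the determinant algebra, with a hypothetical constant matrix A
# and the same toy g_i: if Phi = G*A, then det(Phi) = det(G)*det(A), and G^{-1}*Phi = A.
import sympy as sp

u = sp.symbols('u')
g = [sp.Integer(1), u, sp.exp(u)]
A = sp.Matrix([[1, 0, 2], [1, 1, 0], [0, 3, 1]])   # any invertible constant matrix

def wronskian_matrix(fs):
    return sp.Matrix([[sp.diff(f, u, k) for f in fs] for k in range(3)])

G = wronskian_matrix(g)
Phi = G * A            # column j of Phi is the combination sum_i A[i, j] * g_i (and its derivatives)

print(sp.simplify(Phi.det() - G.det() * A.det()))   # 0: det Phi = det G * det A
print(sp.simplify(G.inv() * Phi))                   # recovers A, a constant matrix
```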

Best Answer

I figured it out. The differential equation is:

$$\det\begin{bmatrix}y & g_1 & g_2 & g_3 \\ y^{\prime} & g_1^{\prime} & g_2^{\prime} & g_3^{\prime} \\ y^{\prime\prime} & g_1^{\prime\prime} & g_2^{\prime\prime} & g_3^{\prime\prime} \\ y^{\prime\prime\prime} & g_1^{\prime\prime\prime} & g_2^{\prime\prime\prime} & g_3^{\prime\prime\prime}\end{bmatrix} = 0$$ where $y$ represents the unknown.

$F$ satisfies this equation by assumption. But if we replace $y$ with any $g_i$ in the matrix, we get a matrix with two matching columns. Hence the columns are linearly dependent, so the determinant vanishes, which is equivalent to saying that $g_i$ solves the equation.
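For concreteness, this is easy to verify in SymPy with the same hypothetical toy $g_i$ used earlier: substituting each $g_i$ for $y$ duplicates a column, and the determinant simplifies to zero.

```python
# SymPy sketch with toy g_i: putting any g_i in the first column duplicates a column
# of the matrix, so the determinant is identically zero.
import sympy as sp

u = sp.symbols('u')
g = [sp.Integer(1), u, sp.exp(u)]

def column(f):
    return [sp.diff(f, u, k) for k in range(4)]

for gi in g:
    M = sp.Matrix([column(gi)] + [column(gj) for gj in g]).T   # first column equals one of the g-columns
    print(sp.simplify(M.det()))                                # 0: g_i solves the equation
```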

You can find the coefficients by choosing three points $x_i$ such that the vectors $[g_1(x_i), g_2(x_i), g_3(x_i)]$, $i = 1, 2, 3$, are linearly independent. Then solve the matrix equation $[g_j(x_i)]_{i,j} \cdot [c_j] = [F(x_i)]_i$ for its unique solution. The solution exists and is unique because the points were chosen so that the matrix $[g_j(x_i)]_{i,j}$ is invertible, and such points exist precisely because the $g_i$ are linearly independent.
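A short numerical sketch of that recipe (NumPy, again with hypothetical $g_i$ and made-up coefficients): sample the $g_j$ at three points where the $3\times 3$ matrix is invertible, then solve the linear system for the $c_j$.

```python
# NumPy sketch with toy g_i and made-up coefficients: sample at three points where
# the matrix [g_j(x_i)] is invertible, then solve the 3x3 linear system for the c_j.
import numpy as np

g = [lambda x: np.ones_like(x), lambda x: x, lambda x: np.exp(x)]
c_true = np.array([2.0, -1.0, 0.5])                        # coefficients to be recovered
F = lambda x: c_true[0]*g[0](x) + c_true[1]*g[1](x) + c_true[2]*g[2](x)

x = np.array([0.0, 1.0, 2.0])                              # three sample points
M = np.column_stack([gj(x) for gj in g])                   # M[i, j] = g_j(x_i)
assert np.linalg.matrix_rank(M) == 3                       # sample vectors are independent here
c = np.linalg.solve(M, F(x))                               # unique solution since M is invertible
print(c)                                                   # [ 2.  -1.   0.5]
```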
