[Math] How to show a set of continuous functions is linearly independent using a matrix

linear algebra

Let $$ S = \{\, 2\sin^{2}x,\ 3\cos^{2}x,\ \cos(2x) \,\} $$ be a subset of the set of continuous functions on $\mathbb{R}$. Is $S$ linearly independent?

Attempt: As stated, and I am ashamed to say it, I am having trouble testing linear independence in this sort of vector space. I usually solve these in the form of a matrix. For this set to be linearly *dependent*, there must exist coefficients $a_1, a_2, a_3$, not all zero, such that: $$a_1(2\sin^{2}x) + a_2(3\cos^{2}x) + a_3\cos(2x) = 0(x)$$ where $0(x)$ is the zero function.

Now the solution says that the coefficients are $a_1 = 3$, $a_2 = -2$, $a_3 = 6$. How would I obtain these by setting up the expression in matrix form?

Best Answer

Since the set of continuous functions over $\mathbb{R}$ is infinite dimensional, it doesn't really make sense to try to create a matrix to determine linear dependence. However, here is a heuristic approach you can try: since equality as functions means equality over every point of the domain, you can try to create a matrix of values of your functions and see if the columns are linearly independent. If the columns are linearly independent, then there is no nontrivial linear combination of your functions that yields the zero function, and your functions are linearly independent. If the columns are not linearly independent, it's not guaranteed that your functions are linearly dependent, although it does put some restrictions on what linear combinations of your functions could yield the zero function.

For example, let's consider your functions $f_1(x) = 2\sin^2 x$, $f_2(x) = 3\cos^2 x$, and $f_3(x) = \cos(2x)$. Let's evaluate them at $x_1 = 0$, $x_2 = \pi/4$, and $x_3 = \pi/2$ (for ease of computation). Let $A = (a_{ij})$ be the matrix where $a_{ij} = f_j(x_i)$. Then $$ A = \left(\begin{matrix} 0 & 3 & 1 \\ 1 & \frac{3}{2} & 0 \\ 2 & 0 & -1\end{matrix}\right) $$ Now, if your functions $\{f_1,f_2,f_3\}$ were linearly dependent, then there would exist $a_1,a_2,a_3\in\mathbb{R}$, not all zero, such that $a_1f_1(x)+a_2f_2(x)+a_3f_3(x)=0$ for all $x$. In particular, it would be true for our choices of $x_i$, which leads to the equation $$ \left(\begin{matrix} 0 & 3 & 1 \\ 1 & \frac{3}{2} & 0 \\ 2 & 0 & -1\end{matrix}\right)\left(\begin{matrix} a_1 \\ a_2 \\ a_3 \end{matrix}\right) = 0 $$ Now, if it happened that $\det A\ne 0$, then the above equation would have no nontrivial solutions, and we could conclude that your set was linearly independent. However, you can check that indeed $\det A = 0$, and that the null space of $A$ is a one-dimensional space spanned by $\left(\begin{matrix} 3 \\ -2 \\ 6 \end{matrix}\right)$.
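If you want to check the determinant and null-space computation by machine, here is a minimal sketch assuming SymPy is available (exact arithmetic avoids the rounding issues a floating-point determinant would have):

```python
# Build the matrix A of function values at x = 0, pi/4, pi/2
# (rows = sample points, columns = f1, f2, f3) and inspect it.
from sympy import Matrix, Rational

A = Matrix([
    [0, 3,              1],   # f1, f2, f3 at x = 0
    [1, Rational(3, 2), 0],   # f1, f2, f3 at x = pi/4
    [2, 0,             -1],   # f1, f2, f3 at x = pi/2
])

print(A.det())        # 0: the columns are linearly dependent
print(A.nullspace())  # a single basis vector, proportional to (3, -2, 6)
```

SymPy normalizes the null-space basis vector (its last entry is 1), so scale it by 6 to recover $(3, -2, 6)$.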

So that tells us that if $a_1f_1(x)+a_2f_2(x)+a_3f_3(x) = 0$ for all $x$, then $\left(\begin{matrix} a_1 \\ a_2 \\ a_3 \end{matrix}\right)$ would have to be a multiple of $\left(\begin{matrix} 3 \\ -2 \\ 6 \end{matrix}\right)$. It doesn't automatically tell us that such values will definitely work, since your domain has more than three points. You will have to verify that separately, as the other answers have done. But it does give you a way to find the linear combinations that work.

tl;dr: You can plug in the values of functions into a matrix to test for linear independence. If the matrix is invertible, you know the set is linearly independent. If not, you know which linear combinations are possible, and you can just test those combinations directly.


EDIT: This is a response to @dc3rd's question:

Keep in mind that ultimately you want to determine the linear independence of a subset whose elements are functions. The space of functions isn't like Euclidean space--it's more abstract, and the "vectors" aren't tuples of real numbers. The linear independence of the subset of functions doesn't depend on the "test point(s)" chosen--choosing different test points can't possibly affect it. What you could do is take a number, plug it into your set of functions, and see if the resulting set of numbers is linearly independent, but that's kind of silly: $\mathbb{R}$ is one-dimensional, so having more than one function automatically results in linear dependence. You could also take a (finite) set of numbers and, for each function, create a vector whose entries are the function evaluated at your set of numbers, then see if the resulting set of vectors is linearly independent. That's less silly, and it's what I'm advocating in this answer. But hopefully you see that doing this is not the same thing as evaluating the linear independence of your functions, although it helps.

Sorry for the long spiel, but to answer your questions:

1) If, for a given $x$ (actually a set of $x$'s; hopefully you see why), you find that all coefficients work, then all that means is that you haven't found useful information to conclude either linear independence or dependence. (BTW, the only way that happens is if all of your entries end up being zero.)

2) The linear independence of your set of functions doesn't depend on the points you choose to test. However, the linear independence of the vectors (of numbers) that you create by evaluating your functions at your chosen points may indeed depend on those points. For example, consider the set $\{\sin x, \cos x, \sin x\cos x\}$. See what happens when you test them at $\{0,\pi/4,\pi/2\}$ and what happens at $\{0,2\pi,4\pi\}$.
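A quick numeric sketch of that last example, assuming NumPy: the same three functions give a full-rank value matrix at one set of sample points and a rank-1 matrix at another, even though the functions themselves are linearly independent.

```python
# Compare the rank of the value matrix for {sin x, cos x, sin x * cos x}
# at two different sets of sample points.
import numpy as np

funcs = [np.sin, np.cos, lambda t: np.sin(t) * np.cos(t)]

def value_matrix(points):
    # rows = sample points, columns = functions, as in the answer above
    return np.array([[f(t) for f in funcs] for t in points])

good = value_matrix([0, np.pi / 4, np.pi / 2])
bad  = value_matrix([0, 2 * np.pi, 4 * np.pi])

print(np.linalg.matrix_rank(good))  # 3: columns independent at these points
print(np.linalg.matrix_rank(bad))   # 1: sin and sin*cos vanish at multiples of pi
```

At multiples of $2\pi$ every row collapses to $(0, 1, 0)$, so the numeric test says nothing there; the first point set happens to separate the functions.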