Review of Example of Linear Independence in *Introduction to Laplace Transforms and Fourier Series, Second Edition*, by Phil Dyke

linear-algebra, vector-spaces

Appendix C (Linear Spaces Review) of *Introduction to Laplace Transforms and Fourier Series, Second Edition*, by Phil Dyke, states the following:

[Excerpt from Appendix C: the set $S = \{1,\, x,\, 2 + x,\, x^2\}$, which the author states is linearly dependent.]

Based on my previous study of linear algebra, I cannot see how this set is linearly dependent, as the author suggests. After all, there are no scalars by which you could multiply the elements of $S$ to eliminate the vector $x^2$. Therefore, it seems we would need the trivial solution in order to have $y = 0$; that is, all of the scalars would have to be $0$. Am I misunderstanding something here?

Furthermore, the author's choice of constants for the $\alpha_i$ does not even seem to work. What is going on here?

I would greatly appreciate it if people could please take the time to clarify this.

Note to Self:

We're dealing with the vector space of all quadratic polynomials, so linear independence or dependence can be determined using matrices and elementary row operations, just as is done with ordinary coordinate vectors.

The first element of $S$ is the vector $1 \times 1 + 0 \times x + 0 \times x^2$, the second element of $S$ is the vector $0 \times 1 + 1 \times x + 0 \times x^2$, and so on…
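As a minimal sketch of that approach (assuming the coordinate order $(1, x, x^2)$ and that $S = \{1,\, x,\, 2+x,\, x^2\}$ as in the excerpt), write each element of $S$ as a row of a matrix and row-reduce:

$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 2 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} \xrightarrow{\;R_3 \to R_3 - 2R_1 - R_2\;} \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

A zero row appears, so the rows are linearly dependent; the row operation used encodes the relation $(2+x) - 2\cdot 1 - 1\cdot x = 0$.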

Best Answer

What the author says is that $$1\times1+\frac12\times x+\left(-\frac12\right)\times(2+x)+0\times x^2=0,$$ which is true. Furthermore, the numbers $1$, $\frac12$, $-\frac12$, and $0$ are not all equal to $0$. Therefore, by the definition of linear dependence, the set $\{1,x,2+x,x^2\}$ is linearly dependent.
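Expanding the left-hand side makes the cancellation explicit:

$$1\times 1+\tfrac12\times x-\tfrac12\times(2+x)+0\times x^2 \;=\; 1+\tfrac12 x-1-\tfrac12 x \;=\; 0.$$

Linear dependence only requires *some* nontrivial choice of scalars producing the zero vector; it does not require every vector in the set (here, $x^2$) to appear with a nonzero coefficient.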