[Math] Linear Independence

linear algebra

I've come across a question in linear algebra that I can't quite figure out. I've tried a multitude of things that either don't work or aren't sufficient to convince me that I understand linear independence well enough.

I know that a set of vectors $S$ in a vector space $V$ is linearly independent if the equation

$\lambda_1 \mathbf{v}_1 + … + \lambda_n \mathbf{v}_n = \mathbf{0}$

implies that all the scalars are zero,

$\lambda_1 = … = \lambda_n = 0.$

I can also show that a set of vectors $S$ is linearly independent when I'm given vectors with numerical values – by forming a matrix and reducing it to row echelon form. However, my understanding isn't strong enough to extend this and answer questions such as the following:

Assume the vectors u, v and w are linearly independent elements of a vector space V.
For each of the following sets decide whether it is linearly independent.

A. {u + v + w, v – 2w, 2u + 3w}

B. {u + 2w, v + 2w, 2w}

C. {x, y, z}

where,

x = u + 2v – w,

y = 2x + u + 2v – w,

z = 3x – 2y.

If anyone can explain the connection between this type of question and the definition of linear independence – either by answering A or by outlining how to answer A – then hopefully I can tackle B and C and any related questions. Thanks.
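As a footnote to the row-echelon technique I mentioned: here is how that check might look in Python with sympy (the vectors below are my own made-up example, not from the question):

```python
from sympy import Matrix

# Made-up example vectors, just to illustrate the row-reduction test.
v1, v2, v3 = [1, 0, 0], [1, 1, 0], [1, 1, 1]

# Put the vectors in as the columns of a matrix; a pivot in every
# column of the reduced row echelon form means the columns are
# linearly independent.
M = Matrix([v1, v2, v3]).T
_, pivots = M.rref()
print(len(pivots) == M.cols)  # True -> linearly independent
```

If some column ended up without a pivot, that column's vector would be a linear combination of the earlier ones, i.e. the set would be dependent.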

Best Answer

You basically need to write out the definition of linear independence. Suppose that $$ \alpha(u+v+w)+\beta(v-2w)+\gamma(2u+3w)=0. $$ We can rewrite this as $$ (\alpha+2\gamma)\,u+(\alpha+\beta)\,v+(\alpha-2\beta+3\gamma)\,w=0. $$ Since $u,v,w$ are linearly independent, we get the equalities $$ \alpha+2\gamma=0,\ \ \alpha+\beta=0,\ \ \alpha-2\beta+3\gamma=0. $$ Now you can analyze this system. If the only solution is $\alpha=\beta=\gamma=0$, you'll know that the three vectors in A are linearly independent. If you produce a nonzero solution, you'll know that they are linearly dependent.
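If you want to check the final step by machine, the homogeneous system above can be handed to a computer algebra system; here is a minimal sketch with Python's sympy (my choice of tool, not part of the original argument):

```python
from sympy import Matrix

# Coefficient matrix of the system (rows = equations,
# columns = alpha, beta, gamma):
#   alpha           + 2*gamma = 0
#   alpha +  beta             = 0
#   alpha - 2*beta  + 3*gamma = 0
M = Matrix([[1, 0, 2],
            [1, 1, 0],
            [1, -2, 3]])

# A nonzero determinant (equivalently, an empty nullspace) means the
# only solution is alpha = beta = gamma = 0, so the set in A is
# linearly independent.
print(M.det())        # nonzero
print(M.nullspace())  # [] -> only the trivial solution
```

Of course, for a 3×3 system like this, solving by hand (substitute $\beta=-\alpha$ and $\gamma=-\alpha/2$ into the third equation) is just as quick.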
