[Math] Question concerning linear combinations of vectors and linear independence in Linear Algebra.

abstract-algebra, linear-algebra, vector-spaces

My question concerns the definition of linear combinations and a criterion for linear independence of a set (either finite or infinite).

Here are the definition and criterion as given:

A vector $v$ in a vector space $V$ is a linear combination of vectors of a set $S$ if there is a finite number of vectors $x_1, \cdots , x_n$ in $S$ and scalars $a_1, \cdots , a_n$ such that
$$v= a_1x_1 + \cdots + a_nx_n$$

Question:

Is "finite" here restricted to $1$ or more vectors, or does this definition include the possibility of a vector $v$ being a linear combination of no vectors?

Also, can these finitely many vectors in the set $S$ be the same, or must they all be distinct?

Criterion for Linear Independence:

My book writes the following fact:

A set is linearly independent if and only if the only representations of the zero vector as linear combinations of its vectors are trivial representations.

Question:

Bearing in mind that the definition of linear dependence requires a nontrivial representation of the zero vector as a linear combination of DISTINCT vectors of the set examined for linear dependence, why does this criterion for linear independence say "as linear combinations of its vectors" as opposed to "as linear combinations of its distinct vectors"?

Lastly, if a set is linearly independent, does that imply the set contains distinct vectors?

Does it make sense to talk about linear dependence and linear independence in the context of a set of vectors with some vectors repeated, that is, a set of vectors whose elements are not all distinct?

Thanks in Advance.

Best Answer

Here are answers to your questions:-

  1. Firstly, when you say scalars, $a_1, a_2, \cdots, a_n$, they are real numbers and hence can also be $0$. Keeping this in mind, suppose there is a set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$. Then, quite obviously, the vector $\textbf{0} \in V$ can be written as $$0 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_n = \textbf{0}$$ This is what we call the "trivial linear combination".

In fact, the point of your confusion seems to be this: when you say that a vector is a linear combination of other vectors, there must be at least one vector and one scalar with which you can construct your "linear combination".

  2. When you talk about a "set", elements cannot be repeated. So, there is no point in asking if the elements of the set are distinct (see the example just below).
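For instance (an example of my own, to connect this to your question about distinctness): in $\mathbb{R}^2$ the "set" $\left\lbrace (1, 0), (1, 0) \right\rbrace$ is, by definition, just $\left\lbrace (1, 0) \right\rbrace$. Repetition only becomes an issue if you work with a *list* (finite sequence) of vectors instead of a set, and a list that repeats a vector $v$ is automatically linearly dependent, since $1 \cdot v + (-1) \cdot v = \textbf{0}$ is a non-trivial combination.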

Lastly, I do not know what book you are following, but I feel that a better version of the definitions of linear dependence and independence is the following:-

Linear Independence

A finite set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$ is said to be linearly independent iff

$$\alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \textbf{0}$$

implies that $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$. This actually means that the only way you can obtain the zero vector $\textbf{0}$ from a linearly "independent" set is by setting all the scalars (coefficients) to be $0$, which we call the "trivial" combination.
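For instance (an illustrative example I am adding, not from the book quoted above), the set $\left\lbrace (1, 0), (0, 1) \right\rbrace \subseteq \mathbb{R}^2$ is linearly independent: $$\alpha_1 \cdot (1, 0) + \alpha_2 \cdot (0, 1) = (\alpha_1, \alpha_2) = (0, 0)$$ forces $\alpha_1 = \alpha_2 = 0$, so the only representation of $\textbf{0}$ is the trivial one.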

In the case of an infinite set $S \subseteq V$, it is said to be linearly independent iff every finite subset of $S$ is linearly independent. We already have the definition of linear independence for finite sets, which can be used here.
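A standard example (added here for illustration): in the space of all polynomials, the infinite set $\left\lbrace 1, x, x^2, x^3, \cdots \right\rbrace$ is linearly independent, because for any finite subset we have $$\alpha_1 x^{k_1} + \alpha_2 x^{k_2} + \cdots + \alpha_m x^{k_m} = 0 \text{ (the zero polynomial)}$$ only when $\alpha_1 = \alpha_2 = \cdots = \alpha_m = 0$.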

Linear Dependence

A finite set $S = \left\lbrace v_1, v_2, \cdots, v_n \right\rbrace \subseteq V$ is said to be linearly "dependent" iff it is not linearly independent. Thus, we need to negate the statement for linear independence. The negation of that statement is:

"$\exists \alpha_1, \alpha_2, \cdots, \alpha_n \in \mathbb{R}$ and $i \in \left\lbrace 1, 2, \cdots, n \right\rbrace$ such that $\alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \textbf{0}$ and $\alpha_i \neq 0$"

This statement means that the vector $v_i \in S$ can actually be written as a linear combination of the other vectors. In particular,

$$v_i = \left( - \dfrac{\alpha_1}{\alpha_i} \right) \cdot v_1 + \left( - \dfrac{\alpha_2}{\alpha_i} \right) \cdot v_2 + \cdots + \left( - \dfrac{\alpha_{i - 1}}{\alpha_i} \right) \cdot v_{i - 1} + \left( - \dfrac{\alpha_{i + 1}}{\alpha_i} \right) \cdot v_{i + 1} + \cdots + \left( - \dfrac{\alpha_n}{\alpha_i} \right) \cdot v_n$$

and therefore the vector $v_i \in S$ is "dependent" on the other vectors.
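As a concrete illustration (my example, not from the book): in $\mathbb{R}^2$ the set $\left\lbrace (1, 2), (2, 4) \right\rbrace$ is linearly dependent, since $$2 \cdot (1, 2) + (-1) \cdot (2, 4) = (0, 0)$$ is a non-trivial representation of $\textbf{0}$; solving for the second vector as above gives $(2, 4) = 2 \cdot (1, 2)$.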

In fact, such a linear combination $\alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \cdots + \alpha_n \cdot v_n = \textbf{0}$ with some $\alpha_i \neq 0$ is called a "non-trivial" linear combination.

For an infinite set $S \subseteq V$, it is said to be linearly dependent iff it is not linearly independent. Again, we need to negate the statement for linear independence of an infinite set. The negation of that statement would be

"There exists a finite set $A \subset S$ such that $A$ is not linearly independent". And now, we do have the definition of linear dependence (not linear independence) for finite sets which can be used.

I hope your confusion about distinct elements will be cleared by this. And if you are still confused, try forming sets which are linearly dependent and independent in $\mathbb{R}^2$ and $\mathbb{R}^3$, which you can easily visualize. Also read some material on the span of a set and how linear combinations and span connect with linear dependence and independence.
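If you want to experiment with such sets on a computer, here is a minimal sketch (my own addition, not part of the answer above) that checks linear independence numerically: vectors are linearly independent iff the matrix having them as columns has rank equal to the number of vectors. The helper name `is_linearly_independent` is just an illustrative choice.

```python
# Minimal sketch: numerical check of linear independence in R^2 / R^3.
# Note that matrix_rank uses a floating-point tolerance, so this is a
# practical check rather than an exact symbolic one.
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given list of vectors is linearly independent."""
    A = np.column_stack(vectors)                    # vectors become the columns of A
    return np.linalg.matrix_rank(A) == len(vectors) # independent iff rank == count

# Independent pair in R^2: neither is a scalar multiple of the other.
print(is_linearly_independent([np.array([1.0, 0.0]),
                               np.array([0.0, 1.0])]))   # True

# Dependent pair in R^2: (2, 4) = 2 * (1, 2).
print(is_linearly_independent([np.array([1.0, 2.0]),
                               np.array([2.0, 4.0])]))   # False

# Three vectors in R^3 lying in a common plane are dependent.
print(is_linearly_independent([np.array([1.0, 0.0, 0.0]),
                               np.array([0.0, 1.0, 0.0]),
                               np.array([1.0, 1.0, 0.0])]))  # False
```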