Negation of the Definition of Linear Independence

linear-algebra, propositional-calculus

$\textbf{Definition}.$

Let $V$ be a vector space, and let $\textbf{v}_1,\dots,\textbf{v}_n \in V$. Let $\alpha_1,\dots,\alpha_n$ be scalars. Let $\textbf{0}$ be the zero element of $V$.

$\textbf{v}_1,\dots,\textbf{v}_n$ are said to be linearly independent if $$\alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \Leftrightarrow \alpha_1 = \dots = \alpha_n = 0$$
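As a concrete sanity check (not part of the definition itself), for vectors in $\mathbb{R}^m$ the definition is equivalent to saying that the matrix with $\textbf{v}_1,\dots,\textbf{v}_n$ as columns has rank $n$. A sketch with NumPy; the specific vectors below are just illustrative:

```python
import numpy as np

def linearly_independent(vectors):
    """Return True iff the given vectors are linearly independent,
    i.e. the only solution of a1*v1 + ... + an*vn = 0 is a1 = ... = an = 0."""
    A = np.column_stack(vectors)  # vectors become the columns of a matrix
    return np.linalg.matrix_rank(A) == len(vectors)

# Independent: the standard basis of R^3
print(linearly_independent([np.array([1, 0, 0]),
                            np.array([0, 1, 0]),
                            np.array([0, 0, 1])]))   # True

# Dependent: the third vector is the sum of the first two
print(linearly_independent([np.array([1, 0, 0]),
                            np.array([0, 1, 0]),
                            np.array([1, 1, 0])]))   # False
```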

Firstly, is this a valid definition of linear independence?

Secondly, how do I find the negation of this definition of linear independence? I would expect to get something like there exist scalars $\alpha_1,\dots,\alpha_n$, not all zero, such that $\alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0}$, but I am not sure how I would arrive at something like this.

Best Answer

Your definition of linear independence is valid, although we usually write only $\implies$, since the right-to-left direction holds trivially for any vectors. To say that the equation forces all the $\alpha_i$ to be zero is equivalent to saying that no choice of the $\alpha_i$, not all zero, satisfies it. The negation is therefore exactly what you expect: such a choice does exist.
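Writing the definition out with quantifiers makes the negation mechanical; a sketch of the derivation, using the standard equivalence $\neg\forall x\,(P \implies Q) \equiv \exists x\,(P \wedge \neg Q)$:

```latex
% Linear independence (trivial direction dropped):
\forall \alpha_1,\dots,\alpha_n :\;
  \alpha_1\mathbf{v}_1 + \dots + \alpha_n\mathbf{v}_n = \mathbf{0}
  \implies \alpha_1 = \dots = \alpha_n = 0

% Negate: \neg\forall(P \implies Q) \equiv \exists(P \wedge \neg Q)
\exists \alpha_1,\dots,\alpha_n :\;
  \alpha_1\mathbf{v}_1 + \dots + \alpha_n\mathbf{v}_n = \mathbf{0}
  \;\wedge\; \neg(\alpha_1 = \dots = \alpha_n = 0)
```

The final conjunct $\neg(\alpha_1 = \dots = \alpha_n = 0)$ is exactly "the $\alpha_i$ are not all zero," which recovers the expected statement of linear dependence.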
