Negation of a Statement when Logical Equivalences are Involved

linear-algebra, logic, propositional-calculus

How do we negate a statement/definition when logical equivalences are involved? For example, how do we negate

$\textbf{Definition}.$

Let $V$ be a vector space, and let $\textbf{v}_1,\dots,\textbf{v}_n \in V$. Let $\alpha_1,\dots,\alpha_n$ be scalars. Let $\textbf{0}$ be the zero element of $V$.

$\textbf{v}_1,\dots,\textbf{v}_n$ are said to be linearly independent if $$\alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \Leftrightarrow \alpha_1,\dots,\alpha_n = 0$$

Using this definition, how would one go about defining linear dependence by negating it?

The difficulty for me arises since $a \leftrightarrow b$ is not the same as $a \Leftrightarrow b$.

$a \Leftrightarrow b$ means that $a \leftrightarrow b$ is a tautology, i.e., $a$ and $b$ have the same truth value under every assignment, so the bi-implication is always true.
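The tautology reading can be checked mechanically by enumerating truth assignments. A minimal sketch (the helper name `is_tautology` is my own, purely illustrative):

```python
from itertools import product

def is_tautology(f, nvars):
    # True iff the formula f evaluates to True under every truth assignment
    return all(f(*vals) for vals in product([False, True], repeat=nvars))

# a <-> b, with a and b free: contingent, so not a tautology
print(is_tautology(lambda a, b: a == b, 2))                  # False
# (a and b) <-> (b and a): a tautology, so one may write (a ∧ b) ⇔ (b ∧ a)
print(is_tautology(lambda a, b: (a and b) == (b and a), 2))  # True
```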

Best Answer

First off, be careful with assuming a precise difference between $\Leftrightarrow$ and $\leftrightarrow$. While some authors will maintain a particular technical distinction between them, many will not -- and those who do distinguish may do so in a different way than you expect.

Second, the definition you quote is badly written. In particular, the "Let $\alpha_1,\ldots,\alpha_n$ be scalars" at the beginning should not be there -- stating it there gives the impression that you need to choose particular $\alpha_1,\ldots,\alpha_n$ before you can say whether the vectors are independent (and that the answer could depend on which scalars you choose).

Of course what is meant is really:

Let $V$ be a vector space, and let $\textbf{v}_1,\dots,\textbf{v}_n \in V$. Let $\textbf{0}$ be the zero element of $V$.

$\textbf{v}_1,\dots,\textbf{v}_n$ are said to be linearly independent if $$\forall \alpha_1,\ldots,\alpha_n: \bigl[ \alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \Leftrightarrow \alpha_1 = \dots = \alpha_n = 0 \bigr]$$

The $\alpha_i$s are quantified inside the defining condition for "independent". This may be what you are implicitly reading into the use of $\Leftrightarrow$ rather than $\leftrightarrow$, but making the quantification explicit is important when negating the definition.

The negated property would be $$ \neg\forall \alpha_1,\ldots,\alpha_n: \bigl[ \alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \Leftrightarrow \alpha_1 = \dots = \alpha_n = 0 \bigr]$$ which is the same as $$ \exists \alpha_1,\ldots,\alpha_n: \bigl[ \alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \not\Leftrightarrow \alpha_1 = \dots = \alpha_n = 0 \bigr]$$

So we should be looking for a choice of scalars such that the two sides have different truth values. Since we are in a vector space, if the scalars are all zero, then the combination on the left is the zero vector as well, so the only way for the truth values to differ is for the vector sum to be zero while the scalars are not all zero.
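This collapse of the failed biconditional can be verified by brute force: write $P$ for "the combination equals $\textbf{0}$" and $Q$ for "all scalars are $0$". The vector space axioms rule out the row where $Q$ holds but $P$ fails, and over the remaining rows $\neg(P \leftrightarrow Q)$ agrees with $P \wedge \neg Q$. A sketch under those assumptions:

```python
from itertools import product

# P: "alpha_1 v_1 + ... + alpha_n v_n = 0"; Q: "alpha_1 = ... = alpha_n = 0".
# The vector space axioms force Q => P, so the row (P false, Q true) never occurs.
for p, q in product([False, True], repeat=2):
    if q and not p:
        continue  # ruled out: all-zero scalars always give the zero vector
    # On every remaining row, not(P <-> Q) equals (P and not Q)
    assert (p != q) == (p and not q)
print("not(P <-> Q) reduces to P and not Q")
```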

So in the presence of the vector space axioms, the negated condition is equivalent to $$ \exists \alpha_1,\ldots,\alpha_n: \bigl[ \alpha_1 \textbf{v}_1 + \dots + \alpha_n \textbf{v}_n = \textbf{0} \land (\alpha_1\ne 0\lor \cdots\lor \alpha_n \ne 0) \bigr]$$
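For a concrete instance of the final condition (the vectors and scalars here are my own illustrative choices, not from the question): in $\mathbb{R}^2$, take $\textbf{v}_1 = (1,0)$ and $\textbf{v}_2 = (2,0)$; then $\alpha_1 = 2$, $\alpha_2 = -1$ is a witness, so the vectors are linearly dependent:

```python
# Illustrative check in R^2 (example values chosen for this sketch)
v1, v2 = (1, 0), (2, 0)
a1, a2 = 2, -1

# Compute a1*v1 + a2*v2 componentwise
combo = tuple(a1 * x + a2 * y for x, y in zip(v1, v2))

print(combo == (0, 0))     # the combination is the zero vector: True
print((a1, a2) != (0, 0))  # ...while the scalars are not all zero: True
```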