[Math] Efficient way of checking linear independence

linear algebra

Suppose I have a $4 \times 4$ matrix $A$ whose columns represent vectors $v_1,v_2,v_3,v_4$ in $\mathbb{R}^4$. Given that $\det A = 0$ (i.e. the vectors are linearly dependent), I want to make sure that any three of the given four vectors are linearly independent. What is the most efficient way of doing this? I can only think of checking the linear independence of each set of three vectors out of all possible combinations, but I feel there must be an easier way.
Related Solutions
It is true that two vectors are dependent if they "point in the same (or opposite) direction", i.e. if they are aligned.
But that criterion is not the whole story for three vectors in $3$D or more.
When the three vectors are aligned, i.e. parallel, i.e. scalar multiples of each other, they are certainly dependent.
But the definition of linear dependence for three vectors is wider than being parallel: it also includes the case in which they are co-planar, although not parallel.
If you want to see that geometrically, take the three vectors as position vectors from the origin: if they define a full $3$D parallelepiped, they are independent; if instead the parallelepiped collapses into a flat figure or a segment, the vectors are dependent.
Algebraically, this translates into whether the matrix formed by the three vectors has full rank ($3$) or not.
The same holds for $n$ vectors in $m$ dimensions.
Then, from the theory of linear systems, you know that a homogeneous system whose matrix has full rank has only the trivial solution $(0,0,\cdots,0)$, which corresponds to all the combination coefficients being zero.
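As a concrete illustration of the rank criterion, here is a minimal sketch assuming NumPy is available (the three vectors are arbitrary examples, not from the question):

```python
import numpy as np

# Three example vectors in R^3, stacked as columns of a 3x3 matrix.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])
M = np.column_stack([v1, v2, v3])

# Full rank (3) means the homogeneous system M x = 0 has only the
# trivial solution, i.e. the vectors are linearly independent.
print(np.linalg.matrix_rank(M))  # prints 3 -> independent
```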
In reply to your comment: in ${\mathbb R}^2$, if you have two non-aligned (hence independent) vectors, then any third vector will lie in their same plane (the $x,y$ plane).
In the geometric interpretation, the parallelepiped (the hull) will be flat, i.e. of dimension $2$, which is less than $3$, the number of vectors.
In the algebraic interpretation, a $3 \times 2$ matrix cannot have rank greater than two: so $3$ (or more) $2$D vectors are necessarily dependent.
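To see that rank bound in code (a sketch assuming NumPy; the three $2$D vectors here are made up):

```python
import numpy as np

# Three vectors in R^2 as rows of a 3x2 matrix (arbitrary values).
V = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [4.0, 3.0]])

# The rank of a 3x2 matrix is at most 2, so three 2D vectors
# can never be independent.
print(np.linalg.matrix_rank(V))  # prints 2
```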
Final note (to clarify what might be the source of your confusion):
The (in)dependence of $n$ vectors in ${\mathbb R}^m$ is defined for the whole set of $n$ vectors: the set might be dependent even though some of them ($q<n, \; q\le m$) are independent. Conversely, if one vector depends on another (or on two others, etc.), then the whole set is dependent.
In fact it is a common task, given $n$ vectors, to find which of them form an independent subset: that corresponds to a minor of the matrix with non-null determinant, the largest such minor giving the rank.
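One common way to extract such an independent subset is a greedy rank check, sketched below with NumPy (`independent_subset` is a hypothetical helper name, not a library function):

```python
import numpy as np

def independent_subset(vectors, tol=1e-10):
    """Greedily collect indices of a maximal linearly independent subset.

    A vector is kept iff appending it as a new column increases the rank
    of the matrix built from the vectors kept so far.
    """
    kept = []
    for i, v in enumerate(vectors):
        candidate = np.column_stack([vectors[j] for j in kept] + [v])
        if np.linalg.matrix_rank(candidate, tol=tol) > len(kept):
            kept.append(i)
    return kept

vs = [np.array([1.0, 0.0, 0.0]),
      np.array([2.0, 0.0, 0.0]),  # scalar multiple of the first -> skipped
      np.array([0.0, 1.0, 0.0])]
print(independent_subset(vs))  # prints [0, 2]
```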
$v_1=(1,0,0),\ v_2=(0,1,0),\ v_3=(1,1,0)$ is a simple counter-example in $\mathbb R^{3}$: the three vectors are pairwise independent, yet the set is dependent since $v_3 = v_1 + v_2$. For $\mathbb R^{n}$ you can make the obvious modification.
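This counter-example is easy to verify numerically (a sketch assuming NumPy):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])  # v3 = v1 + v2

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # prints 2: the set of three is dependent
print(np.linalg.matrix_rank(np.column_stack([v1, v3])))  # prints 2: each pair is independent
```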
Best Answer
In order to check whether every three columns are linearly independent, you would unfortunately have to examine all $\binom{4}{3} = 4$ subsets of $3$ columns.
As a side note, this question is related to computing the spark of a matrix (see here: http://en.wikipedia.org/wiki/Spark_%28mathematics%29). If every set of $3$ columns has full rank (rank equal to $3$), then the spark of $A$ in your case is equal to $4$: $4$ is the smallest number of columns that are linearly dependent. But if there exists a subset of $3$ columns that is linearly dependent, then the spark is at most $3$. Computing the spark is, in general, an NP-hard problem.
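For a matrix as small as $4 \times 4$, the exhaustive search is perfectly feasible; a brute-force spark computation can be sketched as follows (assuming NumPy; `spark` is a hypothetical helper name, and the example matrix is made up to satisfy $\det A = 0$ while every $3$ columns stay independent):

```python
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    """Smallest number of linearly dependent columns of A, by brute force."""
    n = A.shape[1]
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            if np.linalg.matrix_rank(A[:, list(cols)], tol=tol) < k:
                return k  # found k dependent columns
    return n + 1  # all columns independent (spark is conventionally infinite)

# det(A) = 0 (last row is zero), but every 3 columns have rank 3.
A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 0.0]])
print(spark(A))  # prints 4
```

The enumeration over column subsets is exactly why the general problem is hard: their number grows exponentially with the number of columns.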