Efficiently and robustly determine whether a given set of vectors is linearly independent

gram-schmidt, independence, numerical-linear-algebra, svd

Say we're given a set of $d$ vectors $S=\{\mathbf{v}_1,\dots,\mathbf{v}_d\}$ in $\mathbb{R}^n$, with $d\leq n$ (obviously, since otherwise the set cannot be independent). We want to test efficiently whether $S$ is linearly independent. Write the matrix $\mathbf{A}=[\mathbf{v}_1 \dots \mathbf{v}_d]$, where the $\mathbf{v}_i$ are treated as column vectors.

An inefficient way would be to compute all minors of order $d$ of $\mathbf{A}$ and check that at least one is non-zero (up to some tolerance, as always in numerical linear algebra). Another way would be Gram–Schmidt orthogonalization, but I recall that classical Gram–Schmidt is numerically unstable. What is the right alternative? Singular value decomposition (if all $d$ singular values are strictly positive, i.e., above a tolerance, then the vectors are independent)? QR factorization?
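For reference, here is a minimal sketch of the SVD test in Python/NumPy; the function name is mine, and the tolerance convention (relative to the largest singular value, scaled by machine epsilon) is the one used by `numpy.linalg.matrix_rank`, not something prescribed by the question:

```python
import numpy as np

def is_linearly_independent(vectors, rtol=None):
    """Test whether the given vectors in R^n are linearly independent.

    vectors: iterable of d arrays of length n (d <= n).
    rtol: relative tolerance; defaults to max(n, d) * machine epsilon,
          the convention numpy.linalg.matrix_rank uses.
    """
    A = np.column_stack(vectors)             # n x d matrix [v_1 ... v_d]
    s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
    if rtol is None:
        rtol = max(A.shape) * np.finfo(A.dtype).eps
    # Independent iff all d singular values exceed the tolerance,
    # measured relative to the largest singular value.
    return bool(np.all(s > rtol * s[0]))

# Example: e_1, e_2 in R^3 are independent; adding e_1 + e_2 breaks it.
v1, v2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
print(is_linearly_independent([v1, v2]))            # True
print(is_linearly_independent([v1, v2, v1 + v2]))   # False
```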

Best Answer

Let $A$ be not the $n\times d$ matrix, but the $n\times n$ matrix $$A=[v_1 \ \ldots\ v_d\ \ \vec{0}\ \ldots\ \vec{0}]^t,$$ whose first $d$ rows are the $v_i^t$ and whose remaining rows are zero. Reduce $A$ to row echelon form by Gaussian elimination; this takes $O(n\cdot d)$ row operations. The number of non-zero rows of the resulting matrix equals the number of linearly independent vectors among the $v_i$, so the set is independent exactly when this count is $d$.
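A minimal sketch of this counting procedure in Python, assuming Gaussian elimination with partial pivoting and an explicit tolerance for "non-zero" (both the function name and the tolerance value are my own choices; the zero-row padding is omitted since it does not change the count):

```python
import numpy as np

def rank_by_elimination(vectors, tol=1e-10):
    """Count independent vectors by row-reducing the matrix whose rows
    are the v_i and counting the non-zero rows that remain."""
    A = np.array([np.asarray(v, dtype=float) for v in vectors])  # d x n
    d, n = A.shape
    rank, col = 0, 0
    while rank < d and col < n:
        # Partial pivoting: pick the largest entry in this column.
        pivot = rank + np.argmax(np.abs(A[rank:, col]))
        if np.abs(A[pivot, col]) <= tol:
            col += 1                            # column is (numerically) zero
            continue
        A[[rank, pivot]] = A[[pivot, rank]]     # swap pivot row into place
        # Eliminate all entries below the pivot in this column.
        A[rank+1:] -= np.outer(A[rank+1:, col] / A[rank, col], A[rank])
        rank += 1
        col += 1
    return rank

v1, v2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
print(rank_by_elimination([v1, v2]) == 2)                    # True
print(rank_by_elimination([v1, v2, [1.0, 1.0, 0.0]]) == 3)   # False
```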