I'm curious: can ANY matrix transformation turn a matrix with linearly dependent columns into one whose columns are linearly independent, i.e. whose kernel is trivial? For example, if $A$ is a linearly dependent matrix and $B$ is any matrix, could $BA$ ever come out linearly independent?
[Math] Can a matrix transformation ever make a linearly dependent matrix linearly independent
linear algebra, matrices
Related Solutions
Given $A\in\mathbb{R}^{m\times n}$, $m\geq n$, compute the (economy) QR factorisation. This gives $$ A = QR, \quad R\in\mathbb{R}^{n\times n}. $$ Now if $\mathrm{rank}(A)<n$, the upper triangular matrix $R$ has a staircase profile with some of the "steps" of the staircase extending over more than one column. Select column indices $j_1,\ldots,j_k$ such that removing these columns from $R$ (together with any trailing zero rows) leaves a nonsingular upper triangular matrix; you can think of this as making each step of the staircase have length 1. The columns $j_1,\ldots,j_k$ can then be expressed as linear combinations of the remaining columns.
Example: The red columns indicate the columns which are linear combinations of the others.
$$ \begin{bmatrix} \times & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\ 0 & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\ 0 & 0 & \color{red}0 & \times & \color{red}\times & \color{red}\times \end{bmatrix} $$
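As a rough illustration of this column-selection procedure, here is a minimal numerical sketch (assuming NumPy; the helper name `dependent_columns`, the tolerance, and the example matrix are made up for the illustration). It scans the staircase of $R$ row by row; in floating point a rank-revealing QR with column pivoting is the more robust tool.

```python
import numpy as np

def dependent_columns(A, tol=1e-12):
    """Indices of columns of A that are linear combinations of the
    columns to their left, read off from the staircase profile of R
    in the (economy) QR factorisation A = QR.  Sketch only: a QR with
    column pivoting is more reliable in floating-point arithmetic."""
    m, n = A.shape
    _, R = np.linalg.qr(A)           # R is n x n upper triangular (m >= n)
    pivots, row = [], 0
    for j in range(n):
        # column j starts a new "step" of the staircase iff it has a
        # sufficiently large entry in the next unused row of R
        if row < R.shape[0] and abs(R[row, j]) > tol:
            pivots.append(j)
            row += 1
    return [j for j in range(n) if j not in pivots]

# hypothetical example: a 4x4 matrix whose 3rd column is the sum of the first two
A = np.array([[1., 0., 1., 2.],
              [0., 1., 1., 0.],
              [1., 1., 2., 3.],
              [0., 2., 2., 1.]])
print(dependent_columns(A))          # -> [2] (0-based indexing)
```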
Example: For the given matrix from the question, the QR factorisation is:
$$ Q = \begin{bmatrix} 0 & -0.4472 & -0.8944 \\ 0 & -0.8944 & 0.4472 \\ -1.0000 & 0 & 0 \end{bmatrix}, \qquad R = \begin{bmatrix} -1.0000 & 2.0000 & -1.0000 \\ 0 & 4.4721 & -2.2361 \\ 0 & 0 & 0 \end{bmatrix} $$
So one can remove either column 2 or column 3 (and drop the zero last row) to obtain a nonsingular upper triangular matrix; hence either column 2 or column 3 is a linear combination of the others.
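To make the dependency concrete, one can check directly from the printed $R$ that its third column is $-\tfrac12$ times its second column (up to the rounding of the displayed digits); since $A = QR$, the same relation holds between the corresponding columns of $A$. A quick check (assuming NumPy):

```python
import numpy as np

# R as printed above, rounded to four decimals
R = np.array([[-1.0000,  2.0000, -1.0000],
              [ 0.0000,  4.4721, -2.2361],
              [ 0.0000,  0.0000,  0.0000]])

# third column ~ -1/2 * second column, so it is a combination of the others;
# multiplying by Q on the left (A = QR) carries this relation over to A
print(R[:, 2] + 0.5 * R[:, 1])       # ~ [0, 0, 0] up to rounding
```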
Why are the columns of
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix}$ linearly dependent?
Because there exists non-zero $x$ such that
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix} x = 0$
i.e.
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix} 0\\0\\0\\1\end{bmatrix} = \begin{bmatrix} 0\\0\\0\end{bmatrix}$
How do you prove that any $3\times4$ matrix has linearly dependent columns?
Suppose the columns of your matrix are $\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4,$ and suppose that $\mathbf v_1,\mathbf v_2,\mathbf v_3$ are linearly independent (if they are not, the columns are already linearly dependent and we are done). Then we want to show that there exist $a,b,c$ such that $a\mathbf v_1 + b\mathbf v_2 + c\mathbf v_3 = \mathbf v_4.$
How to do that? It might help to show that there exist $a_1,b_1,c_1$ such that:
$a_1\mathbf v_1 + b_1\mathbf v_2 + c_1\mathbf v_3 = \begin{bmatrix} 1\\0\\0\end{bmatrix}$
and similarly there are $a_2,b_2, c_2$ and $a_3, b_3, c_3$ such that
$a_2\mathbf v_1 + b_2\mathbf v_2 + c_2\mathbf v_3 = \begin{bmatrix} 0\\1\\0\end{bmatrix}$ and
$a_3\mathbf v_1 + b_3\mathbf v_2 + c_3\mathbf v_3 = \begin{bmatrix} 0\\0\\1\end{bmatrix}$
And certainly $\mathbf v_4$ can be written as a combination of $\begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix},\begin{bmatrix} 0\\0\\1\end{bmatrix}.$
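A numerical version of this argument, under the assumption that the first three columns are linearly independent (the matrix below is a hypothetical example, not from the question): solving a $3\times 3$ system produces the coefficients $a, b, c$ explicitly.

```python
import numpy as np

# columns of an illustrative 3x4 matrix; v1, v2, v3 happen to be independent
v1, v2, v3, v4 = np.array([[1., 0., 2., 3.],
                           [0., 1., 1., 4.],
                           [0., 0., 1., 5.]]).T

# Because v1, v2, v3 are independent they form a basis of R^3,
# so the 3x3 system [v1 v2 v3] [a b c]^T = v4 has a (unique) solution.
a, b, c = np.linalg.solve(np.column_stack([v1, v2, v3]), v4)
print(a, b, c)                            # coefficients of the dependency
print(a * v1 + b * v2 + c * v3 - v4)      # ~ [0, 0, 0]
```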
Best Answer
No, it cannot. Let $$ A \in K^{m\times n}, \quad B \in K^{p\times m}. $$ If the columns of $A$ are linearly dependent, there exist $\lambda_k\in K$, not all equal to $0$, with $$ \sum_{k=1}^n \lambda_k a_k = 0, $$ where $A=(a_1,a_2,\ldots,a_n)$; equivalently, there is a non-zero vector $x_0 = (\lambda_k)\ne 0$ with $$ A x_0 = 0. $$ For $BA$ we then have $$ (BA)x_0 = B(Ax_0) = B\,0 = 0, $$ so regardless of the choice of $B$, the vector $x_0$ is a non-zero kernel vector for $BA$, and hence the columns of $BA$ cannot be linearly independent.
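For a concrete sanity check of this argument (the matrices here are made up for illustration, not taken from the question): take any $A$ with a non-zero kernel vector $x_0$ and multiply on the left by a few random matrices $B$; the product $(BA)x_0$ is always zero, so the columns of $BA$ are always dependent.

```python
import numpy as np

A  = np.array([[1., 2., 3.],
               [4., 5., 6.]])           # column 3 = 2*column 2 - column 1
x0 = np.array([1., -2., 1.])            # nonzero kernel vector: A @ x0 = 0

rng = np.random.default_rng(0)
for _ in range(5):                       # a few random choices of B
    B = rng.standard_normal((4, 2))
    assert np.allclose((B @ A) @ x0, 0)  # x0 stays in the kernel of B @ A
```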