[Math] Show that a matrix has full column rank

linear-algebra · matrices · matrix-rank

Consider a matrix $A$ of size $T\times 2K$. Consider the collection of numbers $k_1,k_2,…,k_T$, where each $k_t\in \{1,…,K\}$.

The $t$-th row of the matrix $A$ is structured as follows:

  • its $k_t$-th element is equal to $1$

  • its $(k_t+K)$-th element is equal to some scalar $\delta_t>0$

  • all other elements are zero.

Further, $A$ does not contain zero columns.

I believe that the following holds:

$A$ has full column rank $2K$ if and only if, for every $k\in \{1,…,K\}$, there exist $t\in \{1,…,T\}$ and $\tau\in \{1,…,T\}$ with $t\neq \tau$ such that
$$
k_t=k_{\tau}=k \quad\text{and}\quad \delta_t\neq \delta_{\tau}.
$$

Question: I wrote a proof for this claim (below), which looks a bit "naive" to me. Is there anything more formal and, perhaps, simpler that you can suggest?
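As a quick numerical sanity check (not part of any proof), one can build small instances of $A$ and compare the computed rank against the condition. The helper below is just an illustration using NumPy; the function name `build_A` is mine.

```python
import numpy as np

def build_A(ks, deltas, K):
    """Build the T x 2K matrix: row t has a 1 in column k_t
    and delta_t in column k_t + K (1-based indices k_t)."""
    T = len(ks)
    A = np.zeros((T, 2 * K))
    for t, (k, d) in enumerate(zip(ks, deltas)):
        A[t, k - 1] = 1.0        # unit entry at position k_t
        A[t, k - 1 + K] = d      # scalar delta_t > 0 at position k_t + K
    return A

K = 2
# every k repeated with distinct deltas -> full column rank 2K = 4
A_full = build_A([1, 1, 2, 2], [1.0, 2.0, 3.0, 4.0], K)
# k = 2 appears twice but with equal deltas -> column 4 = 3 * column 2
A_deficient = build_A([1, 1, 2, 2], [1.0, 2.0, 3.0, 3.0], K)

print(np.linalg.matrix_rank(A_full))       # 4
print(np.linalg.matrix_rank(A_deficient))  # 3
```

The deficient example also shows why the condition has to hold for *every* $k$: the pair $t=1,\tau=2$ satisfies $k_1=k_2$, $\delta_1\neq\delta_2$, yet the matrix is still rank-deficient because the $\delta$'s for $k=2$ coincide.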


My proof:

$A$ has full column rank $2K$ if and only if the following system admits the unique solution $x=0_{2K\times 1}$:
$$
(*)\hspace{1cm} x_{k_t} + \delta_{t} x_{k_t+K}=0 \quad \text{ for each $t\in \{1,…,T\}$}.
$$

First, I prove sufficiency.
If there exist $t,\tau$ such that $k_t=k_{\tau}\equiv k$ with $\delta_{t}\neq\delta_{\tau}$, then system $(*)$ contains the equations
$$
\begin{aligned}
& x_{k}+\delta_t x_{k+K}=0,\\
& x_{k}+\delta_{\tau} x_{k+K}=0,\\
\end{aligned}
$$

from which it follows that $x_{k}=x_{k+K}=0$. If the above is true for every $k\in \{1,…,K\}$, then $x_{k}=x_{k+K}=0$ for each $k\in \{1,…,K\}$. Hence, the unique solution of system $(*)$ is $x=0_{2K\times 1}$.
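The step from the two displayed equations to $x_k=x_{k+K}=0$ can be made slightly more formal via a determinant: the two equations form a $2\times 2$ linear system in $(x_k,x_{k+K})$ with coefficient matrix
$$
\begin{pmatrix} 1 & \delta_t \\ 1 & \delta_{\tau} \end{pmatrix},
\qquad
\det\begin{pmatrix} 1 & \delta_t \\ 1 & \delta_{\tau} \end{pmatrix}=\delta_{\tau}-\delta_t\neq 0,
$$
so that system admits only the trivial solution.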

Now, I prove necessity (by contraposition).
Suppose first that some $k$ appears only once, i.e. there is exactly one $t$ with $k_t=k$. Then system $(*)$ contains only one equation involving either of $x_k,x_{k+K}$, namely
$$
x_{k}+\delta_t x_{k+K}=0,
$$

which is satisfied by any values of $x_k, x_{k+K}$ with $x_k=-\delta_t x_{k+K}$. Similarly, if $k$ appears in several rows but with the same value $\delta_t\equiv \delta$ in all of them, then all the corresponding equations reduce to the single equation $x_k+\delta x_{k+K}=0$, which again admits nonzero solutions. In either case system $(*)$ does not have a unique solution, so $A$ does not have full column rank.

Best Answer

Full column rank is equivalent to the column vectors being linearly independent.

After reordering the rows if necessary (which does not change the rank) your matrix can be written in this form (where the $*$'s are positive numbers): $$\begin{pmatrix} 1 & 0 & 0 & \cdots & 0 & * & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & \vdots & & \vdots \\ 1 & 0 & 0 & \cdots & 0 & * & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 & 0 & * & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & \vdots & & \vdots \\ 0 & 1 & 0 & \cdots & 0 & 0 & * & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 & 0 & 0 & * & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & \vdots & & \vdots \\ 0 & 0 & 1 & \cdots & 0 & 0 & 0 & * & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 & 0 & 0 & 0 & \cdots & * \\ \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & \vdots & & \vdots \\ 0 & 0 & 0 & \cdots & 1 & 0 & 0 & 0 & \cdots & * \\ \end{pmatrix}$$

or, leaving out the zeros, $$\begin{pmatrix} 1 & & & & & * & & & & \\ \vdots& & & & & \vdots & & & & \\ 1 & & & & & * & & & & \\ & 1 & & & & & * & & & \\ & \vdots & & & & & \vdots & & & \\ & 1 & & & & & * & & & \\ & & 1 & & & & & * & & \\ & & \vdots & & & & & \vdots & & \\ & & 1 & & & & & * & & \\ & & & \ddots& & & & & \ddots & \\ & & & & 1 & & & & & * \\ & & & & \vdots & & & & & \vdots \\ & & & & 1 & & & & & * \\ \end{pmatrix}$$

Now it's clear that the column vectors can't be linearly independent if all the $*$'s at some "1"-position are equal: the $(k+K)$-th column vector would then be a multiple of the $k$-th one, for that $k$. Conversely, if at every "1"-position there are at least two rows and the $*$'s there are not all equal, the columns are clearly linearly independent.
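The argument can be compressed into one formula (notation $T_k$ and $\boldsymbol\delta^{(k)}$ is introduced here, not in the question). Permuting the columns as well, so that columns $k$ and $k+K$ sit next to each other, also preserves the rank and turns the matrix into a block-diagonal form
$$
A \sim \begin{pmatrix} B_1 & & \\ & \ddots & \\ & & B_K \end{pmatrix},
\qquad
B_k=\begin{pmatrix}\mathbf{1}_{T_k} & \boldsymbol\delta^{(k)}\end{pmatrix}\in\mathbb{R}^{T_k\times 2},
$$
where $T_k$ is the number of rows with $k_t=k$ and $\boldsymbol\delta^{(k)}$ collects the corresponding $\delta_t$'s. Hence $\operatorname{rank}A=\sum_{k=1}^{K}\operatorname{rank}B_k$, and $\operatorname{rank}B_k=2$ exactly when $\boldsymbol\delta^{(k)}$ is not a constant vector, i.e. when the $\delta_t$'s for that $k$ are not all equal.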