[Math] Show that $Ax=0, Bx=0$ share the same solution space iff there is some invertible $P$ s.t. $B=PA$.

linear-algebra, matrices

The question is stated in the title: suppose $A,B\in M_{m\times n}(K)$, where $K$ is some infinite field.

If we regard $A,B$ as linear maps from $K^n$ to $K^m$, then the two systems share the same solution space $\Leftrightarrow$ $\ker A=\ker B$. And since the kernel is the orthogonal complement of the row space (with respect to the standard bilinear form on $K^n$), this is equivalent to $R(A)=R(B)$, where $R$ denotes the row space of a matrix. Therefore the row vector groups of $A$ and $B$ are equivalent, which means every row of $A$ can be expressed as a linear combination of the rows of $B$, and vice versa. But since $A,B$ may not have full row rank, I'm having trouble showing that such an invertible $P$ exists.
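To make the rank-deficient case concrete, here is a toy example I worked out (just an illustration, with $m=n=2$ and any field $K$):
$$A=\begin{bmatrix}1&0\\0&0\end{bmatrix},\qquad B=\begin{bmatrix}1&0\\1&0\end{bmatrix},\qquad P=\begin{bmatrix}1&0\\1&1\end{bmatrix}.$$
Both $A$ and $B$ have kernel $\{x:x_1=0\}$ and row space spanned by $(1,0)$, neither has full row rank, and indeed $PA=B$ with $P$ invertible (and far from unique).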

I believe I'm on the right track but may be missing something. Can you help me? Best regards!

EDIT: Thanks go to @DavidP, who provided a very good argument using RREF, by far slicker than any other approach I could imagine. But I am still looking for a way to complete my own proof, which I think can be made to work.
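(For reference, here is the RREF argument as I understood it; this is my own paraphrase, so any inaccuracy is mine. Row reduction is performed by invertible matrices, and the reduced row echelon form of a matrix is determined by its row space, so $R(A)=R(B)$ gives
$$E_A A=\operatorname{rref}(A)=\operatorname{rref}(B)=E_B B \quad\Longrightarrow\quad B=E_B^{-1}E_A\,A,$$
with $E_A,E_B\in M_m(K)$ invertible, and one may take $P=E_B^{-1}E_A$.)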

So the real question is, if two row vector groups
$$A=\begin{bmatrix} \alpha_1\\ \vdots\\ \alpha_m\end{bmatrix},\quad B=\begin{bmatrix} \beta_1\\ \vdots\\ \beta_m\end{bmatrix}$$
are equivalent, is it true that there exists some invertible $P\in M_m(K)$ such that
$$B=PA\,?$$

Best Answer

The field is irrelevant. Let $\{Bx_1,\ldots,Bx_k\}$ be a basis of the column space of $B$. Since $A$ and $B$ have the same null space, every nontrivial linear combination of $\{Ax_1,\ldots,Ax_k\}$ must be a nonzero vector: if $\sum_j c_jAx_j=0$, then $\sum_j c_jx_j$ lies in the null space of $A$, hence in that of $B$, so $\sum_j c_jBx_j=0$ and all $c_j=0$. In other words, $\{Ax_1,\ldots,Ax_k\}$ is a linearly independent set. Therefore there exists a bijective linear transformation $P$ (defined on $K^m$) that maps each $Ax_j$ to $Bx_j$; extend both linearly independent sets to bases of $K^m$ and map one basis to the other. As $A$ and $B$ have identical null spaces, it follows that $PA=B$.
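To spell out the last step (a routine check filled in here; notation as in the answer above): by rank-nullity, $\operatorname{rank}A=n-\dim\ker A=n-\dim\ker B=\operatorname{rank}B=k$, so the $k$ linearly independent vectors $Ax_1,\ldots,Ax_k$ form a basis of the column space of $A$. Given any $x\in K^n$, write $Ax=\sum_j c_jAx_j$; then $x-\sum_j c_jx_j\in\ker A=\ker B$, and therefore
$$Bx=\sum_j c_j\,Bx_j=\sum_j c_j\,P(Ax_j)=P\Big(\sum_j c_j\,Ax_j\Big)=P(Ax).$$
Since $x$ was arbitrary, $PA=B$. The converse direction is immediate: if $B=PA$ with $P$ invertible, then $Bx=0\Leftrightarrow Ax=0$.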