If the matrix $A$ has rank $k$, then it has $k$ linearly independent rows. These form a $k\times n$ submatrix, which of course also has rank $k$. But since that submatrix has rank $k$, it has $k$ linearly independent columns, and those form a $k\times k$ submatrix of $A$, which again has rank $k$. A $k\times k$ submatrix of rank $k$ is a full-rank square matrix, hence invertible, and thus it has a non-zero determinant. Therefore the determinant rank is at least $k$.
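A quick numerical sanity check of this argument (the matrix and tolerance below are my own illustration, not from the question): take a rank-$2$ matrix and search its $2\times 2$ submatrices for one with non-zero determinant.

```python
import itertools
import numpy as np

# A 3x4 matrix of rank 2 (the third row is the sum of the first two).
A = np.array([[1., 2., 3., 4.],
              [0., 1., 1., 0.],
              [1., 3., 4., 4.]])
k = np.linalg.matrix_rank(A)  # 2

# Search all k x k submatrices for one with non-zero determinant,
# as the argument predicts must exist.
found = [
    (rows, cols)
    for rows in itertools.combinations(range(A.shape[0]), k)
    for cols in itertools.combinations(range(A.shape[1]), k)
    if abs(np.linalg.det(A[np.ix_(rows, cols)])) > 1e-9
]
print(k, len(found) > 0)  # 2 True
```

So the determinant rank of this $A$ is at least $2$, matching its rank.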
Incomplete thoughts:
Thinking of the elements as written in an $n \times m$ grid, send it to the element
$\alpha = \bigotimes_{j=1}^m (v_{1j} \wedge \cdots \wedge v_{nj}) \otimes \bigotimes_{i=1}^n (w_{i1} \wedge \cdots \wedge w_{im})$.
(so, putting the $v$'s together in columns and the $w$'s in rows.)
To see that this is well-defined, it suffices to show that it is zero if two adjacent terms in the long wedge product are equal.
For instance if $v_{11} \otimes w_{11} = v_{12} \otimes w_{12}$, then either both are $0$, in which case we're done, or $v_{11} = \lambda v_{12}$, $w_{11} = \frac{1}{\lambda} w_{12}$ for some scalar $\lambda \ne 0$. Then the second equality forces $\alpha = 0$ by the $i=1$ factor on the $W$ part. This covers all the cases where we exchange two elements in the same row. (It also covers cases where we exchange elements in the same column. So maybe it is already sufficient.)
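The key fact used here, that two equal non-zero rank-one tensors have proportional factors, can be checked concretely with outer products (the vectors and the scalar below are my own example):

```python
import numpy as np

# Build two equal rank-one tensors v1 (x) w1 = v2 (x) w2 with
# v1 = lam * v2 and w1 = w2 / lam for a non-zero scalar lam.
v2 = np.array([1., 2., 0.])
w2 = np.array([3., -1.])
lam = 2.5
v1 = lam * v2
w1 = w2 / lam

T1 = np.outer(v1, w1)
T2 = np.outer(v2, w2)
assert np.allclose(T1, T2)  # equal as tensors, though the factors differ

# Recover the scalar: v1 = lam * v2 forces lam = v1[i] / v2[i]
# at any index i where v2[i] != 0.
i = np.nonzero(v2)[0][0]
print(v1[i] / v2[i])  # 2.5
```

This is only the easy direction (proportional factors give equal tensors); the statement in the argument is the converse, which holds because a rank-one tensor determines its factors up to exactly this rescaling.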
But there are still the "line break" equalities to consider, and I'm not sure how to complete the argument, sorry. (This feels very reminiscent of Fulton's proof of Sylvester's Lemma in his Young Tableaux book, with a clever recursive argument for this last case.)
Edit: Here's a thought. Rather than using "line break" equalities, we'll go through the entries of the grid in a back-and-forth order. So, first we consider the equalities along the first row. Then, we consider the equality
$$v_{1m} \otimes w_{1m} = v_{2m} \otimes w_{2m},$$
comparing the "last entries in the first two rows". It's clear that $\alpha = 0$ in this case since it is a "column equality" (we use the $j=m$ factor -- the last one -- in the $V$ part). Then we work backwards along row 2, then forward along row 3, and so on. Thus at every step, we are either using a "row equality" or a "column equality" to conclude that $\alpha = 0$, so at the end, we conclude that the expression is alternating in all $n\cdot m$ wedges.
Best Answer
Hint: Suppose that $PAP^{-1}$ is upper triangular, as is $QBQ^{-1}$. Then $$ (P \otimes Q)(A \otimes B)(P \otimes Q)^{-1} = (PAP^{-1}) \otimes (QBQ^{-1}) $$ is a tensor product of upper-triangular matrices. Note that this product is itself upper-triangular.
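A small numerical check of the hint, using `np.kron` as the tensor product of matrices (the particular matrices are my own illustration): the Kronecker product of upper-triangular matrices is upper-triangular, with diagonal entries $a_{ii} b_{jj}$.

```python
import numpy as np

# Upper-triangular representatives (think of these as PAP^{-1} and QBQ^{-1}).
A = np.array([[2., 1.],
              [0., 3.]])
B = np.array([[5., 4.],
              [0., 7.]])

K = np.kron(A, B)  # the tensor (Kronecker) product, a 4x4 matrix

# The product is itself upper-triangular...
assert np.allclose(K, np.triu(K))

# ...and its diagonal consists of the products a_ii * b_jj,
# i.e. the eigenvalues of A tensor B.
print(np.diag(K))  # [10. 14. 15. 21.]
```

So the eigenvalues of $A \otimes B$ are exactly the products of the eigenvalues of $A$ with those of $B$, which is the point of triangularizing first.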