I have thought about your problem for a while now and I think there is a nice, slick way to do this. Consider the space $W$ of all alternating multilinear forms $f$ in $k$ variables
$$f : V \times \ldots \times V \to \Bbb{C}.$$
We claim that there is a canonical isomorphism between $W$ and $(\bigwedge^k V)^\ast$. Indeed, this should be clear because given any $f \in W$, the universal property of the $k$-th exterior power tells us that there is a unique linear map $g \in (\bigwedge^k V)^\ast$ such that $f = g \circ \iota$, where $\iota : V \times \ldots \times V \longrightarrow \bigwedge^k V$ is the canonical mapping that sends the tuple $(v_1,\ldots,v_k)$ to $v_1 \wedge \ldots \wedge v_k$. Conversely, given any $h \in (\bigwedge^k V)^\ast$ we can precompose it with $\iota$ to obtain a map $h \circ \iota : V \times \ldots \times V \to \Bbb{C}$, which is alternating and multilinear because $\iota$ is, i.e., an element of $W$.
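To spell the correspondence out (this is nothing more than the universal property restated): $f \in W$ corresponds to the unique $g \in (\bigwedge^k V)^\ast$ satisfying
$$f(v_1,\ldots,v_k) = g(v_1 \wedge \ldots \wedge v_k) \qquad \text{for all } v_1, \ldots, v_k \in V,$$
and $h \in (\bigwedge^k V)^\ast$ corresponds back to $h \circ \iota \in W$.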
In summary, these two constructions are mutually inverse, so they give a canonical isomorphism between $W$ and $(\bigwedge^k V)^\ast$. If we put $k = n$, where $n = \dim V$, then
$$ 1 = \binom{n}{n} = \dim_{\Bbb{C}} \bigwedge\nolimits^{\!n}V = \dim_{\Bbb{C}} \left(\bigwedge\nolimits^{\!n} V\right)^\ast $$
from which it follows that $W$ is one dimensional. In other words, any $f \in W$ is a scalar multiple of $\det$, where
$$\det : V \times V\times \ldots \times V \longrightarrow \Bbb{C}$$
is the mapping that sends the tuple $(v_1,\ldots, v_n)$ to the determinant of the matrix whose columns are the vectors $v_1, v_2, \ldots, v_n$. Now here comes the killer blow: suppose we demand that an alternating multilinear form $f$ satisfy $f(e_1,\ldots,e_n) = 1$, where the $e_i$ are the standard basis vectors of $\Bbb{C}^n$. Then because
$$f(v_1,\ldots,v_n) = c\cdot \det(v_1,\ldots,v_n)$$
for some constant $c$, shoving in $(v_1,\ldots,v_n) = (e_1,\ldots,e_n)$ we must have that
$$\begin{eqnarray*} 1 &=& f(e_1,\ldots, e_n) \\
&=& c\cdot \det(e_1,\ldots,e_n) \\
&=& c \end{eqnarray*}$$
because the determinant of the identity matrix is $1$. Consequently we have shown:
Any alternating multilinear form in $n = \dim V$ variables whose value on the tuple $(e_1,\ldots,e_n)$ is $1$ must be equal to the determinant.
$$\hspace{6in} \square$$
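As a quick sanity check, here is the $n = 2$ case worked out by hand. Writing $v_1 = a e_1 + c e_2$ and $v_2 = b e_1 + d e_2$ and expanding by multilinearity,
$$f(v_1, v_2) = ab\, f(e_1,e_1) + ad\, f(e_1,e_2) + cb\, f(e_2,e_1) + cd\, f(e_2,e_2) = (ad - bc)\, f(e_1,e_2),$$
since the alternating property gives $f(e_1,e_1) = f(e_2,e_2) = 0$ and $f(e_2,e_1) = -f(e_1,e_2)$. Imposing $f(e_1,e_2) = 1$ leaves exactly $f(v_1,v_2) = ad - bc$, the familiar determinant of the $2 \times 2$ matrix with columns $v_1$ and $v_2$.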
Here is a simple geometric explanation:
- $2$ vectors in the plane are linearly independent if and only if they span a parallelogram with a non-zero area
- $3$ vectors in 3D-space are linearly independent if and only if they span a parallelepiped with a non-zero volume
- $n$ vectors in $\mathbb{R}^n$ are linearly independent if and only if they span an $n$-dimensional parallelepiped with a non-zero volume
The determinant is a so-called "volume form": it assigns to $n$ vectors in $\mathbb{R}^n$ the $n$-dimensional volume of the parallelepiped spanned by those vectors (up to a sign, which gives rise to the so-called orientation).
So, if $n$ vectors in $\mathbb{R}^n$ are linearly dependent, they cannot span an $n$-dimensional parallelepiped and hence produce a volume of zero.
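A concrete instance (with numbers chosen just for illustration): take $v_1 = (1,2)$ and $v_2 = (2,4) = 2v_1$ in $\mathbb{R}^2$. The "parallelogram" they span degenerates to a line segment of zero area, and correspondingly
$$\det\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} = 1 \cdot 4 - 2 \cdot 2 = 0.$$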
If $r_1, \ldots, r_n$ are the rows of the matrix and $r_i = sa+tb$, where $s,t$ are scalars and $a,b$ are row vectors, then you have
$$\det\begin{pmatrix}r_1 \\ \vdots \\r_i \\ \vdots \\ r_n\end{pmatrix} = \det\begin{pmatrix}r_1 \\ \vdots \\ sa+tb \\ \vdots \\ r_n\end{pmatrix} = s\det\begin{pmatrix}r_1 \\ \vdots \\ a \\ \vdots \\ r_n\end{pmatrix} + t\det\begin{pmatrix}r_1 \\ \vdots \\ b \\ \vdots \\ r_n\end{pmatrix}$$
This holds for any row $i = 1, \ldots, n$, and the same applies to columns.
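For a concrete check (numbers chosen arbitrarily for illustration), take $n = 2$, $r_1 = (1,0)$, and split $r_2 = (3,5)$ as $2\,(1,2) + 1\,(1,1)$:
$$\det\begin{pmatrix} 1 & 0 \\ 3 & 5 \end{pmatrix} = 5, \qquad 2\det\begin{pmatrix} 1 & 0 \\ 1 & 2 \end{pmatrix} + 1\cdot\det\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = 2 \cdot 2 + 1 \cdot 1 = 5,$$
so the two sides agree, as they must.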