Here is an easy argument. Let $x$ be the matrix
$$
x=\begin{bmatrix}v_1&v_2&\cdots&v_n\end{bmatrix}.
$$
Then
$$
x^*x=\begin{bmatrix}
v_1^*v_1&v_1^*v_2&\cdots&v_1^*v_n\\
v_2^*v_1&v_2^*v_2&\cdots&v_2^*v_n\\
\vdots & \vdots & \ddots & \vdots \\
v_n^*v_1&v_n^*v_2&\cdots&v_n^*v_n\\
\end{bmatrix}
=\begin{bmatrix}
\langle v_1, v_1 \rangle & \langle v_1, v_2\rangle & \cdots &\langle v_1, v_n \rangle \\
\langle v_2, v_1 \rangle & \langle v_2, v_2\rangle & \cdots &\langle v_2, v_n \rangle \\
\vdots & \vdots & \ddots & \vdots \\
\langle v_n, v_1 \rangle & \langle v_n, v_2\rangle & \cdots &\langle v_n, v_n \rangle
\end{bmatrix}.
$$
As $x^*x$ is positive-semidefinite, $\det x^*x\geq0$.
If $v_1,\ldots,v_n$ are linearly dependent, there exist coefficients $c_1,\ldots,c_n$, not all zero, with $c_1v_1+\cdots+c_nv_n=0$. Writing $c$ for the column vector with entries $c_1,\ldots,c_n$, this says $xc=0$ with $c\ne0$. But then $x^*xc=0$, so $x^*x$ has a nontrivial kernel and $\det x^*x=0$.
Conversely, if $\det x^*x=0$, then there exists a nonzero $c$ with $x^*xc=0$. But then $(xc)^*(xc)=c^*x^*xc=0$, so $xc=0$ and $v_1,\ldots,v_n$ are linearly dependent.
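This determinant criterion is easy to check numerically. Here is a small NumPy sketch with hypothetical example vectors of my own choosing (the names `x_indep`, `x_dep`, and `gram_det` are my own, not from the argument above):

```python
import numpy as np

# Hypothetical example data: the columns of each matrix are v_1, v_2, v_3.
x_indep = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 1.0]])

x_dep = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0]])  # v_3 = v_1 + v_2: columns are dependent

def gram_det(x):
    """Return det(x* x), the Gram determinant of the columns of x."""
    return np.linalg.det(x.conj().T @ x)

print(gram_det(x_indep))  # nonzero, since the columns are independent
print(gram_det(x_dep))    # 0 (up to rounding), since the columns are dependent
```

The conjugate transpose `x.conj().T` keeps the sketch valid over the complex numbers as well; for real matrices it is just the transpose.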
I've since found a better solution to the problem:
Proof: $(\implies)$ Assume the two sets of vectors span the same $k$-dimensional subspace of $(\mathbb{R}^n)^*$. Then each $f_i$ can be written as a linear combination of the $g_j$ in a unique way, since $\{g_1, g_2, \dots, g_k\}$ is a basis for span$(g_1, g_2, \dots, g_k)$. Write
\begin{eqnarray*}
f_i = \sum_j c_{ij}g_j & & \textrm{for} \ 1 \leq i \leq k.
\end{eqnarray*}
Observe that the $k \times k$ matrix $(c_{ij})$ is a change-of-basis matrix representing the basis $\beta_1 = \{f_1, f_2, \dots, f_k\}$ in terms of the second basis $\beta_2 = \{g_1, g_2, \dots, g_k\}$ for the space span$(g_1, g_2, \dots, g_k)$ = span$(f_1, f_2, \dots, f_k)$.
We now have
\begin{eqnarray*}
f_1 \wedge f_2 \wedge \cdots \wedge f_k &=&
\left(\sum_j c_{1j}g_j\right) \wedge \left(\sum_j c_{2j}g_j\right) \wedge \cdots \wedge \left(\sum_j c_{kj}g_j\right) \\
&=& \det(c_{ij})\, g_1 \wedge g_2 \wedge \cdots \wedge g_k,
\end{eqnarray*}
where the second equality follows by multilinearity and antisymmetry of $\wedge$. Since $(c_{ij})$ is a change-of-basis matrix, $\det(c_{ij}) \neq 0$.
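For $k=2$ the computation can be written out explicitly (a worked instance, using only $g_i \wedge g_i = 0$ and $g_2 \wedge g_1 = -\,g_1 \wedge g_2$):
\begin{eqnarray*}
f_1 \wedge f_2 &=& (c_{11}g_1 + c_{12}g_2) \wedge (c_{21}g_1 + c_{22}g_2) \\
&=& c_{11}c_{22}\, g_1 \wedge g_2 + c_{12}c_{21}\, g_2 \wedge g_1 \\
&=& (c_{11}c_{22} - c_{12}c_{21})\, g_1 \wedge g_2 \ =\ \det(c_{ij})\, g_1 \wedge g_2.
\end{eqnarray*}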
$(\impliedby)$ Assume now that $f_1 \wedge f_2 \wedge \cdots \wedge f_k = c \cdot g_1 \wedge g_2 \wedge \cdots \wedge g_k$ for some scalar $c \neq 0$. By way of contradiction, assume additionally that span$(g_1, g_2, \dots, g_k) \neq$ span$(f_1, f_2, \dots, f_k)$. Then we may find a one-form $h \in$ span$(g_1, g_2, \dots, g_k) \setminus$ span$(f_1, f_2, \dots, f_k)$ such that $\{f_1, f_2, \dots, f_k, h\}$ is a linearly independent set, and so it follows that $f_1 \wedge f_2 \wedge \cdots \wedge f_k \wedge h \neq 0$. This leads to a contradiction:
\begin{eqnarray*}
(f_1 \wedge f_2 \wedge \cdots \wedge f_k) \wedge h &=&
c \cdot (g_1 \wedge g_2 \wedge \cdots \wedge g_k) \wedge h \\
&=& 0,
\end{eqnarray*}
because $\{g_1, g_2, \dots, g_k, h\}$ is a linearly dependent set. $\square$
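This proposition can also be checked concretely in coordinates: writing the $f_i$ as the rows of a $k \times n$ matrix, the wedge $f_1 \wedge \cdots \wedge f_k$ is determined by the vector of all $k \times k$ minors of that matrix (its Plücker coordinates), and two such wedges are proportional exactly when these minor vectors are. A NumPy sketch with hypothetical example covectors (the helper `pluecker` and the matrices `F`, `G` are my own illustration):

```python
import numpy as np
from itertools import combinations

def pluecker(M):
    """All k x k minors of the k x n matrix M: the coordinates of the
    wedge of its rows in the standard basis of Lambda^k (R^n)^*."""
    k, n = M.shape
    return np.array([np.linalg.det(M[:, list(cols)])
                     for cols in combinations(range(n), k)])

# Hypothetical covectors in (R^4)^*, written as rows.
F = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0]])
# g_1 = f_1 + f_2 and g_2 = f_1 - f_2 span the same plane as f_1, f_2.
G = np.array([[1.0, 1.0, 0.0, 2.0],
              [1.0, -1.0, 0.0, 0.0]])

pF, pG = pluecker(F), pluecker(G)
print(pF)  # coordinates of f_1 ∧ f_2
print(pG)  # a nonzero scalar multiple of pF (here pG == -2 * pF)
```

The scalar relating the two minor vectors is exactly the determinant of the change-of-basis matrix from the proof above.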
Best Answer
Let $v_i=\sum_{k=1}^n a_{ki}e_k$, where $\{e_1,\dots,e_n\}$ is an orthonormal basis. Then $$ \langle v_i,v_j\rangle=\sum_{k}\sum_{l}a_{ki}a_{lj}\langle e_k,e_l\rangle= \sum_{k}a_{ki}a_{kj} $$ and therefore, if $A=[a_{ij}]$, we have $G=A^TA$.
If $Gx=0$, then also $x^TA^TAx=(Ax)^T(Ax)=0$, so $Ax=0$.
The matrix $A$ is the matrix (with respect to the basis $\{e_1,\dots,e_n\}$) of the linear map defined by $e_i\mapsto v_i$, for $i=1,\dots,n$. The rank of this matrix is $n$ if and only if $\{v_1,\dots,v_n\}$ is linearly independent.
If the set is linearly independent, then $Ax=0$ implies $x=0$; thus $Gx=0$ implies $x=0$ and $G$ has rank $n$.
If the set is not linearly independent, then there is $x\ne0$ with $Ax=0$, so also $Gx=0$ and $G$ is not invertible.
Notes. If the inner product is over the complex numbers, then $G=A^HA$ (the Hermitian transpose), and the argument goes through using the Hermitian transpose in place of the transpose. An orthonormal basis exists by the Gram--Schmidt algorithm.
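As a quick numerical sanity check of the rank argument, here is a sketch with hypothetical coordinate matrices of my own choosing (the columns of `A` hold the coordinates of $v_1,\dots,v_n$ in the orthonormal basis):

```python
import numpy as np

A_indep = np.array([[2.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 3.0]])  # independent columns

A_dep = np.array([[2.0, 0.0, 2.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0]])    # v_3 = v_1 + v_2: dependent columns

for A in (A_indep, A_dep):
    G = A.T @ A  # real case; use A.conj().T @ A over the complex numbers
    # rank(G) agrees with rank(A), as the argument above shows
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(G))
```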