By choosing an orthonormal basis, one may assume without loss of generality that $v_1 = (a,0,\ldots,0)^\top$ and $v_2 = (b,c,0,\ldots,0)^\top$, where $a = \|v_1\|$, $b = \langle v_1,v_2 \rangle/a$, and $c = \sqrt{\|v_2\|^2 - b^2}$. Note that $a,b,c$ are all nonzero. The top left $2 \times 2$ submatrix of $S$ is then $$\begin{pmatrix} a^2 + b^2 & bc \\ bc & c^2 \end{pmatrix},$$ and all other entries are $0$. We may therefore ignore the rest of the matrix and assume $N=2$. The nonzero eigenvalues are the roots of the characteristic polynomial $\lambda^2 - (a^2 + b^2 + c^2) \lambda + a^2 c^2$. They are indeed distinct, since the discriminant satisfies $(a^2 + b^2 + c^2)^2 - 4 a^2 c^2 = (a^2 - c^2)^2 + 2 b^2 (a^2 + c^2) + b^4 > 0$. An eigenvector for the eigenvalue $r$ is $x = (x_1,x_2)^\top$, where $(a^2 + b^2 - r) x_1 + b c\, x_2 = 0$.
If $x_1 = 0$, then $b c\, x_2 = 0$ would force $x_2 = 0$ (since $b, c \ne 0$), contradicting $x \ne 0$; hence $\langle x, v_1 \rangle = a x_1 \ne 0$. By symmetry (we could have presented $v_2$ before $v_1$), $\langle x, v_2 \rangle \ne 0$ as well.
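The computation above can be sanity-checked numerically. A minimal sketch, with arbitrary nonzero values of $a, b, c$ chosen purely for illustration:

```python
import numpy as np

# Arbitrary nonzero a, b, c (illustrative choice, not from the original).
a, b, c = 2.0, 1.0, 3.0
v1 = np.array([a, 0.0])
v2 = np.array([b, c])

# S = v1 v1^T + v2 v2^T, reduced to the 2x2 case discussed above.
S = np.outer(v1, v1) + np.outer(v2, v2)

# Its eigenvalues should be the roots of
# lambda^2 - (a^2 + b^2 + c^2) lambda + a^2 c^2.
eigvals, eigvecs = np.linalg.eigh(S)
roots = np.sort(np.roots([1.0, -(a**2 + b**2 + c**2), a**2 * c**2]))
assert np.allclose(np.sort(eigvals), roots)

# The eigenvalues are distinct, and each eigenvector x satisfies
# <x, v1> != 0 and <x, v2> != 0.
assert abs(eigvals[0] - eigvals[1]) > 1e-9
for k in range(2):
    x = eigvecs[:, k]
    assert abs(x @ v1) > 1e-9 and abs(x @ v2) > 1e-9
```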
Let $v_i=\sum_{k=1}^n a_{ki}e_k$, where $\{e_1,\dots,e_n\}$ is an orthonormal basis. Then
$$
\langle v_i,v_j\rangle=\sum_{k}\sum_{l}a_{ki}a_{lj}\langle e_k,e_l\rangle=
\sum_{k}a_{ki}a_{kj}
$$
and therefore, if $A=[a_{ij}]$, we have $G=A^TA$.
If $Gx=0$, then also $x^TA^TAx=(Ax)^T(Ax)=0$, so $Ax=0$.
The matrix $A$ is the matrix (with respect to the basis $\{e_1,\dots,e_n\}$) of the linear map defined by $e_i\mapsto v_i$, for $i=1,\dots,n$. The rank of this matrix is $n$ if and only if $\{v_1,\dots,v_n\}$ is linearly independent.
If the set is linearly independent, then $Ax=0$ implies $x=0$; thus $Gx=0$ implies $x=0$ and $G$ has rank $n$.
If the set is not linearly independent, then there is $x\ne0$ with $Ax=0$, so also $Gx=0$ and $G$ is not invertible.
Notes. If the inner product is over the complex numbers, then $G=A^HA$ (the conjugate transpose), and the argument goes through with the conjugate transpose in place of the transpose. An orthonormal basis exists by the Gram–Schmidt process.
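The rank argument can also be checked numerically. A small sketch, with the vectors (columns of $A$) chosen purely for illustration:

```python
import numpy as np

# Columns of A are the coordinate vectors of v_1, ..., v_n in an
# orthonormal basis; the Gram matrix is then G = A^T A.
A_indep = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 1.0]])       # linearly independent columns
A_dep = A_indep.copy()
A_dep[:, 2] = A_dep[:, 0] + A_dep[:, 1]     # third vector made dependent

for A in (A_indep, A_dep):
    G = A.T @ A
    # rank(G) = rank(A): G is invertible iff the v_i are independent.
    assert np.linalg.matrix_rank(G) == np.linalg.matrix_rank(A)

assert np.linalg.matrix_rank(A_indep.T @ A_indep) == 3  # full rank
assert np.linalg.matrix_rank(A_dep.T @ A_dep) == 2      # rank deficient
```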
Best Answer
Take the equation
$$ AX-XA=C. $$
Pick eigenvectors $v_i, v_j$ and compute
$$ \langle v_i,(AX-XA)v_j\rangle=\langle v_i,AXv_j\rangle-\langle v_i,XAv_j\rangle. $$
Since $A$ is symmetric, we can move it onto $v_i$ in the first term, which gives
$$ \langle v_i,(AX-XA)v_j\rangle=(\lambda_i-\lambda_j)\langle v_i,Xv_j\rangle. $$
Motivated by this, we define $X$ by
$$ \langle v_i,Xv_j\rangle=\frac{1}{\lambda_i-\lambda_j}\langle v_i,Cv_j\rangle $$
for $i\neq j$, and $0$ otherwise. This is well defined, since the eigenvalues are distinct and the numbers $\langle v_i,Xv_j\rangle$ are precisely the components of $X$ in the orthonormal eigenbasis. Now
$$ \langle v_i,(AX-XA)v_j\rangle=(\lambda_i-\lambda_j)\langle v_i,Xv_j\rangle=\langle v_i,Cv_j\rangle $$
for $i\neq j$, while for $i=j$ the left side is automatically $0$, matching the required condition $\langle v_i,Cv_i\rangle=0$ on $C$. Since the $\langle v_i,(AX-XA)v_j\rangle$ are the components of $AX-XA$ in this orthonormal basis and they agree with the components of $C$, we conclude $AX-XA = C$.
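The construction can be verified numerically. A minimal sketch, assuming (as the argument requires) that $A$ is symmetric with distinct eigenvalues and that the diagonal components $\langle v_i, C v_i\rangle$ vanish; the random matrices below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Symmetric A; a generic random symmetric matrix has distinct eigenvalues.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
lam, V = np.linalg.eigh(A)   # columns of V: orthonormal eigenvectors v_i

# C with zero diagonal components in the eigenbasis (assumed hypothesis,
# needed for the i = j case of the construction).
Chat = V.T @ rng.standard_normal((n, n)) @ V
np.fill_diagonal(Chat, 0.0)
C = V @ Chat @ V.T

# Define X by <v_i, X v_j> = <v_i, C v_j> / (lam_i - lam_j) for i != j,
# and 0 on the diagonal, then transform back to the standard basis.
Xhat = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            Xhat[i, j] = Chat[i, j] / (lam[i] - lam[j])
X = V @ Xhat @ V.T

# The constructed X solves AX - XA = C.
assert np.allclose(A @ X - X @ A, C)
```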