Let us denote a set of $n$ orthonormal vectors in $\mathbb{R}^n$ as follows:
$$S = \{e_1, e_2, \ldots, e_n\}$$
where $e_i = (e_{1i}, e_{2i}, \ldots, e_{ni}) \in \mathbb{R}^n$ and $e_i, e_j$ are orthogonal if $i \neq j$.
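To make this concrete, here is one such set in $\mathbb{R}^2$ (a set of my own choosing, purely for illustration):
$$u_1 = \tfrac{1}{\sqrt{2}}(1, 1), \qquad u_2 = \tfrac{1}{\sqrt{2}}(1, -1),$$
since $u_1 \cdot u_2 = \tfrac{1}{2}(1 - 1) = 0$ while $u_1 \cdot u_1 = u_2 \cdot u_2 = \tfrac{1}{2}(1 + 1) = 1$.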
Consider the following linear combination:
$$0 = a_1e_1 + a_2 e_2 + \ldots + a_ne_n$$
where the $a_i \in \mathbb{R}$ and $0 = (0, 0, \ldots, 0)$. If we can show that all $a_i$ must be zero, then the vectors in $S$ are linearly independent. Let us use the orthogonality of the vectors in this set to do so. Consider the following inner product (I will denote the inner product by $\cdot$):
$$e_i \cdot (a_1e_1 + a_2 e_2 + \ldots + a_ne_n).$$
Because of linearity, we find that this is equal to
$$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n).$$
Because $e_i$ is orthogonal to every $e_j$ with $j \neq i$, we have $e_i \cdot e_j = 0$ whenever $j \neq i$. The remaining term $e_i \cdot e_i$ is non-zero: since the vectors in $S$ are orthonormal, we even have $e_i \cdot e_i = 1$ (try to verify this yourself). Hence we find that
$$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n) = 0 + \ldots + a_i + \ldots + 0 = a_i.$$
However, we have that $e_i \cdot (0,0, \ldots, 0) = 0$, so we find that $a_i = 0$ and this holds for every $i \in \{1, \ldots, n\}$. So the vectors in $S$ are linearly independent.
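To see the computation in a small case, take the orthonormal pair $u_1, u_2 \in \mathbb{R}^2$ from the example above: if $0 = a_1 u_1 + a_2 u_2$, then taking the inner product with $u_1$ gives
$$0 = u_1 \cdot (a_1 u_1 + a_2 u_2) = a_1 (u_1 \cdot u_1) + a_2 (u_1 \cdot u_2) = a_1 \cdot 1 + a_2 \cdot 0 = a_1,$$
and the inner product with $u_2$ gives $a_2 = 0$ in the same way.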
Since the dimension of $\mathbb{R}^n$ is $n$, and $n$ linearly independent vectors in an $n$-dimensional space always form a basis, $S$ forms a basis of this vector space.
$\textbf{Edit:}$ since this proof only uses that the $e_i$ have norm $1$ in order to show that $e_i \cdot e_i \neq 0$, the same proof shows that any set of non-zero orthogonal vectors is linearly independent!
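In that more general case, taking the inner product with $e_i$ as above yields $a_i \, (e_i \cdot e_i) = 0$ instead of $a_i = 0$ directly; since $e_i \neq 0$ implies $e_i \cdot e_i = \|e_i\|^2 > 0$, dividing by $e_i \cdot e_i$ gives
$$a_i = \frac{0}{e_i \cdot e_i} = 0.$$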
Suppose the vectors $v_1,\dots,v_k$ are linearly dependent.
Then there exist non-trivial (not all zero) coefficients $a_1,\dots,a_k$ so that
$$
a_1v_1+\dots+a_kv_k=0.
$$
Now express this in the basis $B$:
$$
0
=
[a_1v_1+\dots+a_kv_k]_B
=
a_1[v_1]_B+\dots+a_k[v_k]_B.
$$
Therefore the vectors $[v_1]_B,\dots,[v_k]_B$ are linearly dependent.
If you assume that the vectors $[v_1]_B,\dots,[v_k]_B$ are linearly dependent, you can follow the same steps backwards to show that $v_1,\dots,v_k$ are linearly dependent.
We have shown that one set of vectors is linearly dependent if and only if the other one is.
Therefore one set of vectors is linearly independent if and only if the other one is.
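As a sanity check (with a basis and vectors of my own choosing, purely for illustration): in $\mathbb{R}^2$ with $B = \{(1,0), (1,1)\}$, the dependent vectors $v_1 = (1,1)$ and $v_2 = (2,2)$ satisfy $2v_1 - v_2 = 0$, and their coordinate vectors $[v_1]_B = (0,1)$ and $[v_2]_B = (0,2)$ satisfy the very same relation:
$$2[v_1]_B - [v_2]_B = (0,2) - (0,2) = (0,0).$$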
Best Answer
Assume they don't span $\mathbb{R}^n$. Then you can keep adding vectors $u_1, \ldots, u_k$ so that at each step, $v_1, \ldots, v_n, u_1, \ldots, u_k$ is free (i.e. linearly independent). Since $\mathbb{R}^n$ is finite-dimensional, this process stops eventually, so that $v_1, \ldots, v_n, u_1, \ldots, u_k$ is free and spans $\mathbb{R}^n$; hence it is a basis.
But a well-known theorem asserts that any two bases of a (finite-dimensional, or with the axiom of choice, any) vector space have the same cardinality, so that $n + k = n$, i.e. $k = 0$: the vectors $v_1, \ldots, v_n$ already span $\mathbb{R}^n$, contradicting our assumption.
For a more detailed proof you can look up the incomplete basis theorem, which asserts that in a finite-dimensional vector space, any free family can be extended to a basis (actually this is true of any vector space if you add the axiom of choice).
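To illustrate the extension process (with vectors chosen purely for the example): starting from the single free vector $v_1 = (1,1,0)$ in $\mathbb{R}^3$, one can add $u_1 = (0,1,0)$ and then $u_2 = (0,0,1)$, each step preserving freeness; after two steps,
$$\{(1,1,0),\ (0,1,0),\ (0,0,1)\}$$
is free and spans $\mathbb{R}^3$, so it is a basis and the process stops.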