Suppose, for contradiction, that $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) \neq \mathbf{V}$. Pick an orthonormal basis $e_j$, $1\leq j\leq n$, for $\text{span}(\mathbf{U}, \mathbf{U}^{\perp})$, and extend it to an orthonormal basis $e_j$, $1\leq j\leq m$, of $\mathbf{V}$. Since $\text{span}(\mathbf{U}, \mathbf{U}^{\perp})$ is a proper subspace of $\mathbf{V}$, we have $n<m$. But then $u=e_{n+1}$ is orthogonal to $\text{span}(\mathbf{U}, \mathbf{U}^{\perp})$ by construction; in particular $u\perp \mathbf{U}$, so $u\in \mathbf{U}^\perp$, and therefore $u\perp u$, i.e. $\|u\|=0$. This is a contradiction, since $\|u\|=1$.
To extend an orthonormal pair $u_1, u_2$ in $\mathbb{R}^3$: find a nonzero solution $x$ of
$$
\begin{cases}
u_1^Tx=0 \\
u_2^Tx=0
\end{cases}
$$
Then normalize it, and call the resulting vector $u_3$. The set $\{u_1,u_2,u_3\}$ is then an orthogonal set of three nonzero vectors of norm $1$, hence an orthonormal basis of $\mathbb{R}^3$.
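As a quick numerical sanity check, here is a minimal NumPy sketch of this step (the pair $u_1, u_2$ below is made-up sample data; any orthogonal pair in $\mathbb{R}^3$ works the same way):

```python
import numpy as np

# Sample orthogonal pair in R^3 (normalize them first if you want u1, u2 of norm 1).
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])

# The rows of this 2x3 matrix are u1^T and u2^T, so its null space
# is exactly the solution set of the system above (here 1-dimensional).
M = np.vstack([u1, u2])

# With the default full SVD, the last row of Vt spans the null space of M.
_, _, Vt = np.linalg.svd(M)
x = Vt[-1]                    # nonzero solution of u1^T x = 0, u2^T x = 0

u3 = x / np.linalg.norm(x)    # normalize
print(u3)                     # [0. 0. 1.] up to sign
```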
More generally, if $\{u_1,\dots,u_k\}$ is an orthogonal basis for a subspace of $\mathbb{R}^n$, we can complete it to an orthogonal basis of $\mathbb{R}^n$ as follows.
Consider the matrix $A=\begin{bmatrix}u_1 & \dots & u_k\end{bmatrix}$ having the given vectors as its columns. Find a basis of the null space of $A^T$ or, in other words, the solution space of
$$
\begin{cases}
u_1^T x = 0\\
\cdots\\
u_k^T x = 0
\end{cases}
$$
and call it $\{v_{k+1},\dots,v_n\}$. The null space indeed has dimension $n-k$, because $A^T$ has $n$ columns and rank $k$ (the same rank as $A$). Then orthogonalize the set $\{v_{k+1},\dots,v_n\}$ (with the Gram-Schmidt algorithm, or whatever method you prefer) to get vectors $u_{k+1},\dots,u_n$. These vectors are pairwise orthogonal, and since they stay inside the null space of $A^T$, they remain orthogonal to the given vectors $u_1,\dots,u_k$. Thus
$$
\{u_1,\dots,u_k,u_{k+1},\dots,u_n\}
$$
is an orthogonal set of $n$ nonzero vectors, hence a basis of $\mathbb{R}^n$. Normalizing it, if an orthonormal basis is required, is then routine.
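Here is a hedged sketch of the whole procedure in NumPy/SciPy (the function name `complete_orthogonal_basis` and the sample vectors are my own; `scipy.linalg.null_space` handles the null-space step, and the loop is the classical Gram-Schmidt described above):

```python
import numpy as np
from scipy.linalg import null_space

def complete_orthogonal_basis(U):
    """Complete the orthogonal columns of U (n x k) to an orthogonal basis of R^n."""
    # Basis of the null space of A^T, i.e. of the system u_i^T x = 0.
    V = null_space(U.T)               # n x (n - k), columns v_{k+1}, ..., v_n
    # Classical Gram-Schmidt on that basis. (SciPy already returns orthonormal
    # columns, so the loop is redundant here, but it mirrors the recipe above
    # and works for any basis of the null space.)
    W = []
    for v in V.T:
        for w in W:
            v = v - (w @ v) / (w @ w) * w
        W.append(v)
    # The new vectors lie in the null space of U^T, hence are orthogonal
    # to u_1, ..., u_k as well as to each other.
    return np.column_stack([U] + W)

# Sample data: two orthogonal vectors in R^4.
U = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [0.0,  0.0],
              [0.0,  0.0]])
B = complete_orthogonal_basis(U)
print(np.round(B.T @ B, 10))   # diagonal matrix => columns are pairwise orthogonal
```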
In the case of $\mathbb{R}^3$ there is a shortcut: take $u=u_1\times u_2$ (the cross product), which is nonzero and orthogonal to both $u_1$ and $u_2$, so normalizing it is all that is left to do. However, this exploits a very special property of $\mathbb{R}^3$: a binary cross product with these properties exists only in dimensions $3$ and $7$, so there is no such shortcut in $\mathbb{R}^n$ in general.
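In NumPy this shortcut is essentially a one-liner (same kind of sample pair as before):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])    # sample orthogonal pair in R^3
u2 = np.array([1.0, -1.0, 0.0])

u = np.cross(u1, u2)              # orthogonal to both u1 and u2, nonzero
u3 = u / np.linalg.norm(u)        # normalize
print(u3)                         # [ 0.  0. -1.]
```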
Best Answer
For the theorem:
Hint: let $v_1, v_2, \ldots, v_k$ be the vectors in $S$, and suppose there are scalars $c_1, \ldots, c_k$ such that $c_1 v_1 + \cdots + c_k v_k = 0$. Then take the inner product of both sides with any vector $v_j$, $1 \leq j \leq k$, and conclude something about the coefficient $c_j$ using the fact that $v_j \neq 0$ for all vectors $v_j$ in the set.
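In case you want to check your work afterwards, here is the key computation spelled out (it just carries out the hint, so skip it if you'd rather do it yourself): for each fixed $j$,
$$
0 = \langle v_j, c_1 v_1 + \cdots + c_k v_k \rangle = \sum_{i=1}^{k} c_i \langle v_j, v_i \rangle = c_j \|v_j\|^2,
$$
because $\langle v_j, v_i \rangle = 0$ for $i \neq j$. Since $v_j \neq 0$, we have $\|v_j\|^2 \neq 0$, so $c_j = 0$.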
For your next question: an orthogonal set is linearly independent provided that all the vectors in the set are nonzero (we need this in the above proof!). I'll address that in your true/false questions.
You're right that linear independence need not imply orthogonality. To see this, try to come up with two vectors in $\mathbb{R}^{2}$ that are linearly independent but have nonzero dot product. (It shouldn't be too hard to do so!)
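If you want to verify a candidate pair numerically, here is a throwaway NumPy check; the two vectors below are just one possible answer, so look away if you'd rather find your own:

```python
import numpy as np

# One possible pair: linearly independent, yet not orthogonal.
a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])

print(np.dot(a, b))                             # 1.0 != 0  -> not orthogonal
print(np.linalg.det(np.column_stack([a, b])))   # 1.0 != 0  -> linearly independent
```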
For your true/false question: not every orthogonal set is linearly independent, since an orthogonal set can certainly include the '$0$' vector, and any set containing the '$0$' vector is necessarily linearly dependent.
However, every orthonormal set is linearly independent by the above theorem, as every orthonormal set is an orthogonal set consisting of nonzero vectors.