You want to find the last row of an orthogonal matrix given $n-1$ rows, right? Since the sum of squares in each column is $1$, you can find the absolute values of the entries in about $n^2$ operations. So, it remains to determine the sign pattern for the non-zero entries. That can also be done quickly: add two columns with non-zero last entries. The sum should have squared norm $2$, and this is enough to determine whether the signs of the last entries were the same or different, because $(a+b)^2\ne(a-b)^2$ when $ab\ne 0$. Of course, this works only if the $n-1$ orthonormal vectors are known with reasonable precision, but it requires no random choice.
Edit: All right. Here goes the formal algorithm as I would try it in a real implementation:
Let $v_i=(v_{ij})$ be the given vectors ($i=1,\dots,n-1$, $j=1,\dots,n$).
Put $v_{nj}=\sqrt{1-\sum_{i=1}^{n-1}v_{ij}^2}$ (return $0$ if the expression under the root is slightly negative and write an error message if it is noticeably negative).
Choose $J$ such that $v_{nJ}=\max_j v_{nj}$.
For $j\ne J$, compute $S=\sum_{i=1}^{n-1} v_{ij}v_{iJ}$. If $S\le 0$, leave $v_{nj}$ as is; otherwise flip its sign. After that is done, for control, compute $T=v_{nj}v_{nJ}$ and check that $S+T=0$ with decent precision. If not, write an error message.
Take two or three scalar products of the vector $v_n$ thus obtained with the previous vectors (say, $v_1$ and $v_{n/2}$) and check the norm of $v_n$. If those tests pass with decent precision, consider the task done.
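In code, the whole procedure above might look like this (a sketch in NumPy; the function name, tolerances, and error handling are my own choices, not the author's):

```python
import numpy as np

def last_row(V, tol=1e-8):
    """Recover the last row of an orthogonal matrix from its first n-1 rows.

    V is an (n-1) x n array with orthonormal rows.  The result is determined
    only up to a global sign: both v_n and -v_n complete the matrix.
    """
    n = V.shape[1]
    # Step 1: |v_nj| from the fact that every column of the full matrix
    # has norm 1.
    s = 1.0 - np.sum(V**2, axis=0)
    if np.any(s < -tol):
        raise ValueError("rows are not close to orthonormal")
    w = np.sqrt(np.clip(s, 0.0, None))
    # Step 2: pivot on the column J with the largest last entry.
    J = int(np.argmax(w))
    for j in range(n):
        if j == J:
            continue
        S = V[:, j] @ V[:, J]
        if S > 0:          # full columns are orthogonal, so v_nj * v_nJ = -S;
            w[j] = -w[j]   # since v_nJ >= 0, the sign of v_nj is -sign(S)
        # Control check, as in the algorithm above.
        if abs(S + w[j] * w[J]) > tol:
            raise ValueError("orthogonality check failed at column %d" % j)
    return w
```

For a final sanity check one would also verify a couple of scalar products of the result with the given rows, as the answer suggests.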
I know, I'm sort of paranoid about possible errors in my programming, but you'd better detect the situation when you are trying this on a system of vectors that is not quite orthonormal :).
The most natural way to view your set of vectors is in the setting of projective geometry; the vector space $K^n$ (where $K$ is now an arbitrary field, so $K = \mathbb{C}$ for you) can be seen as a projective space $\mathbb{P}^{n-1}(K)$ of dimension $n-1$. (The "points" of the projective space $\mathbb{P}^{n-1}(K)$ correspond to the vector lines $\{ \lambda v \mid \lambda \in K \}$ of $K^n$).
In your example, the set of isotropic vectors for the bilinear form $\langle u, v \rangle = u^T v$ corresponds to a quadric in $\mathbb{P}^{n-1}(K)$. See, for instance, http://en.wikipedia.org/wiki/Quadric_(projective_geometry). Note that over an algebraically closed field such as $\mathbb{C}$, there is essentially only one non-degenerate symmetric bilinear form, so also only one such quadric.
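For a concrete instance (my own illustrative example, not from the answer): in $\mathbb{C}^2$, the nonzero vector $u=(1,i)$ is isotropic, since $u^Tu = 1 + i^2 = 0$. Note that this uses the bilinear form $u^Tv$, not the Hermitian inner product:

```python
# The vector u = (1, i) is isotropic for the bilinear form u^T v over C,
# even though u is nonzero.  Contrast this with the Hermitian inner
# product u* u, which is positive for any nonzero u.
import numpy as np

u = np.array([1.0, 1.0j])
bilinear = u @ u              # 1^2 + i^2 = 0
hermitian = np.vdot(u, u)     # |1|^2 + |i|^2 = 2
print(bilinear, hermitian)
```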
Best Answer
The maximum is $cn^{4/3}$ for some constant $c$.
We may as well assume all our points are on the unit sphere $S$. Let $P$ be some plane not containing the origin, which we might think of as being far away. For each point $x\in S$ in our collection, let $p_x$ be the intersection of the line through $0$ and $x$ with $P$, and let $\ell_x$ be the intersection of the hyperplane orthogonal to $x$ with $P$. Unless $P$ was chosen by your enemies, all of these are well defined. Note that $p_x\in\ell_y$ iff $x$ and $y$ are orthogonal, so orthogonality in our original collection becomes incidence in our new collection, and by the Szemerédi–Trotter theorem there are at most $O(n^{4/3})$ incidences.
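The duality $p_x\in\ell_y \iff x\perp y$ is easy to check numerically; here is a sketch in $\mathbb{R}^3$, with $P$ taken to be the plane $z=1$ (my choice of $P$ and of the sample vectors, purely for illustration):

```python
# Check the point/line duality used above: with P the plane z = 1, the
# projected point p_x = x / x_3 lies on the line ell_y = {p in P : y.p = 0}
# exactly when x and y are orthogonal, since y . (x / x_3) = (y . x) / x_3.
import numpy as np

def p(x):
    return x / x[2]              # central projection of x onto the plane z = 1

def on_line(px, y, tol=1e-12):
    return abs(y @ px) < tol     # p lies on ell_y iff y . p = 0

x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 0.0, -1.0])   # y . x = 0: orthogonal to x
z = np.array([1.0, 1.0, 1.0])    # z . x = 6: not orthogonal to x

print(on_line(p(x), y))  # True:  p_x lies on ell_y
print(on_line(p(x), z))  # False: p_x misses ell_z
```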
To see that there is a construction with this many orthogonal pairs, take some example which shows that Szemerédi–Trotter is tight, and such that there are about as many points as lines, and read the above paragraph backwards. If $P$ is far away then this example will consist of two collections of about $n/2$ points carefully clustered around some two orthogonal points of $S$.
I read this construction in the following paper of Erdős, Hickerson, and Pach, which also contains references to many other things. See Theorem 2(ii) for this problem.
http://www.renyi.hu/~p_erdos/1989-02.pdf