Maybe this will help:
How about extending that vector to a basis for the whole space, and then applying Gram-Schmidt to get an orthonormal basis?
One way to do this is by using the fundamental theorem of linear algebra: http://en.wikipedia.org/wiki/Fundamental_theorem_of_linear_algebra . Use the 4d-vector as the first row of a $4 \times 4$ matrix $M$, then find the nullspace of that matrix, since $\ker M$ is the orthogonal complement of the row space. This extends the 4d-vector into a basis for $\mathbb R^4$.
Applying the Gram-Schmidt algorithm then gives you three vectors orthogonal to the original vector.
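As a minimal sketch of the two steps above (using NumPy, an assumption; the vector $v$ is just an example): the nullspace of the row matrix $[v]$ can be read off from its SVD, and stacking $v$ on top of it extends $v$ to a basis of $\mathbb R^4$, which Gram-Schmidt then orthonormalises.

```python
import numpy as np

# Example 4d-vector to extend to an orthonormal basis of R^4.
v = np.array([1.0, 2.0, 3.0, 4.0])

# The nullspace of the 1x4 row matrix M = [v] is the orthogonal
# complement of span(v); the last rows of V^T in the SVD span it.
M = v.reshape(1, -1)
_, _, Vt = np.linalg.svd(M)   # full_matrices=True by default, Vt is 4x4
null_basis = Vt[1:]           # three orthonormal rows spanning ker(M)

# Stack v with the nullspace basis: a basis of R^4 containing v.
basis = np.vstack([v, null_basis])

# Gram-Schmidt on the rows (here it mostly just normalises v, since the
# SVD rows are already orthonormal and orthogonal to v).
Q = basis.copy()
for i in range(len(Q)):
    for j in range(i):
        Q[i] -= (Q[j] @ Q[i]) * Q[j]
    Q[i] /= np.linalg.norm(Q[i])
```

The rows `Q[1:]` are then three orthonormal vectors orthogonal to the original $v$.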
I like your idea about finding a vector that can't be written as a sum of the two vectors above. Let's take a look at what that would look like.
Every possible sum of these two vectors can be expressed as $c_{1}(1,2,8) + c_{2}(0,1,9)$ for some $c_{1}, c_{2}$ in $\Bbb R$. So, all possible sums can be expressed in the form $(c_{1}, 2c_{1} + c_{2}, 8c_{1} + 9c_{2})$.
We want to come up with a third vector $(v_{1}, v_{2}, v_{3})$ that can't be expressed in the above form. For any choice of $c_{1}$ and $c_{2}$ matching the first two components, the third component would have to be exactly $8c_{1} + 9c_{2}$. So let's pick a vector whose third component is different from this (i.e., pick $c_{1}$ and $c_{2}$, fill in the first two components of $(c_{1}, 2c_{1} + c_{2}, 8c_{1} + 9c_{2})$, but make the third component different).
So, even though you can pick $c_{1}$ and $c_{2}$ to be anything, I will pick $c_{1} = c_{2} = 1$. Then the vector I will construct will be:
$(c_{1}, 2c_{1} + c_{2}, 11) = (1, 2 + 1, 11) = (1, 3, 11)$
Notice that I made the third component different from $8c_{1} + 9c_{2} = 8 + 9 = 17$. Then this new vector can't be written as a linear combination of the previous two vectors, because that's how we constructed it.
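We can sanity-check this construction numerically (a sketch using NumPy, an assumption): if $(1, 3, 11)$ really lies outside the span of the two given vectors, then adding it as a row must raise the matrix rank from 2 to 3.

```python
import numpy as np

# The two given vectors and the constructed third vector.
a = np.array([1.0, 2.0, 8.0])
b = np.array([0.0, 1.0, 9.0])
w = np.array([1.0, 3.0, 11.0])

# a and b are independent, so their span has rank 2; appending w
# raises the rank to 3 exactly when w is not a combination of them.
rank_ab = np.linalg.matrix_rank(np.vstack([a, b]))
rank_abw = np.linalg.matrix_rank(np.vstack([a, b, w]))
print(rank_ab, rank_abw)  # 2 3
```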
Best Answer
Yes, this is always possible. Choose an arbitrary vector $\vec{w}$ that lies in $V \setminus \text{span}(\vec{a}_1,\dots,\vec{a}_m)$. The orthogonal projection of $\vec{w}$ on $\vec{a}_1$ is \begin{equation} \vec{w}_1=\frac{\vec{a}_1 \cdot \vec{w}}{\vec{a}_1 \cdot \vec{a}_1} \vec{a}_1 = \frac{\vec{a}_1 \cdot \vec{w}}{\left| \vec{a}_1 \right|^2} \vec{a}_1 \end{equation} You can check that the vector $\vec{w}-\vec{w}_1$ is orthogonal to $\vec{a}_1$.
In the same manner, project $\vec{w}-\vec{w}_1$ orthogonally on $\vec{a}_2$: \begin{equation} \vec{w}_2=\frac{\vec{a}_2 \cdot (\vec{w}-\vec{w_1})}{\left| \vec{a}_2 \right|^2} \vec{a}_2 \end{equation}
You can again check that the vector $(\vec{w}-\vec{w}_1)-\vec{w}_2$ is orthogonal to $\vec{a}_2$. However, there is no guarantee that $(\vec{w}-\vec{w}_1)-\vec{w}_2$ is orthogonal to $\vec{a}_1$, unless the given vectors $\vec{a}_1,\dots,\vec{a}_m$ are already orthogonalised (which can be done by the Gram-Schmidt process).
Repeat this strategy: calculate $\vec{w}_3,\vec{w}_4,\dots,\vec{w}_m$ in the same way. The vector you were looking for is $\vec{w}-\vec{w}_1-\vec{w}_2-\dots-\vec{w}_m$.
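The strategy can be sketched as follows (NumPy and the function name are illustrative assumptions). As noted above, subtracting the projections one by one only removes every component when the $\vec{a}_i$ are orthogonal, so this sketch orthonormalises them first via a QR factorisation:

```python
import numpy as np

def orthogonal_to_all(a_vectors, w):
    """Return the component of w orthogonal to span(a_1, ..., a_m)."""
    A = np.column_stack(a_vectors)
    Q, _ = np.linalg.qr(A)        # columns of Q: orthonormalised a_i
    r = w.astype(float).copy()
    for q in Q.T:                 # subtract each projection w_i in turn
        r -= (q @ r) * q
    return r

# Example: w lies outside span(a1, a2), so the result is nonzero.
a1 = np.array([1.0, 0.0, 0.0, 1.0])
a2 = np.array([0.0, 1.0, 1.0, 0.0])
w  = np.array([1.0, 2.0, 3.0, 4.0])
r = orthogonal_to_all([a1, a2], w)
```

The returned `r` is orthogonal to both $\vec{a}_1$ and $\vec{a}_2$, which is exactly the property the iterated projections are meant to achieve.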
I hope this answer is helpful for you.