There doesn't seem to be any real reason to avoid the Gram-Schmidt algorithm in this situation: the first two steps are immediate, and the last one is not very complicated.
Start with $u_1=(1,0,0)$ since it is obviously unit length with respect to this form.
Next, $(0,1,0)$ turns out to be already perpendicular to $u_1$, but it needs to be rescaled because its length is $\sqrt{2}$. Thus $u_2=(0,\frac{1}{\sqrt{2}},0)$.
Finally, the only real calculation comes when you process $(0,0,1)$. Subtracting the contributions of $u_1$ and $u_2$ leaves $(-1,-\frac{1}{2},1)$, whose squared norm is $-\frac{1}{2}$, so it is appropriate to take $u_3=(-\sqrt{2},-\frac{1}{\sqrt{2}},\sqrt{2})$, giving $u_3\cdot u_3=-1$.
That gives an "orthonormal" basis: the vectors are pairwise orthogonal and $u\cdot u\in\{1,-1\}$ for each basis vector $u$, which is to be expected since the signature of this metric is $(+,+,-)$.
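As a sanity check, the steps above can be verified numerically. The Gram matrix below is an assumption, reconstructed term by term from the projections described (it is the symmetric matrix consistent with each step and has signature $(+,+,-)$):

```python
import numpy as np

# Gram matrix of the bilinear form -- an assumption reconstructed
# from the projection steps described above.
G = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 1.0]])

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0 / np.sqrt(2), 0.0])
u3 = np.array([-np.sqrt(2), -1.0 / np.sqrt(2), np.sqrt(2)])

def B(x, y):
    """The bilinear form x . y with respect to G."""
    return x @ G @ y

# Pairwise orthogonal, with "squared norms" 1, 1, -1.
print(B(u1, u1), B(u2, u2), B(u3, u3))   # 1.0 1.0 -1.0
print(B(u1, u2), B(u1, u3), B(u2, u3))   # 0.0 0.0 0.0
```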
The norms of the vectors can't be zero, since both are non-zero vectors. Assuming you're working with the usual Euclidean (Hermitian) inner product, we have:
$$||(i,1)||^2=\langle (i,1)\,,\,(i,1)\rangle=i\cdot\overline i+1\cdot\overline 1=i(-i)+1\cdot 1=2\\||(-i,1)||^2=\langle (-i,1)\,,\,(-i,1)\rangle=(-i)\cdot\overline{(-i)}+1\cdot\overline 1=(-i)(i)+1\cdot 1=2$$
so both vectors have norm $\;\sqrt2\;$ ...
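A quick numerical check; note that NumPy's `vdot` conjugates its *first* argument, which agrees with the computation above up to the order of conjugation (the norm is unaffected):

```python
import numpy as np

v = np.array([1j, 1])
w = np.array([-1j, 1])

# np.vdot(x, y) = sum(conj(x) * y); for x = y this is ||x||^2, a real number.
print(np.vdot(v, v).real)    # 2.0
print(np.vdot(w, w).real)    # 2.0
print(np.linalg.norm(v))     # 1.4142... = sqrt(2)
```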
Best Answer
Hint: take a normalized eigenvector $u_1$, i.e. $Au_1=\lambda u_1$, and complete it to an ON-basis by $u_2$, $u_3$. Then $U=[u_1\ u_2\ u_3]$ is one choice of the matrix. It is easy to see that $$ A[u_1\ u_2\ u_3]=[u_1\ u_2\ u_3]\begin{bmatrix}\lambda &*&*\\0&*&*\\0&*&*\end{bmatrix}. $$
P.S. Without solving any characteristic equations, $\lambda=5$ seems to be the most obvious choice of an eigenvalue of $A$ with a very easy $u_1$.
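Since $A$ itself isn't reproduced here, the sketch below uses a stand-in matrix (an assumption) whose rows each sum to $5$, so that $\lambda=5$ with $u_1\propto(1,1,1)$; it completes $u_1$ to an orthonormal basis via a QR factorization and checks that the first column of $U^{\mathsf T}AU$ is $(\lambda,0,0)^{\mathsf T}$:

```python
import numpy as np

# Stand-in matrix (assumption): each row sums to 5, so A @ (1,1,1) = 5*(1,1,1).
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])

lam = 5.0
u1 = np.ones(3) / np.sqrt(3)    # normalized eigenvector for lambda = 5

# Complete u1 to an ON-basis: take QR of a full-rank matrix whose first
# column is u1; the first column of Q is then +/- u1, and the remaining
# columns supply u2, u3.
M = np.column_stack([u1, [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
U, _ = np.linalg.qr(M)

T = U.T @ A @ U                  # first column is (lambda, 0, 0)^T
print(np.round(T, 10))
```

The sign ambiguity in QR (its first column may be $-u_1$) is harmless here, since $-u_1$ is also a normalized eigenvector for the same $\lambda$.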