Let $V$ be a finite-dimensional inner product space, let $w_1,\dots,w_n$ be a basis of a subspace $W$ of $V$, and construct an orthogonal basis $v_1,\dots,v_n$ of $W$ by $v_1 = w_1$, $v_2 = w_2 - \operatorname{proj}_{v_1} w_2$, $v_3 = w_3 - \operatorname{proj}_{v_1} w_3 - \operatorname{proj}_{v_2} w_3$, and so on. I know that $\operatorname{proj}_{v_j} w_i$ is the vector $w_i$ projected onto $v_j$. How can I intuitively understand the Gram-Schmidt process? Why, as $i$ gets bigger, do we keep subtracting more terms? I looked up diagrams of Gram-Schmidt, but I still fail to make sense of it.
Intuition for Gram-Schmidt process
gram-schmidt, linear-algebra
Related Solutions
Your original vector $w_3$ is a linear combination of the previous two; in fact, $w_3 = w_1 + w_2$. Whenever that happens, the Gram-Schmidt process will spit out the zero vector. (Because $v_3$ will be forced to be in the span of $w_1$ and $w_2$, but also orthogonal to $w_1$ and $w_2$, the only possibility for $v_3$ is $0$.)
Go back and produce a basis for your subspace, then apply the Gram-Schmidt process and you'll have an orthogonal basis as desired.
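As a quick sanity check (a minimal numpy sketch I'm adding for illustration, not part of the original answer), running Gram-Schmidt on three vectors where $w_3 = w_1 + w_2$ does indeed produce the zero vector at the third step:

```python
import numpy as np

# Three vectors, the third linearly dependent on the first two
w1 = np.array([1.0, 0.0, 0.0])
w2 = np.array([0.0, 1.0, 0.0])
w3 = w1 + w2

# Gram-Schmidt with unnormalised projections proj_v(w) = (<w,v>/<v,v>) v
v1 = w1
v2 = w2 - (np.dot(w2, v1) / np.dot(v1, v1)) * v1
v3 = w3 - (np.dot(w3, v1) / np.dot(v1, v1)) * v1 \
        - (np.dot(w3, v2) / np.dot(v2, v2)) * v2
# v3 comes out as the zero vector, as the answer predicts
```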
Your $u_1$ is correct, but your $u_2$ is incorrect; as noted in the comments, $u_1\cdot u_2 \neq 0$.
Recall that the standard inner product on $\mathbb{C}^n$ is given by $\langle u, v\rangle = u\cdot\bar{v}$. With this in mind, let's calculate $u_2$. First we have
$$\langle v_2, u_1\rangle = (-1, i, 1)\cdot\overline{\frac{1}{\sqrt{2}}(1, 0, i)} = \frac{1}{\sqrt{2}}(-1, i, 1)\cdot(1, 0, -i) = \frac{-1-i}{\sqrt{2}},$$
so
$$w_2 = v_2 - \langle v_2, u_1\rangle u_1 = \left[\begin{array}{c} -1\\ i\\ 1\end{array}\right] + \frac{1+i}{\sqrt{2}}\frac{1}{\sqrt{2}}\left[\begin{array}{c} 1\\ 0\\ i\end{array}\right] = \left[\begin{array}{c} \frac{-1+i}{2}\\ i\\ \frac{1+i}{2}\end{array}\right].$$
As
\begin{align*} \|w_2\| &= \sqrt{\left|\frac{-1+i}{2}\right|^2 + |i|^2 + \left|\frac{1+i}{2}\right|^2}\\ &= \sqrt{\left(\frac{-1}{2}\right)^2 + \left(\frac{1}{2}\right)^2 + 0^2 + 1^2 + \left(\frac{1}{2}\right)^2 + \left(\frac{1}{2}\right)^2}\\ &= \sqrt{2} \end{align*}
we have
$$u_2 = \frac{1}{\|w_2\|}w_2 = \frac{1}{\sqrt{2}}\left[\begin{array}{c} \frac{-1+i}{2}\\ i\\ \frac{1+i}{2}\end{array}\right].$$
Let's check to see if $u_2$ is orthogonal to $u_1$:
\begin{align*} \langle u_1, u_2\rangle &= \frac{1}{\sqrt{2}}(1, 0, i)\cdot\overline{\frac{1}{\sqrt{2}}\left(\frac{-1+i}{2}, i, \frac{1+i}{2}\right)}\\ &= \frac{1}{2}(1, 0, i)\cdot\left(\frac{-1-i}{2}, -i, \frac{1-i}{2}\right)\\ &= \frac{1}{2}\left(\frac{-1-i}{2} + 0 + \frac{1+i}{2}\right)\\ &= 0. \end{align*}
Note, we didn't have to normalise before we checked orthogonality; i.e. we could have checked $\langle u_1, w_2\rangle = 0$ instead.
I won't carry out the calculation of $u_3$ here; it is similar, just with more computation. Note, however, that you have a typo in your formula for $w_3$; it should be
$$w_3 = v_3 - \langle v_3, u_1\rangle u_1 - \langle v_3, u_2\rangle u_2.$$
Now that you have the correct $u_2$ and the correct formula for $w_3$, the computation for $u_3$ should work out and produce $u_3 = \frac{1}{2}(i, -1-i, 1)$.
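To double-check the arithmetic, here is a small numpy sketch of Gram-Schmidt under the convention $\langle u, v\rangle = u\cdot\bar{v}$ used above (the helper name `gram_schmidt` is mine, not from the answer):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise `vectors` using the inner product <u, v> = sum(u * conj(v))."""
    ortho = []
    for v in vectors:
        w = v.astype(complex)
        for u in ortho:
            w = w - np.sum(w * np.conj(u)) * u  # subtract <w, u> u
        ortho.append(w / np.linalg.norm(w))
    return ortho

v1 = np.array([1, 0, 1j])    # normalises to u_1 above
v2 = np.array([-1, 1j, 1])
u1, u2 = gram_schmidt([v1, v2])
# u2 == (1/sqrt(2)) * [(-1+i)/2, i, (1+i)/2], matching the hand computation
```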
Best Answer
When you project a vector onto a subspace (of any dimension) and subtract the result, you get a vector orthogonal to that subspace. Moreover, projection onto an $n$-dimensional subspace can be accomplished by adding the projections onto the $n$ individual members of an orthogonal basis of that subspace.
Perhaps try an example. Project $(x,y,z)$ onto the $2$-dimensional subspace spanned by the unit vectors in the directions of the $x$ and $y$ axes. These two are often denoted $\bf{\vec i}$ and $\bf{\vec j}$. The projection in this case is of course $(x,y,0)$. And that's indeed $(x,y,z)-(x,0,0)-(0,y,0)=(x,y,z)-\operatorname{proj}_{\bf{\vec i}}(x,y,z)-\operatorname{proj}_{\bf{\vec j}}(x,y,z)$. After subtracting, and normalizing, you of course get $\bf{\vec k}=(0,0,1)$, to complete the standard basis.
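The project-and-subtract step above can be sketched numerically (the sample vector $(3, 4, 5)$ is my choice, not part of the answer):

```python
import numpy as np

v = np.array([3.0, 4.0, 5.0])       # a sample (x, y, z)
i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])

# Projection onto the xy-plane = sum of projections onto i and j
proj = np.dot(v, i_hat) * i_hat + np.dot(v, j_hat) * j_hat  # (3, 4, 0)
residual = v - proj                                          # (0, 0, 5)
k_hat = residual / np.linalg.norm(residual)                  # (0, 0, 1)
```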