The choice of inner product defines the notion of orthogonality.
The usual notion of being "perpendicular" depends on the notion of "angle", which in turn depends on the dot product.
If you replace the dot product with a more general inner product, then you change what "angle" means, and so you get a new notion of being "perpendicular", which in this general setting is called orthogonality.
So when you apply the Gram-Schmidt procedure to these vectors you will NOT necessarily get vectors that are perpendicular in the usual sense (their dot product might not be $0$).
Let's apply the procedure.
It says that to get an orthogonal basis we start with one of the given vectors, say $u_1 = (-1,1,0)$, as the first element of our new basis.
Then we do the following calculation to get the second vector in our new basis:
$u_2 = v_2 - \frac{\langle v_2, u_1\rangle}{\langle u_1, u_1\rangle} u_1$
where $v_2 = (-1,1,2)$.
Now $\langle v_2, u_1\rangle = 3$ and $\langle u_1, u_1\rangle = 3$, so we get:
$u_2 = v_2 - u_1 = (0,0,2)$.
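As a sanity check, this step can be computed numerically. The inner product here is assumed to be the weighted one $\langle x,y\rangle = 2x_1y_1 + x_2y_2 + 3x_3y_3$, which matches the values $\langle v_2,u_1\rangle = \langle u_1,u_1\rangle = 3$ above; a quick sketch:

```python
# Gram-Schmidt step under the weighted inner product
# <x, y> = 2*x1*y1 + x2*y2 + 3*x3*y3  (assumed from the computations above).

def inner(x, y):
    """Weighted inner product on R^3."""
    return 2 * x[0] * y[0] + x[1] * y[1] + 3 * x[2] * y[2]

u1 = (-1, 1, 0)
v2 = (-1, 1, 2)

# u2 = v2 - (<v2, u1> / <u1, u1>) * u1
coeff = inner(v2, u1) / inner(u1, u1)        # 3 / 3 = 1.0
u2 = tuple(v - coeff * u for v, u in zip(v2, u1))

print(u2)             # (0.0, 0.0, 2.0)
print(inner(u1, u2))  # 0 -- orthogonal with respect to this inner product
```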
So your basis is correct. Let's check that these vectors are indeed orthogonal. Remember, this is with respect to our new inner product. We find that:
$\langle u_1, u_2\rangle = 2(-1)(0) + (1)(0) + 3(0)(2) = 0$
(here we also happened to get a basis that is perpendicular in the traditional sense; this was just luck).
Now, is the basis orthonormal? (In other words, are these unit vectors?) No, they aren't, so to get an orthonormal basis we must divide each vector by its length. Note that this is not length in the usual sense of the word, because, yet again, length depends on the inner product you use. The usual Pythagorean way of finding the length of a vector is:
$||x||=\sqrt{x_1^2 + \cdots + x_n^2} = \sqrt{x \cdot x}$
It is just the square root of the dot product of the vector with itself. So with a more general inner product we can define a "length" via:
$||x|| = \sqrt{\langle x,x\rangle}$.
With this length we see that:
$||u_1|| = \sqrt{2(-1)(-1) + (1)(1) + 3(0)(0)} = \sqrt{3}$
$||u_2|| = \sqrt{2(0)(0) + (0)(0) + 3(2)(2)} = 2\sqrt{3}$
(notice how these differ from what you would get using the Pythagorean formula).
Thus an orthonormal basis is given by:
$\{\frac{u_1}{||u_1||}, \frac{u_2}{||u_2||}\} = \{(\frac{-1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, 0), (0,0,\frac{1}{\sqrt{3}})\}$
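The normalization can be checked the same way; again, the weighted inner product $\langle x,y\rangle = 2x_1y_1 + x_2y_2 + 3x_3y_3$ is an assumption read off from the norm computations above:

```python
import math

def inner(x, y):
    # Weighted inner product assumed from the norm computations above.
    return 2 * x[0] * y[0] + x[1] * y[1] + 3 * x[2] * y[2]

def norm(x):
    """Length induced by the inner product: ||x|| = sqrt(<x, x>)."""
    return math.sqrt(inner(x, x))

u1 = (-1, 1, 0)
u2 = (0, 0, 2)

e1 = tuple(c / norm(u1) for c in u1)
e2 = tuple(c / norm(u2) for c in u2)

print(norm(u1))  # sqrt(3)  ~ 1.732
print(norm(u2))  # 2*sqrt(3) ~ 3.464
print(e1, e2)    # (-1/sqrt(3), 1/sqrt(3), 0) and (0, 0, 1/sqrt(3))
# Orthonormality check (up to floating-point rounding):
print(inner(e1, e1), inner(e2, e2), inner(e1, e2))
```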
Orthonormal bases are nice because several formulas become much simpler when vectors are expressed with respect to an orthonormal (ON) basis.
Example: Let $\mathcal E = \{e_1, \dots, e_n\}$ be an ON basis. Then the Fourier expansion of any vector $v\in\operatorname{span}(\mathcal E)$ is just $$v = (v\cdot e_1)e_1 + (v\cdot e_2)e_2 + \cdots + (v\cdot e_n)e_n$$
Notice that there are no normalization factors and no need to construct a dual basis; it's just a really simple formula.
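A quick numerical illustration of this formula, using the standard dot product and the standard basis of $\mathbb{R}^3$ (a toy example of my own):

```python
def dot(x, y):
    """Standard dot product."""
    return sum(a * b for a, b in zip(x, y))

# Standard ON basis of R^3 (with respect to the usual dot product).
E = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
v = (4, -2, 7)

# Fourier expansion: v = (v . e_1) e_1 + (v . e_2) e_2 + (v . e_3) e_3
coords = [dot(v, e) for e in E]
reconstructed = tuple(sum(c * e[i] for c, e in zip(coords, E))
                      for i in range(3))

print(coords)         # [4, -2, 7] -- no normalization factors needed
print(reconstructed)  # (4, -2, 7) -- recovers v exactly
```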
In your example, of course $\{(1,0),(0,1)\}$ spans the same space as $\{(3,2),(2,2)\}$. But let me provide an example of my own: what about $\{(1.1,1.2,0.9,2.1,4),(3,-2,6,14,2),(6,6,6,3.4,11.1)\}$? There's certainly no subset of the standard basis vectors that spans the same space as these linearly independent vectors. But this is a pretty poor choice of basis because they're not orthonormal. It'd sure be nice if we had some algorithm that could produce an ON basis from them...
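That algorithm is exactly Gram-Schmidt. A minimal sketch applied to those three vectors with the standard dot product (classical Gram-Schmidt, not the numerically sturdier modified variant or a QR factorization):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt with the standard dot product.
    A sketch for illustration, not a numerically robust routine."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the basis vectors found so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

V = [np.array([1.1, 1.2, 0.9, 2.1, 4.0]),
     np.array([3.0, -2.0, 6.0, 14.0, 2.0]),
     np.array([6.0, 6.0, 6.0, 3.4, 11.1])]

B = gram_schmidt(V)

# Gram matrix of the result: should be the 3x3 identity (up to rounding).
G = np.array([[np.dot(a, b) for b in B] for a in B])
print(np.round(G, 10))
```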
Best Answer
You need two arbitrary linearly independent vectors that lie in the plane. You chose two such vectors, so you've done fine there. You could systematically find such a set in the usual way that you would find the kernel to a linear transformation (i.e. setting "free variables" to certain values etc.).
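For illustration: both vectors used above satisfy $x + y = 0$, so assuming (hypothetically) that this is the plane in question, the "free variables" method looks like this:

```python
# Plane x + y = 0 in R^3: one equation, so two free variables (y and z).
# Setting (y, z) = (1, 0) and (0, 1) in turn gives a basis of the plane.
# The plane equation here is an assumption for illustration only.

def solve_x(y, z):
    """Solve x + y = 0 for x, given the free variables y and z
    (z does not appear in the equation, so it is unconstrained)."""
    return -y

basis = []
for y, z in [(1, 0), (0, 1)]:
    basis.append((solve_x(y, z), y, z))

print(basis)  # [(-1, 1, 0), (0, 0, 1)]
```

Note that the first vector is exactly the $u_1 = (-1,1,0)$ used above; any two linearly independent vectors in the plane would do equally well as a starting point for Gram-Schmidt.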
And yes, everything else that you've done is perfect. Just normalize the vectors and you're all set!