Prove that the Gram-Schmidt orthogonalization (without normalization) preserves the determinant of the $n\times n$ matrix formed by $n$ linearly independent vectors in $\mathbb{R}^n$.
Proof that orthogonalization preserves determinant
Tags: determinant, linear algebra, orthogonality
Related Solutions
The Gram-Schmidt orthogonalization process is a great thing to learn about because the idea behind it shows up again and again. There are many practical algorithms that use what is essentially a Gram-Schmidt procedure. For example, suppose you have a set $\{v_1, \dots, v_n\}$ of linearly independent vectors (or functions), and you want to approximate another vector (or function) $w$ as a linear combination of the vectors in your set. This can be done by linear regression -- i.e. project $w$ onto the space spanned by the set. An alternative method, which is useful when you have a huge collection of functions in your set, is called "matching pursuit." Here you project $w$ onto the vector $v_i$ that is most correlated with $w$, subtract off that component, then project what remains onto the next most correlated vector from your set, subtract, and so on. This process of projecting-subtracting, projecting-subtracting, ... is just like Gram-Schmidt, and it is the basic principle behind many practical algorithms for time-frequency decompositions, time-scale (wavelet) decompositions, etc. In other words, there are many "Gram-Schmidt-like" procedures, so you would do well to learn the original!
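The project-subtract loop described above can be sketched in a few lines of numpy. This is a minimal illustration, not a production implementation; it assumes the dictionary vectors are unit-norm, so the correlation is just a dot product.

```python
import numpy as np

def matching_pursuit(w, dictionary, n_iter=10):
    """Greedy matching-pursuit sketch: repeatedly pick the dictionary
    vector most correlated with the residual, project onto it, and
    subtract that component.  Assumes unit-norm dictionary vectors."""
    residual = w.astype(float).copy()
    approx = np.zeros_like(residual)
    for _ in range(n_iter):
        # correlation of the residual with each dictionary vector
        corrs = [residual @ d for d in dictionary]
        k = int(np.argmax(np.abs(corrs)))
        component = corrs[k] * dictionary[k]   # projection onto atom k
        approx += component
        residual -= component                  # subtract, then repeat
    return approx, residual
```

If the dictionary happens to be an orthonormal basis, the residual is driven to zero after one pass through the atoms, exactly as in Gram-Schmidt.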
Hint: your final vectors are not correct. The point of GS is to get an orthogonal set of vectors. Are yours orthogonal? You are starting off with two non-orthogonal vectors, namely
$v_1=( 1 , 1 , 1)$ and $v_2= ( 1 , 2 ,1)$
The GS algorithm proceeds as follows:
let $w_1=(1,1,1)$
then we define $$w_2= v_2- \frac{\langle v_2 , w_1 \rangle}{\langle w_1 , w_1 \rangle} w_1$$
$$w_2=(1,2,1)-(4/3,4/3,4/3)=(-1/3,2/3,-1/3)$$ and it can be shown now that the set $$S=\{w_1,w_2\}$$ is orthogonal and also spans the same subspace as the original vectors $v_1, v_2$.
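The computation above is easy to check numerically; here is a short numpy verification of the single Gram-Schmidt step:

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, 2.0, 1.0])

w1 = v1
# subtract from v2 its projection onto w1:
# <v2, w1> = 4 and <w1, w1> = 3, so the projection is (4/3, 4/3, 4/3)
w2 = v2 - (v2 @ w1) / (w1 @ w1) * w1

# w2 == (-1/3, 2/3, -1/3), and w1 . w2 == 0, so {w1, w2} is orthogonal
```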
If we normalize $S$ (note $\|w_1\|=\sqrt3$ and $\|w_2\|=\sqrt{2/3}$), we get $$S_n=\{(\tfrac{1}{\sqrt3},\tfrac{1}{\sqrt3},\tfrac{1}{\sqrt3}),(\tfrac{-1}{\sqrt6},\sqrt{\tfrac{2}{3}},\tfrac{-1}{\sqrt6})\}.$$
In general, to find the projection matrix $P$, you first form the matrix $A$ with your vectors from $S_n$ as columns, that is $$A=\begin{bmatrix} \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \frac{1}{\sqrt3} & \sqrt{\frac{2}{3}} \\ \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \end{bmatrix}$$
that is, the orthogonal projection matrix equals
$P=A(A^{T}A)^{-1}A^{T}$
(and since the columns of $A$ are orthonormal, $A^{T}A=I$, so this reduces to $P=AA^{T}$).
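As a sanity check, here is a small numpy sketch that builds $A$ from the normalized vectors above, computes $P=A(A^TA)^{-1}A^T$, and confirms the orthonormal-column shortcut $P=AA^T$:

```python
import numpy as np

# columns of A are the orthonormal vectors of S_n
A = np.column_stack([
    np.array([1.0, 1.0, 1.0]) / np.sqrt(3),
    np.array([-1.0, 2.0, -1.0]) / np.sqrt(6),
])

# general projection formula
P = A @ np.linalg.inv(A.T @ A) @ A.T

# with orthonormal columns, A.T @ A = I, so P reduces to A @ A.T
assert np.allclose(P, A @ A.T)
```

A projection matrix should be symmetric and idempotent, and it should leave vectors in the subspace fixed; all three are easy to verify on this $P$.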
Best Answer
It is well known that adding a multiple of one row (or column) to another does not change the value of the determinant (the determinant "cancels" the linear dependencies). And this is precisely what Gram-Schmidt does: each $w_k$ is $v_k$ minus a linear combination of the preceding vectors.
Note that if you also performed the normalizations, the determinant would be divided by the product of the norms of the successive vectors. Since the determinant of an orthogonal matrix is $\pm 1$, the original determinant is, up to sign, the product of those norms, which you get as a byproduct.
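The determinant-preservation claim can be checked numerically. Below is a minimal sketch of unnormalized Gram-Schmidt on the columns of a random matrix (the modified, in-place variant, which uses only the "add a multiple of one column to another" operation described above):

```python
import numpy as np

def gram_schmidt(V):
    """Gram-Schmidt without normalization, applied to the columns of V.
    Each step only adds a multiple of an earlier column to a later one,
    so the determinant is unchanged."""
    W = V.astype(float).copy()
    n = W.shape[1]
    for j in range(1, n):
        for i in range(j):
            # subtract the projection of column j onto column i
            W[:, j] -= (W[:, j] @ W[:, i]) / (W[:, i] @ W[:, i]) * W[:, i]
    return W

rng = np.random.default_rng(0)
V = rng.standard_normal((4, 4))
W = gram_schmidt(V)

# the columns of W are orthogonal and det is preserved
assert np.isclose(np.linalg.det(V), np.linalg.det(W))
```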