Algorithm for diagonalization of a bilinear form

bilinear-form, eigenvalues-eigenvectors, linear-algebra, matrices

I was introduced to diagonalization of a linear transformation (eigenvalues, eigenvectors, algebraic/geometric multiplicity…) and everything was quite clear: you have the expression $Av = \lambda v$, which means you need to find the eigenvalues $\lambda$ by solving $\det(A-\lambda I)=0$, and so on.
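For instance, here is roughly what I mean by the linear-transformation case, as a quick sketch in Python/NumPy (the matrix $A$ is just a made-up example):

```python
# Sketch of the eigenvalue route for a linear transformation (example matrix only).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # roots of det(A - lambda*I) and their eigenvectors
P = eigvecs                           # columns are eigenvectors
D = np.diag(eigvals)

print(np.allclose(A @ P, P @ D))      # A P = P D, i.e. P^{-1} A P is diagonal
```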

Now with bilinear forms, I've been told to follow a strange algorithm that finds the diagonal entries one by one and makes the other entries in the row and column of each diagonal entry become $0$ (image 1, image 2, image 3; that's my professor's writing, by the way).

Does someone know what algorithm that is? Is there a more intuitive way, as with linear transformations? I've read someone mentioning eigenvalues to diagonalize bilinear forms; is that possible?

Usually when I get into new material I like to understand what I'm dealing with, but this time I haven't found much; I don't even know exactly what the matrix of a bilinear form represents.

Thanks for reading this far.

Best Answer

This is not as strange as it looks, if you know a bit of theory.

I am assuming that your bilinear form is symmetric, and that your base field has characteristic different from $2$; otherwise the result does not hold.

The main ingredient is the following.

Thm. Let $b$ be a symmetric bilinear form on $E$. If $F$ is a subspace such that the restriction of $b$ to $F\times F$ is nondegenerate, then $E=F\oplus F^\perp$.

I will prove a particular case of this result to shed light on the algorithm you are studying.

Lemma. Assume that $x_0\in E$ satisfies $b(x_0,x_0)\neq 0$, and let $F=Kx_0$ (where $K$ is the base field). Then $E=F\oplus F^\perp$.

Proof. Let us guess the decomposition of $x\in E$. If $x=x_F+x_{F^\perp}$, then $x_F=\lambda x_0$ by definition of $F$, and $b(x,x_0)=\lambda b(x_0,x_0)+ b(x_0,x_{F^\perp})$. Now $x_0\in F$ and $x_{F^\perp}\in F^\perp$, so the second term is $0$. Hence $b(x,x_0)=\lambda b(x_0,x_0)$, and thus $\lambda=\dfrac{b(x,x_0)}{b(x_0,x_0)}$. This proves that if the decomposition exists, it is unique. Conversely, for this value of $\lambda$, one easily checks that $x_F= \lambda x_0$ and $x_{F^\perp}=x-\lambda x_0$ satisfy the required conditions.
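To make the lemma concrete, here is a minimal numerical sketch (mine, not part of the original argument) over $\mathbb{R}$, with an arbitrarily chosen symmetric Gram matrix $B$ representing $b$:

```python
# Sketch of the lemma's decomposition E = F + F^perp with F = R*x0,
# for b(u, v) = u^T B v and an arbitrary symmetric example matrix B.
import numpy as np

B = np.array([[1.0, 2.0],
              [2.0, 3.0]])

def b(u, v):
    return u @ B @ v

x0 = np.array([1.0, 0.0])            # b(x0, x0) = 1 != 0, so the lemma applies
x  = np.array([3.0, 5.0])

lam    = b(x, x0) / b(x0, x0)        # lambda = b(x, x0) / b(x0, x0)
x_F    = lam * x0                    # component along F = R*x0
x_perp = x - x_F                     # component in F^perp

print(np.allclose(x, x_F + x_perp))  # True: x = x_F + x_perp
print(np.isclose(b(x_perp, x0), 0))  # True: x_perp is b-orthogonal to x0
```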

Now the algorithm is clear:

One may assume that $b\neq 0$ (otherwise, the result is clear)

Then the polarization formula $b(x,y)=\dfrac{1}{2}(b(x+y,x+y)-b(x,x)-b(y,y))$ shows that if $b\neq 0$, then there exists at least one $e_1\in E$ such that $b(e_1,e_1)\neq 0$. Then if $F=K e_1$, $E=F\oplus F^\perp$. By induction, there is a basis $(e_2,\ldots,e_n)$ of $F^\perp$ which is $b$-orthogonal. Then $(e_1,\ldots,e_n)$ is a $b$-orthogonal basis of $E$.

In other words: if the restriction of $b$ to $F^\perp\times F^\perp$ is zero, pick any basis $(e_2,\ldots,e_n)$ of $F^\perp.$ Otherwise, pick $e_2\in F^\perp$ such that $b(e_2,e_2)\neq 0$ and go on....
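If it helps, here is a small Python sketch of this procedure (my own, under the assumptions above: symmetric form, base field $\mathbb{R}$), working directly with a Gram matrix $B$ of $b$ in some starting basis; the returned vectors, gathered as the columns of $P$, satisfy that $P^T B P$ is diagonal:

```python
# Sketch of the orthogonalization procedure described above, for b(u, v) = u^T B v over R.
import numpy as np

def b(B, u, v):
    return u @ B @ v

def orthogonal_basis(B, vectors=None):
    """Return a list of b-orthogonal vectors spanning the same space as `vectors`."""
    n = B.shape[0]
    if vectors is None:
        vectors = [np.eye(n)[:, i] for i in range(n)]   # start from the standard basis of E
    if not vectors:
        return []
    vectors = list(vectors)
    # Look for e1 with b(e1, e1) != 0; if every vector is isotropic, replace some v by v + w
    # with b(v, w) != 0, since b(v+w, v+w) = 2*b(v, w) (the polarization trick, char != 2).
    idx = next((i for i, v in enumerate(vectors) if not np.isclose(b(B, v, v), 0)), None)
    if idx is None:
        pairs = ((i, j) for i in range(len(vectors)) for j in range(i + 1, len(vectors)))
        hit = next(((i, j) for i, j in pairs
                    if not np.isclose(b(B, vectors[i], vectors[j]), 0)), None)
        if hit is None:
            return vectors               # b vanishes on this subspace: any basis is b-orthogonal
        i, j = hit
        vectors[i] = vectors[i] + vectors[j]
        idx = i
    e1 = vectors.pop(idx)
    # Project the remaining vectors onto F^perp with the lemma's coefficient, then recurse.
    rest = [v - (b(B, v, e1) / b(B, e1, e1)) * e1 for v in vectors]
    return [e1] + orthogonal_basis(B, rest)

B = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])          # every standard basis vector is isotropic here
P = np.column_stack(orthogonal_basis(B))
print(np.round(P.T @ B @ P, 10))         # diagonal matrix
```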

There is another algorithm: try Googling "Gauss's algorithm for quadratic forms", or something like that...
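To give a rough idea of what that method does (a small example of my own, reducing by completing squares): for $q(x,y)=x^2+4xy+y^2$ one writes
$$q(x,y)=(x+2y)^2-3y^2,$$
so in the coordinates $x'=x+2y$, $y'=y$ the matrix of the form is $\begin{pmatrix}1&0\\0&-3\end{pmatrix}$.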
