The Gram-Schmidt method for a skew-symmetric bilinear form

gram-schmidt, linear algebra, skew-symmetric matrices

Hoffman and Kunze's Linear Algebra (Section 10.3, Theorem 6) describes a method to obtain a representation of a skew-symmetric bilinear form as a block matrix of the form

$$
\begin{bmatrix} J & 0 \\ 0 & 0 \end{bmatrix} \ ,
$$

where the block $J$ is of the form $\begin{bmatrix} 0 & I \\ -I & 0\end{bmatrix}$ (with block sizes adequate to the specific situation), as an analogue of the Gram-Schmidt process. However, I'm not quite sure how to apply this method in a specific situation:

Let's say, for example, $B:\mathbb{R}^4\times\mathbb{R}^4\rightarrow\mathbb{R}$ is the form
$$
B(u,v)=(u_3v_1-u_1v_3)+(u_3v_2-u_2v_3)+(u_3v_4-u_4v_3) \ ,
$$

if I haven't messed up any sign here to keep bilinearity and skew-symmetry. If $\{e_j\}$ denotes the canonical basis of $\mathbb{R}^4$, we have, for example, $B(e_3,e_1)=1$, so according to the method we take $W$ to be the span of $\{e_1,e_3\}$ and then $W^\perp$ to be the set of all $w'\in\mathbb{R}^4$ such that $B(w,w')=0$ for all $w\in W$; note that $\mathbb{R}^4=W\oplus W^\perp$. As the theorem shows, we can continue this process until $B$ reduces to the zero form, obtaining a basis in which the matrix of $B$ has the desired form. However, from the step above I'm not sure how to proceed to complete this basis.
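For concreteness, here is a small numerical sketch (not from Hoffman and Kunze; the names `A` and `B` are my own) that writes down the Gram matrix $A$ with $A_{ij}=B(e_i,e_j)$ and checks skew-symmetry and the value $B(e_3,e_1)=1$:

```python
import numpy as np

# Gram matrix A of B in the canonical basis: A[i, j] = B(e_{i+1}, e_{j+1}).
# From B(u, v) = u_3(v_1 + v_2 + v_4) - v_3(u_1 + u_2 + u_4):
A = np.array([
    [0, 0, -1, 0],
    [0, 0, -1, 0],
    [1, 1,  0, 1],
    [0, 0, -1, 0],
])

def B(u, v):
    """Evaluate the bilinear form via its Gram matrix."""
    return u @ A @ v

e = np.eye(4)                      # canonical basis e_1, ..., e_4 as rows e[0], ..., e[3]
assert np.array_equal(A, -A.T)     # skew-symmetry of the form
assert B(e[2], e[0]) == 1          # B(e_3, e_1) = 1, the pivot used in the question
```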

Best Answer

Let $\omega = \omega^1 e_1 + \omega^3 e_3 \in \operatorname{span}\{e_1,e_3\}$ and $\eta = \eta^i e_i \in \mathbb{R}^4$ (summation over $i$). To find $W^\perp$, the $B$-orthogonal complement of $W = \operatorname{span}\{e_1,e_3\}$, we write
$$ B(\omega,\eta) = \omega^3(\eta^1+\eta^2+\eta^4) - \omega^1\eta^3 = 0 \ . $$
Since this must hold for all $\omega^1,\omega^3$, it implies $\eta^3 = 0$ and $\eta^1+\eta^2+\eta^4 = 0$, which means
$$ \eta = \eta^1 e_1 + \eta^2 e_2 -(\eta^1+\eta^2) e_4 = \eta^1(e_1-e_4) + \eta^2(e_2-e_4) \ . $$
The vectors $e_1-e_4$ and $e_2-e_4$, together with $e_1$ and $e_3$ (or rather $-e_3$), constitute a basis of $\mathbb{R}^4$. Moreover, $B$ is zero on $W^\perp$: writing $B = e^3\wedge(e^1+e^2+e^4)$, the factor $e^3$ vanishes on $W^\perp$ (every $\eta\in W^\perp$ has $\eta^3=0$), so both terms of the wedge product vanish when both arguments lie in $W^\perp$.
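As a quick sanity check (continuing the numpy sketch above, with the same `A`, `B`, `e`), one can verify that $e_1-e_4$ and $e_2-e_4$ are $B$-orthogonal to $W$ and that $B$ vanishes on their span:

```python
# Continues the sketch above: A, B, e as defined there.
w_perp = [e[0] - e[3], e[1] - e[3]]   # candidate basis of W-perp: e_1 - e_4, e_2 - e_4

# Each candidate is B-orthogonal to the generators e_1, e_3 of W ...
for eta in w_perp:
    assert B(e[0], eta) == 0 and B(e[2], eta) == 0

# ... and B restricts to the zero form on their span.
assert all(B(x, y) == 0 for x in w_perp for y in w_perp)
```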

The dual basis of $\{e_{1'},e_{2'},e_{3'},e_{4'}\}=\{e_1,\,-e_3,\,e_1-e_4,\,e_2-e_4\}$ (in this order) is $\{e^{1'},e^{2'},e^{3'},e^{4'}\}=\{e^1+e^2+e^4,\,-e^3,\,-e^2-e^4,\,e^2\}$, so the form has the representation $B = -e^{2'}\wedge e^{1'} = e^{1'}\wedge e^{2'}$, as in the comment. The matrix with entries $B(e_{i'},e_{j'})$ is therefore
$$
\begin{pmatrix}
B(1',1') & \cdots & B(1',4')\\
\vdots & \ddots & \vdots\\
B(4',1') & \cdots & B(4',4')
\end{pmatrix}
=
\begin{pmatrix}
0 & 1 & 0 & 0\\
-1 & 0 & 0 & 0\\
0 & 0 & 0 & 0\\
0 & 0 & 0 & 0
\end{pmatrix} \ ,
$$
which is exactly the desired block form with a single $2\times 2$ block $J=\begin{pmatrix}0 & 1\\ -1 & 0\end{pmatrix}$.
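To double-check the whole computation numerically (again continuing the sketch above; the matrix name `P` is mine), one can assemble the new basis vectors as columns of a change-of-basis matrix and conjugate the Gram matrix:

```python
# Continues the sketch above. Columns of P are the new basis vectors
# e_1' = e_1, e_2' = -e_3, e_3' = e_1 - e_4, e_4' = e_2 - e_4.
P = np.column_stack([e[0], -e[2], e[0] - e[3], e[1] - e[3]])

# Gram matrix of B in the new basis: (P^T A P)_{ij} = B(e_i', e_j').
A_new = P.T @ A @ P

expected = np.array([
    [ 0, 1, 0, 0],
    [-1, 0, 0, 0],
    [ 0, 0, 0, 0],
    [ 0, 0, 0, 0],
])
assert np.array_equal(A_new, expected)
```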


About the wedge product of elements of the dual space

The tensor space $V^{*\otimes k}$ is isomorphic to the space $\mathrm{Mult}^k(V)$ of all multilinear forms on the finite-dimensional space $V$ in the canonical way (extended by linearity):
$$ f_1\otimes\cdots\otimes f_k = \bigl((v_1,\dots,v_k)\mapsto f_1(v_1)\cdots f_k(v_k) \in \mathbb{R}\bigr) \ . $$
A subspace of $\mathrm{Mult}^k(V)$ is the space $\mathrm{Alt}^k(V)$ of all totally skew-symmetric multilinear forms, and it corresponds to a subspace $\Lambda^k V^* \subset V^{*\otimes k}$ (called the $k$-th exterior power of $V^*$).
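In particular, for two covectors $f,g\in V^*$ the wedge product used above is taken with the convention $f\wedge g = f\otimes g - g\otimes f$ (an assumption on the normalization, namely the one without the factor $\tfrac12$), so that
$$ (f\wedge g)(u,v) = f(u)\,g(v) - f(v)\,g(u) \ . $$
With this convention the form of the question can be written compactly as
$$ B = e^3\wedge(e^1+e^2+e^4) = e^3\wedge e^{1'} = -e^{2'}\wedge e^{1'} = e^{1'}\wedge e^{2'} \ , $$
which is the representation used in the answer above.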

More about the bases of these spaces and their tensor and wedge products can be found in Spivak, Calculus on Manifolds, Chapter 4, "Integration on Chains".
