Orthogonal Gram matrix of an arbitrary bilinear form over a real vector space.

bilinear-form, linear-algebra, matrices, orthogonality

Let $V$ be a vector space over $\mathbb{R}$ and let $\beta$ $\colon V \times V \to \mathbb{R}$ be a bilinear form. Let $v_1,\ldots,v_n$ be a basis for $V$ and $B=(b_{ij})$, where $b_{ij} = \beta(v_i,v_j)$. Assume that $B$ is orthogonal, i.e. $B^tB = I_n$. Show that $\beta$ is positive definite if and only if $b_{ii} \geq 0$ for $i = 1,\ldots,n$.

The first direction is easy: it is just a particular case of the definition of positive definiteness:

$$\begin{equation}
\forall v \in V, \beta(v,v) \geq 0
\end{equation}$$
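In particular, taking $v = v_i$ in this definition gives the diagonal entries directly:

$$b_{ii} = \beta(v_i,v_i) \geq 0, \qquad i = 1,\ldots,n.$$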

I'm having trouble with the other direction, and the reason is that I don't know how to use the hypothesis $B^tB = I_{n}$. First, I applied the coordinate isomorphism $\phi \colon V \to \mathbb{R}^{n}$ (where $\dim V = n$) to $v \in V$ and thought of the result as a column vector:

$$\phi(v) = \bar{x} = \left(\begin{matrix} x_1 \\ \vdots \\ x_n \end{matrix}\right) $$

Now I use matrix multiplication to find an expression for $\beta(v,v)$, $v\in V$:

$$ \beta(v,v) = \phi(v)^t\cdot B \cdot \phi(v) = \bar{x}^t\cdot B \cdot \bar{x} = \sum_{j=1}^nx_j \sum_{i=1}^nx_i \cdot b_{ji}$$
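As a quick numerical sanity check of that identity (a sketch, assuming NumPy is available; the matrix and vector here are random placeholders, not tied to the problem):

```python
import numpy as np

# Check numerically that the double sum above equals the quadratic form x^T B x.
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))  # arbitrary matrix, standing in for the Gram matrix
x = rng.standard_normal(n)       # stands in for the coordinate vector phi(v)

double_sum = sum(x[j] * sum(x[i] * B[j, i] for i in range(n)) for j in range(n))
quadratic_form = x @ B @ x

assert np.isclose(double_sum, quadratic_form)
```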

I don't know how to proceed from this point. I welcome any ideas or other approaches to the problem. Thank you for reading.

Best Answer

Assume that $B$ is orthogonal, i.e. $B^tB = I_n$. Show that $\beta$ is positive definite if and only if $b_{ii} \geq 0$ for $i = 1,\ldots,n$.

This is false. For example, take

$B:= \left[\begin{matrix}\frac{1}{3} & - \frac{2}{3} & - \frac{2}{3}\\- \frac{2}{3} & \frac{1}{3} & - \frac{2}{3}\\- \frac{2}{3} & - \frac{2}{3} & \frac{1}{3}\end{matrix}\right]= I - \frac{2}{3}\mathbf {11}^T$

This $B$ is orthogonal (it is the Householder reflection $I - 2uu^T$ with $u = \tfrac{1}{\sqrt{3}}\mathbf{1}$), and every diagonal entry equals $\tfrac{1}{3} \geq 0$. Yet for $v := v_1+v_2+v_3$,

$$\beta(v,v) = \mathbf{1}^T B \mathbf{1} = 3 - \tfrac{2}{3}\cdot 9 = -3 < 0,$$

so $\beta$ is not positive definite.
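For completeness, here is a small NumPy check of this counterexample (a sketch; not part of the original answer):

```python
import numpy as np

# The counterexample: B = I - (2/3) * 1 1^T, a Householder-type reflection.
n = 3
one = np.ones((n, 1))
B = np.eye(n) - (2.0 / 3.0) * (one @ one.T)

assert np.allclose(B.T @ B, np.eye(n))  # B is orthogonal: B^T B = I
assert np.all(np.diag(B) >= 0)          # every diagonal entry is 1/3 >= 0

x = np.ones(n)                          # coordinates of v = v_1 + v_2 + v_3
print(x @ B @ x)                        # -3.0, so beta(v, v) < 0
```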