Let $A$ be a symmetric bilinear form on $\mathbb R^n$. For the basis $\{\alpha_1,\dots,\alpha_n\}$ we have $A(x,y)=\sum\limits_i\sum\limits_j A(\alpha_i,\alpha_j)x_iy_j$. My question is how to find a new basis $\{\beta_1,\dots,\beta_n\}$ such that $A(x,y)=\sum\limits_i A(\beta_i,\beta_i)x_iy_i$. In particular, for the bilinear form $A(x,y)=x_1y_1+2x_2y_2+x_1y_2+x_2y_1$, how do I find a basis with respect to which the bilinear form has no cross product terms, i.e. no terms $x_iy_j$ with $i\neq j$? I think the process is to find an orthogonal basis by Gram-Schmidt orthogonalization; am I correct? I think the basis $\{(1,0),(-1,1)\}$ is the required one.
Eliminating cross product terms of a symmetric bilinear form.
bilinear-form, linear-algebra, quadratic-forms, symmetric-matrices
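As a quick numerical sanity check of the basis proposed in the question, here is a minimal NumPy sketch (the code is my addition, not part of the original post). It confirms that $\{(1,0),(-1,1)\}$ does diagonalize this form; note that a diagonalizing basis is not unique, and the eigenvector basis in the accepted answer below is another valid choice.

```python
import numpy as np

# Matrix of A(x, y) = x1*y1 + 2*x2*y2 + x1*y2 + x2*y1 in the standard
# basis, so that A(u, v) = u^T M v.
M = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# Candidate basis from the question.
b1 = np.array([1.0, 0.0])
b2 = np.array([-1.0, 1.0])

print(b1 @ M @ b2)  # 0.0 -> the cross term vanishes
print(b1 @ M @ b1)  # 1.0 -> A(b1, b1)
print(b2 @ M @ b2)  # 1.0 -> A(b2, b2)
```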
Related Solutions
Thanks to Daniel Fischer, making everything slightly more verbose:
Existence: The bilinear form gives $\Phi : W \to V^*$, where $\Phi(w) = \lambda_w \in V^*$, for $\lambda_w(v) = (v, w)$. This is a linear map, by linearity of $(\cdot, \cdot)$. It is injective by non-degeneracy of $(\cdot, \cdot)$, and since $\dim W = \dim V^*$, it must also be surjective. A bijective linear map is an isomorphism, so we have a linear inverse $\Phi^{-1}$. Note that for any $w \in W$ we have $$(v, \Phi^{-1}(\lambda_w)) = (v, w) = \lambda_w(v)$$ and since $\Phi$ is surjective, every $\lambda \in V^*$ is of the form $\lambda_w$, so this identity holds for all $\lambda \in V^*$.
We can construct the dual basis of $(\alpha_1, \dots, \alpha_n)$ in $V^*$ by setting $\lambda_i(\alpha_j) = \delta_{i j}$ and taking $(\lambda_1, \dots, \lambda_n)$ (see for example Prop 1.0.2 of these notes).
Now, $(\beta_1, \dots, \beta_n)$, where $\beta_i = \Phi^{-1}(\lambda_i)$, gives a basis for $W$ that satisfies the required duality: $$ (\alpha_i, \beta_j) = (\alpha_i, \Phi^{-1}(\lambda_j)) = \lambda_j(\alpha_i) = \delta_{i j} $$ It is a basis because we're applying an isomorphism to elements that also constitute a basis.
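In coordinates, this construction reduces to inverting a Gram matrix. Assuming for simplicity that $V = W = \mathbb{R}^n$ with basis $\alpha_1, \dots, \alpha_n$ and writing $G_{ij} = (\alpha_i, \alpha_j)$, the duality condition $(\alpha_i, \beta_j) = \delta_{ij}$ says exactly that the coordinate matrix of the $\beta_j$ is $G^{-1}$. A minimal NumPy sketch, with a made-up Gram matrix $G$ (my example, not from the answer):

```python
import numpy as np

# Assume V = W = R^3 with a non-degenerate symmetric form whose Gram
# matrix in the basis alpha_1..alpha_3 is G[i, j] = (alpha_i, alpha_j).
# (G is a made-up example; any invertible symmetric matrix works.)
G = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# If beta_j has coordinate vector B[:, j] in the alpha basis, then
# (alpha_i, beta_j) = (G @ B)[i, j], so duality G @ B = I gives B = G^{-1}.
B = np.linalg.solve(G, np.eye(3))

print(np.allclose(G @ B, np.eye(3)))  # True: (alpha_i, beta_j) = delta_ij
```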
Uniqueness: Suppose $(\beta_1', \dots, \beta_n')$ is another basis of $W$ satisfying the duality condition. Since $\beta_1, \dots, \beta_n$ span $W$, we can write $\beta_j' = \sum_k c_{j k} \beta_k$ for all indices $j$. Applying $(\alpha_i, \cdot)$ to that equation we get $$\delta_{i j} = (\alpha_i, \beta_j') = \sum_k c_{j k} (\alpha_i, \beta_k) = c_{j i}$$ hence $c_{jk} = \delta_{jk}$, so $\beta_j' = \beta_j$ for all indices $j$.
Your proof is incomplete, since you have not shown that every element in the span of $\{\beta_1, ..., \beta_n\}$ is also contained in $R_T$.
To elaborate:
By showing that for any arbitrary $y \in R_T$ it is also true that $y \in \operatorname{span}(\beta_1, \dots, \beta_n)$, you have shown that $R_T \subseteq \operatorname{span}(\beta_1, \dots, \beta_n)$.
To prove equality, we need to show the reverse inclusion as well: given an element $z \in \operatorname{span}(\beta_1, \dots, \beta_n)$, we need to show that it is an element of $R_T$.
Best Answer
As Jean Marie indicated in the comments, this can be solved by diagonalizing the matrix of $A$, which is $$M=\begin{pmatrix}1&1\\1&2\end{pmatrix}$$

To do this, you can first compute the eigenvalues as the roots of the characteristic polynomial $$\det(\lambda I-M)=\begin{vmatrix}\lambda-1&-1\\-1&\lambda-2\end{vmatrix}=(\lambda-1)(\lambda-2)-(-1)(-1)=\lambda^2-3\lambda+1$$ The roots are $\lambda_1=\tfrac{1}{2}(3+\sqrt{5})$ and $\lambda_2=\tfrac{1}{2}(3-\sqrt{5})$. Then solve the corresponding linear systems $Mv=\lambda_1v$ and $Mv=\lambda_2v$ to obtain the eigenvectors $v_1=\bigl(\tfrac{1}{2}(-1+\sqrt{5}),1\bigr)$ and $v_2=\bigl(\tfrac{1}{2}(-1-\sqrt{5}),1\bigr)$.

The basis $v_1,v_2$ diagonalizes $A$: since $M$ is symmetric, eigenvectors for distinct eigenvalues are orthogonal, so the cross terms vanish. For example, you can verify directly that $$A(v_1,v_2)=\frac{1}{4}(-4)+2-\frac{1}{2}+\frac{\sqrt{5}}{2}-\frac{1}{2}-\frac{\sqrt{5}}{2}=0$$ and $A(v_2,v_1)=0$ follows by symmetry of $A$.
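For a numerical cross-check (my addition, not part of the original answer), `numpy.linalg.eigh` computes the eigendecomposition of a symmetric matrix with orthonormal eigenvectors, and conjugating $M$ by the eigenvector matrix produces a diagonal form; rescaling the eigenvectors, e.g. to the unnormalized $v_1, v_2$ above, keeps the off-diagonal entries zero.

```python
import numpy as np

# Matrix of the bilinear form in the standard basis.
M = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric matrices and returns ascending eigenvalues
# with orthonormal eigenvectors as the columns of V.
eigenvalues, V = np.linalg.eigh(M)
print(eigenvalues)                # approx [0.382, 2.618] = (3 -+ sqrt(5))/2
print(np.round(V.T @ M @ V, 12))  # diagonal: no cross terms in this basis

# The unnormalized eigenvectors from the answer work just as well:
v1 = np.array([(-1 + np.sqrt(5)) / 2, 1.0])
v2 = np.array([(-1 - np.sqrt(5)) / 2, 1.0])
print(v1 @ M @ v2)                # 0.0 up to floating-point rounding
```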