Proof about change of basis

linear-algebra, linear-transformations

Let $V$ be a finite-dimensional vector space over a field $F$, and let $\beta = \{x_1,x_2,\dots,x_n\}$ be an ordered basis for $V$. Let $Q$ be an $n \times n$ invertible matrix with entries from $F$. Define $x'_j=\sum_{i=1}^n Q_{ij}x_i$ for $1 \le j \le n$, and set $\beta' = \{ x'_1,x'_2,\dots,x'_n \}$. Prove that $\beta'$ is a basis for $V$, and hence that $Q$ is the change of coordinate matrix changing $\beta'$-coordinates into $\beta$-coordinates.
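As a concrete sanity check (not part of the proof), here is a small numerical sketch in Python with NumPy. It assumes $V = \mathbb{R}^3$ with $\beta$ the standard basis and a hypothetical invertible $Q$; then $x'_j = \sum_i Q_{ij} x_i$ is just the $j$-th column of $Q$, and the $x'_j$ indeed have full rank:

```python
import numpy as np

# Hypothetical example: V = R^3, beta = standard basis e_1, e_2, e_3.
Q = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # det(Q) = 2, so Q is invertible

basis = np.eye(3)  # columns are x_1, x_2, x_3

# x'_j = sum_i Q_ij x_i : with the standard basis, this is the j-th column of Q.
beta_prime = basis @ Q

# beta' is a basis iff its n vectors are linearly independent, i.e. full rank.
print(np.linalg.matrix_rank(beta_prime))  # 3
```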

To prove that $\beta'$ is a basis, we need to prove that the $x'_i$ are linearly independent. Suppose $\sum_{i=1}^n a_i x'_i=0$; then $\sum_{i=1}^n a_i Q_{ij} x_i=0$. Since the $x_i$ form a basis, $a_i Q_{ij}=0$. Since $Q$ is invertible, by definition we can find $Q^{-1}$ such that $a_i Q_{ij} Q^{-1}_{ij} = 0 \cdot Q^{-1}_{ij}$, so $a_i=0$.

Thus proved.

Is it enough?

Best Answer

$Q_{ij}$ is the entry in the $i$-th row and $j$-th column of $Q$; it is a scalar, not a matrix, so the proof you give is not enough.

Assume that there exist scalars $a_1,\dots,a_n$ such that the zero vector can be written as $$\begin{align} \textbf0 = \sum_{j=1}^n a_jx_j' &= \sum_{j=1}^n a_j \Big( \sum_{i=1}^n Q_{ij}x_i \Big) \\ &= \sum_{i=1}^n \color{red}{\Big( \sum_{j=1}^n a_j Q_{ij} \Big)} x_i = \sum_{i=1}^n \color{red}{b_i} x_i. \end{align}$$ Now, since the vectors $x_1,\dots,x_n$ are linearly independent, $$b_i = \sum_{j=1}^n a_j Q_{ij} = 0$$ for each $i$ between $1$ and $n$, and this gives us a system of linear equations that can be written as $$Q \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}.$$ So, since $Q$ is invertible, it then follows that $$\begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = Q^{-1}\begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix},$$ that is, $a_1 = a_2 = \cdots = a_n = 0$. Hence $\beta'$ consists of $n$ linearly independent vectors in the $n$-dimensional space $V$, so $\beta'$ is a basis. Finally, the definition $x'_j = \sum_{i=1}^n Q_{ij} x_i$ says exactly that the $j$-th column of $Q$ is $[x'_j]_\beta$, which is the definition of the change of coordinate matrix from $\beta'$-coordinates to $\beta$-coordinates.
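The two steps of the argument — $Qa = 0$ with $Q$ invertible forces $a = 0$, and coordinates transforming by $[v]_\beta = Q\,[v]_{\beta'}$ — can be illustrated numerically. This is only a sketch with an assumed $2 \times 2$ matrix $Q$, not part of the proof:

```python
import numpy as np

Q = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # invertible: det = 1

# The only solution of Q a = 0 is a = 0, i.e. the columns of Q are independent.
a = np.linalg.solve(Q, np.zeros(2))
print(a)  # [0. 0.]

# Change of coordinates: if v has beta'-coordinates c, then its
# beta-coordinates are Q c, since v = sum_j c_j x'_j = sum_i (sum_j Q_ij c_j) x_i.
c = np.array([3.0, -1.0])  # [v]_{beta'}
v_beta = Q @ c             # [v]_beta
print(v_beta)  # [5. 2.]
```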
