A doubt in section 2.4 from Hoffman and Kunze Linear Algebra

change-of-basis, linear-algebra

I have seen older posts on this topic, but they didn't address what I want to ask about the theorem below from the book.
[image: statement and proof of the theorem from the book]

Firstly, how does the proof get to assume that another basis already exists? Secondly, when and where does it prove the uniqueness of that basis? Thirdly, what is the $k$ used when multiplying by $Q$? I don't think I understood the proof at all. Can someone please explain what exactly is going on here? I would really appreciate it, as I am reading this book on my own.

Best Answer

Here are responses to your questions, but in the order that follows the proof structure.

  1. Uniqueness is explained in the second sentence. Any basis $\{\alpha'_1,\ldots,\alpha'_n\}$ satisfying (i) must take the form of $\alpha'_j$ in the displayed equation, for all $1\leq j\leq n$. So such a basis, if it exists, is uniquely determined.

  2. Existence is explained in the third sentence. If you define $\alpha'_j$ as in the displayed equation, then these elements certainly exist. The rest of the argument shows that they form a basis. The final sentences of the proof explain why this basis satisfies the desired conditions (i) and (ii).

  3. $j$ and $k$ are indices between $1$ and $n$ (as is $i$ later), and $Q_{jk}$ is the corresponding entry of $Q$. The computation shows that for any $1\leq k\leq n$, $\alpha_k=\sum_{j=1}^n Q_{jk}\alpha'_j$. To explain the final step of this calculation: we want to see why $\sum_i(\sum_j P_{ij}Q_{jk})\alpha_i=\alpha_k$. The reason is that for a fixed $1\leq i\leq n$, the sum $\sum_{j}P_{ij}Q_{jk}$ is $1$ if $i=k$ and $0$ otherwise. This is because this sum is precisely the $i$th row of $P$ multiplied by the $k$th column of $Q$, and thus this sum is the $(i,k)$ entry of $PQ$. But $PQ$ is the identity matrix, and so the $(i,k)$ entry is $1$ if $i=k$ and $0$ otherwise.
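The computation in point 3 can be checked numerically. Here is a minimal NumPy sketch (the particular matrix $P$ is my own made-up example, not from the book): take the standard basis of $\mathbb{R}^3$ as the $\alpha_i$, pick an invertible $P$, set $Q = P^{-1}$, build $\alpha'_j = \sum_i P_{ij}\alpha_i$, and verify that $\alpha_k = \sum_j Q_{jk}\alpha'_j$, which reduces to $PQ = I$ exactly as the proof says.

```python
import numpy as np

# Standard basis of R^3: alpha_i is the i-th column of the identity.
alphas = np.eye(3)

# Any invertible matrix P (det = 7 here), with Q = P^{-1}.
P = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])
Q = np.linalg.inv(P)

# New basis: alpha'_j = sum_i P[i, j] * alpha_i,
# i.e. the j-th column of alphas @ P.
alpha_prime = alphas @ P

# The proof's claim: alpha_k = sum_j Q[j, k] * alpha'_j for every k,
# i.e. alpha_prime @ Q recovers the original basis.
assert np.allclose(alpha_prime @ Q, alphas)

# The final step of the calculation: sum_j P[i, j] * Q[j, k] is the
# (i, k) entry of P @ Q = I, so it is 1 when i == k and 0 otherwise.
assert np.allclose(P @ Q, np.eye(3))
```

Both assertions pass, mirroring the double-sum manipulation in the proof: substituting $\alpha'_j = \sum_i P_{ij}\alpha_i$ into $\sum_j Q_{jk}\alpha'_j$ collapses to $\alpha_k$ precisely because $PQ$ is the identity.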
