Change of coordinates and a property of invertible matrices (Sec. 2.4, Theorem 8, Hoffman & Kunze, Linear Algebra)

Tags: change-of-basis, linear-algebra, matrices

I was reading *Linear Algebra* by Hoffman and Kunze and came across the following in Theorem 8 of the chapter on coordinates. The theorem is stated below:

> **Theorem 8.** Suppose $P$ is an $n \times n$ invertible matrix over $F$. Let $V$ be an $n$-dimensional vector space over the field $F$, and let $\mathcal{B}$ be an ordered basis of $V$. Then there is a unique ordered basis $\mathcal{B}'$ of $V$ such that
> $$\text{(i)} \quad [\alpha]_{\mathcal{B}} = P\,[\alpha]_{\mathcal{B}'}$$
> $$\text{(ii)} \quad [\alpha]_{\mathcal{B}'} = P^{-1}[\alpha]_{\mathcal{B}}$$
> for every vector $\alpha$ in $V$.
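To make the statement concrete, here is a small example of my own (not from the book). Take $V = \mathbb{R}^2$, let $\mathcal{B} = \{e_1, e_2\}$ be the standard basis, and let
$$P = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$
The basis produced in the proof is $\alpha_1' = 1\cdot e_1 + 0\cdot e_2 = e_1$ and $\alpha_2' = 1\cdot e_1 + 1\cdot e_2 = (1,1)$. Indeed, if $[\alpha]_{\mathcal{B}'} = (x_1', x_2')$, then $\alpha = x_1' e_1 + x_2'(e_1 + e_2)$, so
$$[\alpha]_{\mathcal{B}} = \begin{bmatrix} x_1' + x_2' \\ x_2' \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} x_1' \\ x_2' \end{bmatrix} = P\,[\alpha]_{\mathcal{B}'},$$
which is (i).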

What I don't quite get is the very last line of the proof (the uniqueness part), where the book says it is clear that

$$ \alpha'_{j}=\sum_{i=1}^{n} P_{ij}\,\alpha_{i}. \tag{a}$$

Is there any easy way to see why it is clear? I tried many approaches, and what I found is that we essentially start from scratch: proceeding just as in the proof, but with $\overline{\mathcal{B}}$, we find an invertible matrix, say $Q$ (which we don't yet know equals $P$), such that property (a) above holds with $Q_{ij}$ in place of $P_{ij}$. What is then left to show is that $P = Q$, and we currently have:

$$ x_{i}=\sum_{j=1}^{n} P_{ij} x'_{j}\tag{from (i)}$$

and,

$$ x_{i}=\sum_{j=1}^{n} Q_{ij} x'_{j}$$

So together,

$$ \sum_{j=1}^{n} Q_{ij} x'_{j}=\sum_{j=1}^{n} P_{ij} x'_{j}$$

$$ \sum_{j=1}^{n} (Q_{ij} - P_{ij})\, x'_{j}= 0$$

The way I showed that this implies $P = Q$ is by supposing $A := P - Q \neq 0^{n\times n}$ and deriving a contradiction. If $A \neq 0^{n\times n}$, we can choose a row $r$ whose $k$-th entry $A_{rk}$ is non-zero; since the identity above holds for every choice of $x'_1, \dots, x'_n$, we may plug in $x'_k = 1$ and $x'_j = 0$ for every $j \neq k$, which leaves the non-zero quantity $A_{rk}$ equal to $0$, a contradiction. Hence $A = 0^{n\times n}$, so $P = Q$, and since (a) holds for $Q$ and $P = Q$, it follows that (a) holds for $P$.
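In hindsight, I suppose the same argument can be compressed with no contradiction at all: since $\mathcal{B}'$ is a basis, every $n$-tuple in $F^n$ occurs as a coordinate vector $x'$, so we may take $x' = e_k$, the $k$-th standard tuple, and then

$$ \sum_{j=1}^{n} (Q_{ij} - P_{ij})\,(e_k)_j = Q_{ik} - P_{ik} = 0 \qquad \text{for all } i, k,$$

so the entries of $P$ and $Q$ agree one at a time.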

I'm pretty sure the proof is correct, since we can plug in arbitrary values for $x'_{1}, \dots , x'_{n} \in F$ (every $n$-tuple arises as a coordinate vector, and $F$, being a field, certainly contains $0$ and $1$). But this proof seems lengthy, and it is nowhere near as immediate as Hoffman and Kunze make it sound, so I think I'm missing something here. I would be very thankful for a good explanation. Thanks!

Best Answer

Representing $\alpha_j'$ in the ordered basis $\mathcal{B}' = (\alpha_1',\dots,\alpha_n')$ just gives the standard basis vector $e_j$, because $\alpha_j' = 0\alpha_1' + \dots + 1\alpha_j' + \dots + 0 \alpha_n'$; that is, $[\alpha_j']_{\mathcal B'} = e_j$. Multiplying $P$ by $e_j$ picks out the $j$-th column of $P$. So applying equation (i) to $\alpha = \alpha_j'$ says that

$$ [\alpha_j']_{\mathcal B} = \begin{bmatrix} P_{1,j} \\ \vdots \\ P_{n,j} \end{bmatrix}. $$

By definition of $[\alpha_j']_{\mathcal B}$, this means that $$ \alpha_j' = P_{1,j}\alpha_1 + \dots + P_{n,j}\alpha_n = \sum_{i = 1}^n P_{i,j} \alpha_i. $$
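If a numerical sanity check helps, here is a minimal NumPy sketch of my own (not from the book). It works over $F = \mathbb{R}$ and stores basis vectors as the columns of a matrix; the names `B`, `Bp`, and the column-encoding convention are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Basis B: columns are the basis vectors alpha_1, ..., alpha_n.
# A random Gaussian matrix is invertible with probability 1.
B = rng.normal(size=(n, n))
P = rng.normal(size=(n, n))  # plays the role of the invertible P in Theorem 8

# New basis B' defined by alpha'_j = sum_i P_{ij} alpha_i,
# i.e. column j of B' is B @ P[:, j], so B' = B @ P.
Bp = B @ P

# Coordinates of v in a basis M (columns = basis vectors): solve M x = v.
v = rng.normal(size=n)
x = np.linalg.solve(B, v)    # [v]_B
xp = np.linalg.solve(Bp, v)  # [v]_B'

# Check (i): [v]_B = P [v]_B'
print(np.allclose(x, P @ xp))  # True
```

The check passes precisely because $v = B'x' = (BP)x' = B(Px')$, which is the matrix form of equation (i).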