Proof for inverse matrix linear transformation

linear-algebra, linear-transformations, matrices

There is a quite intuitive lemma which states that if $Ax = x$, then $A = \mathbf{I}_n$, the identity matrix of order $n$. Even though it seems obvious, I am not sure my proof is right.

I assume a linear transformation $T:\mathbb{R}^n \rightarrow \mathbb{R}^n$ such that $T(x) = x$. Then there must be a matrix $A$ such that

$$[T(x)]_E = A[x]_E,$$ where $E$ is the canonical (standard) basis. Let $A = (a_{ij})$ for $i = 1, 2, \ldots, n$ and $j = 1, 2, \ldots, n$, and let $[x]_E = (v_1, v_2, \ldots, v_n)'$. This means that

$$ x = \sum_{j=1}^{n} v_j e_j, $$ where the $e_j \in E$ are the canonical basis vectors, whose $i$-th coordinate is $(e_j)_i = 1$ if $i = j$ and $(e_j)_i = 0$ if $i \neq j$.
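This coordinate expansion is easy to check numerically; here is a small sketch with NumPy, using a concrete $n = 3$ and made-up coordinates:

```python
import numpy as np

# Coordinates of x in the canonical basis (arbitrary example values)
v = np.array([2.0, -1.0, 3.0])
E = np.eye(3)  # the columns E[:, j] are the canonical basis vectors e_j

# x as the linear combination sum_j v_j * e_j
x = sum(v[j] * E[:, j] for j in range(3))

# In the canonical basis, the coordinates of x are exactly the v_j
assert np.allclose(x, v)
```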

Also, the matrix product $$ A[x]_E = \sum_{i=1}^{n} \left( \sum_{j=1}^{n} a_{ij} v_j \right) e_i $$ gives us the coordinates of $Ax$: the $i$-th coordinate is $\sum_{j=1}^{n} a_{ij} v_j$.
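As a quick numerical sanity check (just a NumPy sketch, not part of the proof), the double-sum formula agrees coordinate by coordinate with the matrix product:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# The i-th coordinate of A x is the inner sum over j of a_ij * v_j
coords = np.array([sum(A[i, j] * v[j] for j in range(n)) for i in range(n)])

# It matches the usual matrix-vector product
assert np.allclose(coords, A @ v)
```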

Because we know that $Ax = x$, then

$$\sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} v_j e_i = \sum_{j=1}^{n} v_j e_j ,$$ and comparing the $i$-th coordinates gives $\sum_{j=1}^{n} a_{ij} v_j = v_i$. Then $A_{n \times n} = (a_{ij})$ is a matrix whose diagonal entries are $1$ and whose remaining entries are $0$, that is, $\mathbf{I}_n$.

Is my proof logical? It's very hard for me to prove intuitive statements.

Thanks! 🙂

Best Answer

You're right to apply it to the standard basis. By the way, you need $Ax = x\,,\color{blue}{\forall x}$.
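To see why the blue quantifier matters, here is a small counterexample (a NumPy sketch): a matrix that fixes one particular $x$ without being the identity.

```python
import numpy as np

# The swap matrix fixes x = (1, 1)' but is certainly not the 2x2 identity
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x = np.array([1.0, 1.0])

assert np.allclose(A @ x, x)          # Ax = x for this particular x...
assert not np.allclose(A, np.eye(2))  # ...yet A is not the identity
```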

The $v_j$ sort of come out of nowhere.

The point is that $Ae_i$ is the $i$-th column of $A$, so $Ae_i = e_i$ says that every column of $A$ equals the corresponding basis vector.
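That observation is easy to verify numerically (another NumPy sketch with an arbitrary matrix): multiplying by $e_i$ picks out the $i$-th column, so $Ae_i = e_i$ for every $i$ forces every column of $A$ to match the corresponding column of $\mathbf{I}_n$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))

for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0
    # A e_i is exactly the i-th column of A
    assert np.allclose(A @ e_i, A[:, i])
```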
