[Math] Why the transformation matrix is $\Lambda$ in a basis of eigenvectors

eigenvalues-eigenvectors, linear-algebra, matrices

Why is it that, if the eigenvectors of a transformation are chosen as the basis vectors, the transformation matrix $A$ turns out to be the diagonal matrix of eigenvalues $\Lambda$, and the transformation becomes $\Lambda \vec{x}=\vec{c}$?

What is the reason that the transformation matrix can simply be the eigenvalue matrix $\Lambda$?

I tried to prove it to myself, but somehow it doesn't look quite right.
Let the eigenvectors $\vec{b_i}$ form a basis for the vector $\vec{v}$, and let the $c_i$ be the coefficients of the combination of eigenvectors…

$
\vec{v} = c_1 \vec{b_1} + c_2 \vec{b_2} + \dots + c_n \vec{b_n}
$

$
T(\vec{v})= c_1 T(\vec{b_1}) + c_2 T(\vec{b_2}) + \dots + c_n T(\vec{b_n})
$

$
T(\vec{v})= c_1 A\vec{b_1} + c_2 A\vec{b_2} + \dots + c_n A\vec{b_n}
$

$
T(\vec{v})= c_1 \lambda_1 \vec{b_1} + c_2 \lambda_2\vec{b_2} + \dots + c_n \lambda_n\vec{b_n}
$

$
T(\vec{v})= \begin{bmatrix}
& & & \\
\vec{b_1} & \vec{b_2} & \dots & \vec{b_n}\\
& & &
\end{bmatrix}
\begin{bmatrix}
\lambda_1 & & & \\
& \lambda_2 & & \\
& & \ddots & \\
& & & \lambda_n
\end{bmatrix}
\begin{bmatrix}
c_1 & & & \\
& c_2 & & \\
& & \ddots & \\
& & & c_n
\end{bmatrix}
$

The last line is wrong, but I was trying to force out that $\Lambda$, which I couldn't.

Best Answer

The matrix of a linear transformation with respect to a basis has, as its columns, the coordinates of the images of the basis elements under the transformation. If you have a basis of eigenvectors $v_1, \dots, v_n$, then $A v_i = \lambda_i v_i$, so the coordinate vector of $A v_i$ in this basis is $\lambda_i e_i$: each column has only one non-zero coordinate (more precisely, at most one, because $\lambda_i$ may be $0$). The resulting matrix is therefore exactly the diagonal matrix $\Lambda$.
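This is easy to check numerically. The sketch below (using NumPy, with an arbitrary symmetric matrix chosen only for illustration, since symmetric matrices are guaranteed to have a basis of eigenvectors) changes coordinates into the eigenvector basis and confirms that the matrix of the transformation becomes diagonal:

```python
import numpy as np

# A symmetric matrix (illustrative values), so a basis of eigenvectors exists.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of B are eigenvectors of A; lam holds the corresponding eigenvalues.
lam, B = np.linalg.eig(A)

# The matrix of the same transformation in the eigenvector basis:
# convert coordinates from the basis B, apply A, convert back.
Lambda = np.linalg.inv(B) @ A @ B

# Lambda is diagonal, with the eigenvalues on the diagonal.
assert np.allclose(Lambda, np.diag(lam))
```

The key line is `inv(B) @ A @ B`: the columns of $B$ send basis coordinates to standard coordinates, and $B^{-1}$ sends them back, so this product is precisely "the matrix of $A$ with respect to the basis of eigenvectors".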
