[Math] Interpretation of Matrix Diagonalization

linear algebra

If $A$ is a non-singular matrix, what good does it do to construct another matrix, say $P$, whose columns form a basis consisting of eigenvectors of $A$? Does this have something to do with the eigenvectors of $A$ forming a basis for the transformation? Also, what is the significance of the diagonal matrix, say $\Lambda$?


Why is diagonalization important? Even more so, what about orthogonal diagonalization? What do they signify?

Could you please elaborate?
[Sorry for asking so many questions at once; I am fairly new to linear algebra.]

Any help is Much Appreciated!

Thank You!

Best Answer

Let me show you with an example how useful diagonalization can be.

There are many situations where one needs to calculate the exponential $e^A$ of a matrix $A$. This makes perfect sense as one takes $$ e^A=\sum_{k=0}^\infty\frac{A^k}{k!}. $$ Now, good luck calculating all the powers of $A$!

But if we know that $A$ is diagonalizable, then $A=PDP^{-1}$ with $D$ diagonal. Then $$ A^2=PDP^{-1}PDP^{-1}=PD^2P^{-1}, $$ and similarly $A^k=PD^kP^{-1}$ for all $k$. As $D$ is diagonal, its powers consist simply of the powers of its entries. So $e^D$ is the diagonal matrix with each diagonal entry the exponential of the corresponding entry of $D$, and $$ e^A=\sum_{k=0}^\infty\frac{A^k}{k!}=Pe^DP^{-1}. $$
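The identity $e^A = P e^D P^{-1}$ is easy to check numerically. Here is a minimal sketch in NumPy (the matrix $A$ is made up for illustration); the result is cross-checked against a truncated power series $\sum_k A^k/k!$:

```python
import numpy as np

# A diagonalizable example matrix (chosen only for illustration).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigendecomposition: columns of P are eigenvectors, w the eigenvalues,
# so A = P D P^{-1} with D = diag(w).
w, P = np.linalg.eig(A)

# e^A = P e^D P^{-1}, where e^D is diagonal with entries e^{lambda_i}.
expA = P @ np.diag(np.exp(w)) @ np.linalg.inv(P)

# Cross-check against the truncated power series sum_{k=0}^{28} A^k / k!.
series = np.zeros_like(A)
term = np.eye(2)           # A^0 / 0!
for k in range(1, 30):
    series += term
    term = term @ A / k    # A^k / k! from A^{k-1} / (k-1)!
print(np.allclose(expA, series))  # True
```

Note that only the two exponentials $e^{\lambda_i}$ and one matrix inverse are needed, instead of all the powers of $A$.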

A similar idea can be used to find the square root of a positive semidefinite matrix. And of course there are many other uses for diagonalization that might pop up in other answers.
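As a sketch of that square-root idea (which also illustrates orthogonal diagonalization): a symmetric positive semidefinite matrix $S$ has an orthogonal diagonalization $S = QDQ^T$ with nonnegative eigenvalues, so taking the square root of each diagonal entry gives a matrix that squares to $S$. The example matrix below is an assumption, built as $B^TB$ to guarantee positive semidefiniteness:

```python
import numpy as np

# Any matrix of the form B^T B is symmetric positive semidefinite.
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
S = B.T @ B

# For a symmetric matrix, eigh returns an orthogonal diagonalization
# S = Q D Q^T, with Q orthogonal (Q^{-1} = Q^T) and real eigenvalues w >= 0.
w, Q = np.linalg.eigh(S)

# Square root: take the square root of each eigenvalue on the diagonal.
sqrtS = Q @ np.diag(np.sqrt(w)) @ Q.T

print(np.allclose(sqrtS @ sqrtS, S))  # True
```

Squaring `sqrtS` reproduces $S$ because $Q D^{1/2} Q^T Q D^{1/2} Q^T = Q D Q^T$, using $Q^TQ = I$.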