[Math] In diagonalization, can the eigenvector matrix be any scalar multiple?

linear algebra

One can decompose a diagonalizable matrix A as A = C D C^-1, where the columns of C are the eigenvectors and D is a diagonal matrix with the eigenvalues on the diagonal. So here's where I get confused. If I start with a random eigenvalue matrix D

> D
     [,1] [,2]
[1,]    7    0
[2,]    0    5

and a random eigenvector matrix C

> C
     [,1] [,2]
[1,]    4    2
[2,]    1    3

There should be some matrix A with those eigenvectors and eigenvalues. When I compute A by multiplying out C D C^-1, I get

> A<-C%*%D%*%solve(C)
> A
     [,1] [,2]
[1,]  7.4 -1.6
[2,]  0.6  4.6

My understanding is that if I then work backwards and diagonalize A, I should get the same matrices C and D that I started with to get A in the first place. But for reasons that escape me, I don't.

> eigen(A)
$values
[1] 7 5

$vectors
          [,1]      [,2]
[1,] 0.9701425 0.5547002
[2,] 0.2425356 0.8320503

The first eigenvector is a multiple of column 1 of C, and the second eigenvector is a multiple of column 2 of C. For some reason it feels strange that I have this relationship:

xC D (xC)^-1 = C D C^-1, where x is a scalar. Did I screw up somewhere, or is this true?
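
A quick numerical check (taking x = 2 as an arbitrary nonzero scalar) seems to support it:

> x <- 2
> (x*C) %*% D %*% solve(x*C)
     [,1] [,2]
[1,]  7.4 -1.6
[2,]  0.6  4.6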

Best Answer

It's true. In general, if $a$ is a nonzero scalar and $M$ is an invertible matrix, then $(aM)^{-1} = a^{-1}M^{-1}$. So $xCD(xC)^{-1} = xCDx^{-1}C^{-1} = xx^{-1}CDC^{-1} = CDC^{-1}$. Note that the second equality in this chain is only true because $x$ is a scalar (rather than a matrix). In fact, each column of $C$ can be rescaled by a different nonzero scalar: if $S$ is an invertible diagonal matrix, then $(CS)D(CS)^{-1} = CSDS^{-1}C^{-1} = CDSS^{-1}C^{-1} = CDC^{-1}$, where the middle step uses $SD = DS$ (diagonal matrices commute). That is why eigen returns columns that are different multiples of yours: it normalizes each eigenvector to unit length.
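
As a quick sketch of that last point, reusing C from the question: rescaling each column of C to unit length reproduces eigen(A)$vectors exactly.

> S <- diag(1/sqrt(colSums(C^2)))   # divide each column of C by its Euclidean length
> C %*% S
          [,1]      [,2]
[1,] 0.9701425 0.5547002
[2,] 0.2425356 0.8320503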