[Math] Relation between eigenvectors of the covariance matrix and right singular vectors of the SVD; diagonal matrices

eigenvalues-eigenvectors, linear-algebra, svd

I have an $m \times n$ data matrix $X$ ($m$ instances and $n$ features), on which I calculate the covariance matrix $C$ and perform an eigenvalue decomposition, so $C = W \Lambda W'$, where the columns of $W$ are the eigenvectors and $\Lambda$ is a diagonal matrix of the eigenvalues.

Next, I performed the SVD (singular value decomposition) of $X$, so $X = U \Sigma V'$. Now I noticed a strange thing: the eigenvectors $W$ and the right singular vectors $V$ are equal, at least in magnitude. There were some differences from the 4th decimal place onward (I used Matlab), but I guess that could just be numerical error. However, every odd-numbered vector in $W$ and $V$ had opposite signs, while every even-numbered vector had the same sign!

So why does this happen? I know that the EVD and the SVD are somehow connected; from the formulation, it looks like the SVD is a generalised form of the EVD, but I am not sure about that. Why are the $V$ vectors and the $W$ vectors equal in magnitude? What does this mean intuitively? I have used PCA and SVD and have seen the term loading matrix used for both $W$ and $V$, so I knew they should be related. Can someone explain this?

And what is the significance of the left singular vectors? Are they useful anywhere?

Finally, I have a question about diagonal matrices. Wikipedia, in its page on orthogonal matrices, says that a matrix $Q$ is orthogonal if its transpose is equal to its inverse: $Q' = Q^{-1}$, or equivalently $QQ' = Q'Q = I$. But for a diagonal matrix the first equation does not seem to hold, while the second one does (since the inverse of a diagonal matrix just has the reciprocals of the main diagonal elements). So does that mean a diagonal matrix is not orthogonal? Can someone clarify this difference?

Best Answer

First, the sign differences in the eigenvectors. Let $(\lambda,v)$ be an eigenpair of $A$, i.e., $A v = \lambda v$ with $v \neq 0$, and let $\alpha \in \mathbb{C} \setminus \{0\}$. Then

$$A (\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v).$$

So, $v$ is an eigenvector of $A$ if and only if $\alpha v$ is an eigenvector of $A$, for the same eigenvalue $\lambda$. Both are equally "good", unless you desire some additional properties. In particular, taking $\alpha = -1$ flips the sign, and a numerical routine is free to return either sign for each eigenvector or singular vector; that is exactly the alternating sign pattern you observed. Note that this works for any $A$, not just $A = C$.
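If you want to check this numerically, here is a minimal sketch in NumPy (the question used Matlab; the sizes, random data, and mean-centering below are just assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 5                        # m instances, n features (arbitrary sizes)
X = rng.standard_normal((m, n))
X = X - X.mean(axis=0)               # mean-center, so that C = X'X / (m - 1)

C = np.cov(X, rowvar=False)          # covariance matrix of the n features
evals, W = np.linalg.eigh(C)         # eigenvectors of C in the columns of W
evals, W = evals[::-1], W[:, ::-1]   # eigh sorts ascending; flip to descending

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T                             # right singular vectors in the columns of V

# The columns agree up to an arbitrary sign per column ...
print(np.allclose(np.abs(W), np.abs(V)))       # True
# ... and the eigenvalues of C are the squared singular values / (m - 1).
print(np.allclose(evals, s**2 / (m - 1)))      # True
```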

Second, the significance of the left singular vectors is that they give the eigenvalue decomposition of $XX^T$ (in your notation: $X^T = X'$), just as the right singular vectors give the eigenvalue decomposition of $X^TX$.
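To make the connection explicit, plug the SVD into these two products (here I assume your $X$ is mean-centered, so that $C = X^TX/(m-1)$):

$$X^TX = V\Sigma^T U^T U \Sigma V^T = V\,(\Sigma^T\Sigma)\,V^T, \qquad XX^T = U\Sigma V^T V \Sigma^T U^T = U\,(\Sigma\Sigma^T)\,U^T.$$

Both right-hand sides are eigenvalue decompositions, because $\Sigma^T\Sigma$ and $\Sigma\Sigma^T$ are diagonal with the squared singular values $\sigma_k^2$ on their diagonals. So the columns of $V$ are eigenvectors of $X^TX$ (and hence of $C$, with eigenvalues $\sigma_k^2/(m-1)$), which is why they coincide with your $W$ up to sign, while the columns of $U$ are eigenvectors of $XX^T$.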

Third, a real diagonal matrix is orthogonal if and only if each of its diagonal elements is either $1$ or $-1$. Let us prove this.

Let $D = \mathop{\rm diag}(d_1,\dots,d_n)$. Obviously, $D = D^T$, so

$$D^TD = \mathop{\rm diag}(d_1^2,\dots,d_n^2).$$

So, $D^TD = {\rm I}$ if and only if $d_k^2 = 1$ for all $k$, i.e., $d_k = \pm 1$.
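For a concrete example, $D = \mathop{\rm diag}(1,-1)$ is orthogonal, whereas $D = \mathop{\rm diag}(2,1)$ is not: there $D^TD = \mathop{\rm diag}(4,1) \neq {\rm I}$ and $D^{-1} = \mathop{\rm diag}(\tfrac{1}{2},1) \neq D^T$, so in general neither of the two (equivalent) conditions from the Wikipedia definition holds for a diagonal matrix.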

For complex matrices (using the conjugate transpose $Q^*$ instead of the transpose $Q^T$, i.e., asking for unitarity), the same argument gives that $D$ is unitary if and only if $|d_k| = 1$ for all $k$.
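For example, $D = \mathop{\rm diag}(i, e^{i\pi/4})$ is unitary, since both diagonal entries have modulus $1$.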