Solved – PCA eigenvectors with dimensionality reduction

eigenvalues, matrix-decomposition, pca

I want to understand how I can compute the eigenvectors and eigenvalues of a matrix using dimensionality reduction. I have a matrix $M$ of dimensions $n \times d$. Using the dimensionality-reduction trick I can compute the eigenvectors and eigenvalues of the covariance matrix $MM^t$. After computing these eigenvectors and eigenvalues, how can I compute the eigenvectors of the original matrix?

Best Answer

Your original matrix is rectangular, so eigenvalues and eigenvectors are not defined for it. However, there are analogous objects, called singular values and singular vectors, which are defined. They are related as follows:

  • The left-singular vectors of $M$ are eigenvectors of $MM^t$.
  • The right-singular vectors of $M$ are eigenvectors of $M^tM$.
  • The non-zero singular values of $M$ are the square roots of the non-zero eigenvalues of both $M^tM$ and $MM^t$.
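
To make the recovery concrete, here is a minimal NumPy sketch (the matrix sizes, random seed, and variable names are illustrative assumptions, not from the question): it eigendecomposes the small $n \times n$ matrix $MM^t$ and then recovers the eigenvectors of the large $d \times d$ matrix $M^tM$ (the right-singular vectors, i.e. the PCA directions in feature space) via $v_i = M^t u_i / \sigma_i$, checking the result against a direct SVD.

```python
import numpy as np

# Hypothetical data matrix: n = 5 samples, d = 100 features (n << d), so the
# n x n matrix M M^t is much cheaper to eigendecompose than the d x d M^t M.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 100))

# Eigendecompose the small n x n matrix M M^t.
eigvals, U = np.linalg.eigh(M @ M.T)

# Sort eigenvalues and eigenvectors in decreasing order.
order = np.argsort(eigvals)[::-1]
eigvals, U = eigvals[order], U[:, order]

# Non-zero singular values of M are the square roots of these eigenvalues.
sigma = np.sqrt(np.clip(eigvals, 0, None))

# Recover the right-singular vectors of M (eigenvectors of M^t M):
# v_i = M^t u_i / sigma_i.  Columns of V are the recovered eigenvectors.
V = M.T @ U / sigma

# Sanity check against a direct SVD of M.
U_svd, s_svd, Vt_svd = np.linalg.svd(M, full_matrices=False)
print(np.allclose(sigma, s_svd))                                   # same singular values
print(np.allclose(np.abs(V.T @ Vt_svd.T), np.eye(5), atol=1e-6))   # same directions up to sign
```

(For actual PCA you would center the columns of $M$ before forming $MM^t$; the recovery step itself is unchanged.)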

So the singular values of $M$ are the square roots of the eigenvalues of $MM^t$ which you calculated. If your original matrix $M$ happened to be square and symmetric positive semi-definite (as a covariance matrix is), its eigenvalues would coincide with those singular values, and its eigenvectors with the singular vectors.

You can find more details in Wikipedia's article on Singular Value Decomposition (SVD), which is closely related to PCA.