Solved – Eigenvectors of a covariance matrix with only positive elements

mathematical-statistics, pca, self-study

If all the elements of a positive-definite covariance matrix are positive, how can I prove that the coefficients [elements] of the first principal component [first eigenvector] are all of the same sign, and the coefficients [elements] of all other principal components [eigenvectors] cannot all be of the same sign?

This is Problem 11.11 from "An Introduction to Multivariate Statistical Analysis" by Anderson, 3rd ed. I have been trying to prove it for several days…

Best Answer

With your new information, that all the elements of the positive-definite matrix are positive, it becomes easy. While the result follows directly from the Perron-Frobenius theorem (which applies to square matrices with non-negative elements, symmetric or not), in the symmetric case a direct proof is much simpler.

Let the positive-definite matrix be $S$. The eigenvector corresponding to the largest eigenvalue is the vector $x$ attaining the maximum in the following problem: $$ \lambda_{\mathrm{max}} = \max_{\{x \colon \| x\|=1\}} x^T S x $$ where $\lambda_{\mathrm{max}}$ is the largest eigenvalue; that is, the leading eigenvector is the argmax of this problem.
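Not part of the proof, but here is a quick numerical sanity check of this characterization (a sketch assuming NumPy; the random 5×5 matrix and the seed are arbitrary choices). It builds a positive-definite matrix with all-positive elements and confirms that the eigenvector of the largest eigenvalue has components of a single sign, with its Rayleigh quotient equal to $\lambda_{\mathrm{max}}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A @ A.T is positive semi-definite, and with A entrywise positive every
# element of the product is positive; a small ridge makes it positive definite.
A = rng.uniform(0.1, 1.0, size=(5, 5))
S = A @ A.T + 0.1 * np.eye(5)

eigvals, eigvecs = np.linalg.eigh(S)  # eigh returns ascending eigenvalues
top = eigvecs[:, -1]                  # eigenvector of the largest eigenvalue

print(np.all(top > 0) or np.all(top < 0))      # True: one sign throughout
print(np.isclose(top @ S @ top, eigvals[-1]))  # True: quotient hits lambda_max
```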

Suppose, to get a contradiction, that the maximizing $x$ has components of mixed sign; by relabeling we may assume $x_1$ is negative while the other components of $x$ are non-negative. We can write $$ x^T S x = x_1^2 s_{11} + 2x_1 \sum_{j=2}^m s_{1j} x_j + \sum_{i=2}^m \sum_{j=2}^m x_i s_{ij} x_j. $$ The first and third terms are unchanged when we switch the sign of $x_1$, which also respects the restriction on the norm. The second term is strictly negative, since every $s_{1j}$ is positive, $x_1 < 0$, and at least one $x_j$ with $j \geq 2$ is strictly positive (otherwise $x$ would not have mixed signs). Switching the sign of $x_1$ therefore gives a strictly larger value of $x^T S x$, which is the contradiction you need. A similar argument works for any other pattern of negative/positive signs, so the leading eigenvector has all components of the same sign; normalizing so they are non-negative, the eigenvector equation $\lambda_{\mathrm{max}} x_i = \sum_j s_{ij} x_j$ then shows every component is strictly positive, since all $s_{ij} > 0$ and some $x_j > 0$. Finally, every other eigenvector is orthogonal to this one, and a nonzero vector whose components all have the same sign cannot be orthogonal to a strictly positive vector; hence the remaining eigenvectors must have components of both signs.
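A similar numerical sketch (again assuming NumPy; the 4×4 matrix and the test vector are arbitrary choices) illustrates both halves of the argument: flipping the negative sign strictly increases the quadratic form, and every non-leading eigenvector has components of both signs.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, size=(4, 4))
S = A @ A.T + 0.1 * np.eye(4)        # positive definite, all elements positive

# Sign flip: a mixed-sign unit vector vs. the same vector with signs flipped.
x = np.array([-0.5, 0.5, 0.5, 0.5])  # unit vector with one negative component
y = np.abs(x)                        # same norm, negative component flipped
print(x @ S @ x < y @ S @ y)         # True: x cannot be the maximizer

# Orthogonality: every non-leading eigenvector has components of both signs.
_, eigvecs = np.linalg.eigh(S)
for v in eigvecs[:, :-1].T:          # all eigenvectors except the leading one
    print(np.any(v > 0) and np.any(v < 0))  # True for each
```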