Can off-diagonals be nonzero for covariance matrix after PCA

covariance, statistics, variance, vectors

I have some data for which I computed the covariance matrix:

$$\Sigma = \begin{bmatrix}3.33 & -1.00 & 3.33 & 33.00 \\ -1.00 & 1.58 & -1.92 & -13.92 \\ 3.33 & -1.92 & 62.92 & -23.42 \\ 33.00 & -13.92 & -23.42 & 398.92 \end{bmatrix}$$

After performing principal component analysis, I find the 4 PCs and transform my data set to the new coordinate system. However, computing the covariance matrix of the transformed data shows nonzero off-diagonal entries:

$$\Sigma = \begin{bmatrix}408.92 & 0.42 & -3.92 & 0.00 \\ 0.42 & 60.25 & -0.42 & 0.00 \\ -3.92 & -0.42 & 0.92 & 0.00 \\ 0.00 & 0.00 & 0.00 & 0.00 \end{bmatrix}$$

From what I understand, the goal of PCA is to decorrelate the variables, i.e. zero out the covariances. Is this possibly a rounding error? What should a covariance matrix look like after PCA?

Best Answer

I've concluded it's a rounding error. After PCA the covariance matrix of the transformed data should be exactly diagonal, with the eigenvalues of the original covariance matrix on the diagonal:

$$V^\top \Sigma V = \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \lambda_3, \lambda_4)$$

where the columns of $V$ are the orthonormal eigenvectors of $\Sigma$. The small nonzero off-diagonals arise from transforming the data with rounded (finite-precision) eigenvectors, so they shrink toward machine precision as the precision of the eigenvectors increases.
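This is easy to check numerically. Below is a minimal sketch with synthetic data (the data and variable names here are hypothetical, not from the question): it eigendecomposes the sample covariance, projects the data onto the PCs, and confirms the new covariance is diagonal up to floating-point precision.

```python
import numpy as np

# Synthetic 4-dimensional data with correlated columns (hypothetical example)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)) @ rng.normal(size=(4, 4))
Xc = X - X.mean(axis=0)          # center the data
S = np.cov(Xc, rowvar=False)     # sample covariance matrix

# Eigendecomposition of S; columns of V are the principal components
eigvals, V = np.linalg.eigh(S)

# Project the data onto the PCs and recompute the covariance
Z = Xc @ V
S_pca = np.cov(Z, rowvar=False)

# Off-diagonals vanish only up to floating-point precision,
# and the diagonal recovers the eigenvalues of S
off_diag = S_pca - np.diag(np.diag(S_pca))
print(np.max(np.abs(off_diag)))              # tiny, but not exactly zero
print(np.allclose(np.diag(S_pca), eigvals))  # True
```

With full-precision eigenvectors the off-diagonals are on the order of machine epsilon; using eigenvectors rounded to a couple of decimal places, as in the question, leaves visibly nonzero covariances like the $-3.92$ above.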
