Solved – Multivariate Gaussian/normal distribution: covariance matrix $\Sigma$ and eigenvectors

distributions, multivariate-analysis, normal-distribution

For a multivariate Gaussian distribution:

[Images: the formula of the multivariate normal distribution and its geometric properties]

I am a bit confused about the derivation of $\Sigma$.

[Image: the derivation of $\Sigma$ from Christopher Bishop's book, Pattern Recognition and Machine Learning]

I still cannot work out the details of this derivation.

Could anyone help me with the steps that lead from $\Sigma u_i = \lambda_i u_i$, with the $u_i$ orthonormal, to the result $\Sigma = \sum_{i=1}^D \lambda_i u_i u_i^T$?

Best Answer

From $\Sigma u_i = \lambda_i u_i$, multiplying on the right by $u_i^T$ gives $$ \Sigma u_i u_i^T = \lambda_i u_i u_i^T, \quad i=1, \ldots, D. $$ Summing these $D$ equations yields $$\Sigma \sum_{i=1}^D u_i u_i^T = \sum_{i=1}^D \lambda_i u_i u_i^T.$$ Let $U=[u_1, \ldots, u_D]$ be the matrix whose columns are the $u_i$. If the $u_i$ are orthonormal, then $U$ is a square orthogonal matrix, so $U^T U = I$ and hence also $\sum_{i=1}^D u_i u_i^T = U U^T = I$. Substituting this into the left-hand side, we conclude $$\Sigma = \sum_{i=1}^D \lambda_i u_i u_i^T.$$
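As a quick numerical sanity check of this identity, here is a sketch in NumPy. The $2\times 2$ covariance matrix below is a made-up example; `np.linalg.eigh` returns the eigenvalues $\lambda_i$ and orthonormal eigenvectors $u_i$ of a symmetric matrix, and we verify both $UU^T = I$ and $\Sigma = \sum_i \lambda_i u_i u_i^T$.

```python
import numpy as np

# A symmetric positive-definite covariance matrix (hypothetical example).
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# eigh is for symmetric/Hermitian matrices; the columns of U are the u_i.
eigvals, U = np.linalg.eigh(Sigma)

D = Sigma.shape[0]

# Orthonormality of the eigenvectors: sum_i u_i u_i^T = U U^T = I.
assert np.allclose(U @ U.T, np.eye(D))

# Reconstruct Sigma as the sum of lambda_i * u_i u_i^T.
reconstruction = sum(eigvals[i] * np.outer(U[:, i], U[:, i]) for i in range(D))
assert np.allclose(reconstruction, Sigma)
```

Both assertions pass, confirming the decomposition for this example.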

As whuber suggested, the eigenvectors must be orthonormal, not merely orthogonal; the OP should verify that this condition holds.

Hope this helps.
