[Math] When does the inverse of a covariance matrix exist

covariance, linear algebra, matrices, positive definite, probability theory

We know that a square matrix is a covariance matrix of some random vector if and only if it is symmetric and positive semi-definite (see Covariance matrix). We also know that every symmetric positive definite matrix is invertible (see Positive definite). So it seems that the inverse of a covariance matrix does not always exist.

Does the inverse of a covariance matrix exist if and only if the covariance matrix is positive definite? How can I intuitively understand the situation when the inverse of a covariance matrix does not exist (does it mean that some of the random variables of the random vector are equal to a constant almost surely)?

Any help will be much appreciated!

Best Answer

Yes: a covariance matrix is symmetric and positive semi-definite, so it is invertible if and only if it is positive definite. If the covariance matrix is not positive definite, then (being symmetric positive semi-definite) it has $0$ as an eigenvalue, so we have some $a \in \mathbf R^n \setminus \{0\}$ with $\def\C{\mathop{\rm Cov}}\C(X)a = 0$. Hence \begin{align*} 0 &= a^t \C(X)a\\ &= \sum_{i,j} a_i \C(X_i, X_j) a_j\\ &= \mathop{\rm Var}\left(\sum_i a_i X_i\right). \end{align*} So there is some linear combination of the $X_i$ which has zero variance and hence is constant, say equal to $\alpha$, almost surely. Letting $H := \{x \in \mathbf{R}^n: \sum_{i} a_i x_i = \alpha\}$, this means, as @drhab wrote, that $\mathbf P(X \in H) = 1$ for the hyperplane $H$. So the individual $X_i$ need not be constant; rather, $X$ is concentrated on a hyperplane because some linear combination of its components is almost surely constant.
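
A minimal numerical sketch of this degenerate case, assuming the illustrative construction $X_3 = X_1 + X_2$ (the variable names and the use of NumPy are my own choices, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent components plus a third that is an exact linear
# combination of them: X3 = X1 + X2 (an illustrative choice).
x1 = rng.normal(size=10_000)
x2 = rng.normal(size=10_000)
x3 = x1 + x2
X = np.stack([x1, x2, x3])            # rows are variables, columns are samples

C = np.cov(X)                         # sample covariance matrix (3 x 3)
eigvals, eigvecs = np.linalg.eigh(C)  # symmetric PSD: eigenvalues >= 0, ascending
print(eigvals)                        # smallest eigenvalue is (numerically) 0

# The eigenvector a for the zero eigenvalue gives a linear combination
# sum_i a_i X_i with zero variance, i.e. one that is constant almost surely;
# here a is proportional to (1, 1, -1).
a = eigvecs[:, 0]
print(a)
print(np.var(a @ X))                  # ~0: this combination is constant
```

Here $a \propto (1, 1, -1)$ and $X_1 + X_2 - X_3 = 0$ almost surely, so $X$ lies in the hyperplane $\{x : x_1 + x_2 - x_3 = 0\}$ and the covariance matrix has no inverse, even though none of the $X_i$ is itself constant.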