Finding Orthonormal Eigenvectors of a $4\times4$ Variance-Covariance Matrix

eigenvalues-eigenvectors · linear-algebra · probability-theory

Let $U_1$ and $U_2$ be two independent uniform random variables on $[0,1]$. Suppose that $X=(X_1,X_2,X_3,X_4)'$ where $X_1=U_1$, $X_2=U_2$, $X_3=U_1+U_2$, and $X_4=U_1-U_2$.

(a) Compute the variance-covariance matrix $\Sigma$ of $X$.

(b) Verify that $v_1=\frac{1}{\sqrt{3}}(0,1,1,-1)$, $v_2=\frac{1}{\sqrt{3}}(1,0,1,1)$, $v_3=\frac{1}{\sqrt{6}}(0,2,-1,1)$, and $v_4=\frac{1}{\sqrt{6}}(2,0,-1,-1)$ are orthonormal eigenvectors of $\Sigma$. Find the corresponding eigenvalues.

We have that $$\Sigma=\begin{bmatrix}\frac{1}{12}&0&\frac{1}{12}&\frac{1}{12}\\0&\frac{1}{12}&\frac{1}{12}&-\frac{1}{12}\\ \frac{1}{12}&\frac{1}{12}&\frac{1}{6}&0\\ \frac{1}{12}&-\frac{1}{12}&0&\frac{1}{6}\end{bmatrix}$$

since
$$\begin{align*}
\mathsf{Var}(X_1)&=\mathsf{Var}(X_2)=\mathsf{Var}(U_1)=\tfrac{1}{12},\\
\mathsf{Var}(X_3)&=\mathsf{Var}(U_1+U_2)=\mathsf{Var}(U_1)+\mathsf{Var}(U_2)=\tfrac{1}{6},\\
\mathsf{Var}(X_4)&=\mathsf{Var}(U_1-U_2)=\mathsf{Var}(U_1)+\mathsf{Var}(U_2)=\tfrac{1}{6},\\
\mathsf{Cov}(X_1,X_2)&=\mathsf{Cov}(U_1,U_2)=0,\\
\mathsf{Cov}(X_1,X_3)&=\mathsf{Cov}(U_1,U_1+U_2)=\mathsf{Var}(U_1)+\mathsf{Cov}(U_1,U_2)=\tfrac{1}{12},\\
\mathsf{Cov}(X_1,X_4)&=\mathsf{Cov}(U_1,U_1-U_2)=\mathsf{Var}(U_1)-\mathsf{Cov}(U_1,U_2)=\tfrac{1}{12},\\
\mathsf{Cov}(X_2,X_3)&=\mathsf{Cov}(U_2,U_1+U_2)=\mathsf{Cov}(U_1,U_2)+\mathsf{Var}(U_2)=\tfrac{1}{12},\\
\mathsf{Cov}(X_2,X_4)&=\mathsf{Cov}(U_2,U_1-U_2)=\mathsf{Cov}(U_1,U_2)-\mathsf{Var}(U_2)=-\tfrac{1}{12},\\
\mathsf{Cov}(X_3,X_4)&=\mathsf{Cov}(U_1+U_2,U_1-U_2)=\mathsf{Var}(U_1)-\mathsf{Var}(U_2)=\tfrac{1}{12}-\tfrac{1}{12}=0.
\end{align*}$$
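These entries can be obtained all at once: writing $X = A(U_1,U_2)'$ for the $4\times 2$ coefficient matrix $A$, we have $\Sigma = A\,\mathsf{Cov}(U)\,A' = \frac{1}{12}AA'$. A quick numerical sketch (using numpy; the Monte Carlo comparison at the end is only a sanity check, with a tolerance chosen loosely for the sample size):

```python
import numpy as np

# X = A @ (U1, U2): rows give the coefficients of X1, X2, X3, X4
A = np.array([[1, 0],
              [0, 1],
              [1, 1],
              [1, -1]], dtype=float)

# Cov((U1, U2)') = (1/12) I, so Sigma = A @ ((1/12) I) @ A.T = A @ A.T / 12
Sigma = A @ A.T / 12

# Monte Carlo sanity check against the sample covariance of simulated X
rng = np.random.default_rng(0)
U = rng.uniform(size=(2, 200_000))
Sigma_hat = np.cov(A @ U)
assert np.allclose(Sigma_hat, Sigma, atol=5e-3)
```

This reproduces the matrix above: for instance `Sigma[0, 0]` is $\frac{1}{12}$ and `Sigma[2, 2]` is $\frac{1}{6}$.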

Software then gives that

$$\begin{align*}
\det(\Sigma-\lambda I)
&=\det\left(\begin{bmatrix}\frac{1}{12}-\lambda&0&\frac{1}{12}&\frac{1}{12}\\0&\frac{1}{12}-\lambda&\frac{1}{12}&-\frac{1}{12}\\ \frac{1}{12}&\frac{1}{12}&\frac{1}{6}-\lambda&0\\ \frac{1}{12}&-\frac{1}{12}&0&\frac{1}{6}-\lambda\end{bmatrix}\right)\\\\
&=\det\left(\begin{bmatrix}\frac{1}{12}-\lambda&0&\frac{1}{12}&\frac{1}{12}\\ 0&\frac{1}{12}-\lambda&\frac{1}{12}&-\frac{1}{12}\\ 0&0&\frac{3\lambda\left(4\lambda-1\right)}{1-12\lambda}&0\\ 0&0&0&\frac{3\lambda\left(4\lambda-1\right)}{1-12\lambda}\end{bmatrix}\right)\\\\
&=\frac{\lambda^2\left(1-4\lambda\right)^2}{16}
\end{align*}$$

Hence $\lambda_1=\frac{1}{4}$ and $\lambda_2=0$, each with algebraic multiplicity $2$.
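As a cross-check of the characteristic polynomial, the eigenvalues can be computed numerically (a sketch with numpy; `eigvalsh` exploits the symmetry of $\Sigma$ and returns the eigenvalues in ascending order):

```python
import numpy as np

# Sigma written with a common factor of 1/12 pulled out
Sigma = np.array([[1, 0, 1, 1],
                  [0, 1, 1, -1],
                  [1, 1, 2, 0],
                  [1, -1, 0, 2]]) / 12

# Symmetric matrix: eigvalsh returns real eigenvalues, ascending
eigvals = np.linalg.eigvalsh(Sigma)
assert np.allclose(eigvals, [0, 0, 0.25, 0.25])
```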

We have that for $\lambda_2=0$,

$$\begin{bmatrix}\frac{1}{12}&0&\frac{1}{12}&\frac{1}{12}\\0&\frac{1}{12}&\frac{1}{12}&-\frac{1}{12}\\ \frac{1}{12}&\frac{1}{12}&\frac{1}{6}&0\\ \frac{1}{12}&-\frac{1}{12}&0&\frac{1}{6}\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\\x_4\end{bmatrix}=\begin{bmatrix}0\\0\\0\\0\end{bmatrix}$$

means $x_2=-x_3+x_4$ and $x_1=-x_3-x_4$. Letting $x_3=x_4=1$, we have $x_2=0$ and $x_1=-2$.

Hence a normalized associated eigenvector is $\frac{1}{\sqrt{6}}(-2,0,1,1)$, which is a scalar multiple of the provided $v_4$ and therefore spans the same direction.

Similarly, we find that for $\lambda_1=\frac{1}{4}$ we get an associated eigenvector of $\frac{1}{\sqrt{3}}(1,0,1,1)$ which is $v_2$.

Where do $\boldsymbol{v_1}$ and $\boldsymbol{v_3}$ come from?

Best Answer

Although finding eigenvectors and eigenvalues of this covariance matrix for yourself is good practice, it’s not really necessary to do so here since you’ve been given a set of vectors and asked to verify that they are indeed eigenvectors of the matrix. This is a straightforward matter of multiplying each one by the covariance matrix. For example, $$\Sigma \begin{bmatrix}0\\1\\1\\-1\end{bmatrix} = \begin{bmatrix}0\\\frac14\\\frac14\\-\frac14\end{bmatrix},$$ so $(0,1,1,-1)^T$ is an eigenvector with eigenvalue $\frac14$. As any nonzero scalar multiple of an eigenvector is also an eigenvector, this verifies that $v_1$ is an eigenvector of $\Sigma$ with eigenvalue $\frac14$. Each of the given vectors obviously has unit length, and it’s fairly easy to verify by inspection that they are pairwise orthogonal, so they do indeed form an orthonormal set.
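The verification just described can be sketched numerically (assuming numpy; `V` stacks the given vectors as columns, with the eigenvalue order $\frac14,\frac14,0,0$ matching $v_1,v_2,v_3,v_4$):

```python
import numpy as np

Sigma = np.array([[1, 0, 1, 1],
                  [0, 1, 1, -1],
                  [1, 1, 2, 0],
                  [1, -1, 0, 2]]) / 12

# Columns are v1, v2, v3, v4 exactly as given in the problem
V = np.column_stack([
    np.array([0, 1, 1, -1]) / np.sqrt(3),
    np.array([1, 0, 1, 1]) / np.sqrt(3),
    np.array([0, 2, -1, 1]) / np.sqrt(6),
    np.array([2, 0, -1, -1]) / np.sqrt(6),
])

# Orthonormality: V' V = I
assert np.allclose(V.T @ V, np.eye(4))

# Eigenvector check: Sigma V = V diag(1/4, 1/4, 0, 0)
lams = np.array([0.25, 0.25, 0.0, 0.0])
assert np.allclose(Sigma @ V, V * lams)
```

The second assertion checks all four eigenvector equations $\Sigma v_i = \lambda_i v_i$ in one matrix product.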

As for your question about finding another pair of eigenvectors, note that both eigenvalues that you computed from the characteristic equation have algebraic multiplicity $2$. Every real symmetric matrix is diagonalizable, so their geometric multiplicity is also $2$, i.e., the corresponding eigenspaces are two-dimensional. Using the usual method of Gaussian elimination, $\Sigma$ reduces to $$\begin{bmatrix}1&0&1&1\\0&1&1&-1\\0&0&0&0\\0&0&0&0\end{bmatrix},$$ from which we can read that the eigenspace of $0$ is spanned by $w_1=(1,1,-1,0)^T$ and $w_2=(1,-1,0,-1)^T$. Every linear combination of these two vectors is an eigenvector of $0$. In particular, $v_3=\frac1{\sqrt6}(w_1-w_2)$ and $v_4=\frac1{\sqrt6}(w_1+w_2)$. Similarly, $\Sigma-\frac14I$ reduces to $$\begin{bmatrix}1&0&-\frac12&-\frac12\\0&1&-\frac12&\frac12\\0&0&0&0\\0&0&0&0\end{bmatrix},$$ so the eigenspace of $\frac14$ is spanned by $(1,1,2,0)^T$ and $(1,-1,0,2)^T$. I’ll leave finding the linear combinations of these two vectors that produce $v_1$ and $v_2$ to you.
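The spanning-set claims above can be checked the same way (a sketch assuming numpy; $w_1,w_2$ and the two vectors spanning the $\frac14$-eigenspace are taken directly from the reduction above):

```python
import numpy as np

Sigma = np.array([[1, 0, 1, 1],
                  [0, 1, 1, -1],
                  [1, 1, 2, 0],
                  [1, -1, 0, 2]]) / 12

# Basis of the eigenspace of 0, read off from the row reduction
w1 = np.array([1, 1, -1, 0], dtype=float)
w2 = np.array([1, -1, 0, -1], dtype=float)
assert np.allclose(Sigma @ w1, 0)
assert np.allclose(Sigma @ w2, 0)

# The given unit vectors are the stated combinations of w1, w2
v3 = np.array([0, 2, -1, 1]) / np.sqrt(6)
v4 = np.array([2, 0, -1, -1]) / np.sqrt(6)
assert np.allclose(v3, (w1 - w2) / np.sqrt(6))
assert np.allclose(v4, (w1 + w2) / np.sqrt(6))

# Basis of the eigenspace of 1/4, from reducing Sigma - I/4
u1 = np.array([1, 1, 2, 0], dtype=float)
u2 = np.array([1, -1, 0, 2], dtype=float)
assert np.allclose(Sigma @ u1, u1 / 4)
assert np.allclose(Sigma @ u2, u2 / 4)
```

Finding the combinations of `u1` and `u2` that give $v_1$ and $v_2$ is left as the exercise suggests.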
