[Math] Spectral Decomposition for matrices with eigenvalue(s) 0

linear algebra

When expressing the spectral decomposition of a matrix that has eigenvalues of 0, do you express the corresponding matrices with factor '0', or leave them out altogether?

EDIT: let's consider some matrix M, where
$$
M =
\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1 \\
\end{bmatrix}
$$
Then, would the spectral decomposition of this matrix M be

(i)
$$
\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1 \\
\end{bmatrix}
=
3
\begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\end{bmatrix}
+
0
\begin{bmatrix}
\frac{1}{2} & -\frac{1}{2} & 0 \\
-\frac{1}{2} & \frac{1}{2} & 0 \\
0 & 0 & 0 \\
\end{bmatrix}
+
0
\begin{bmatrix}
\frac{1}{6} & \frac{1}{6} & -\frac{1}{3} \\
\frac{1}{6} & \frac{1}{6} & -\frac{1}{3} \\
-\frac{1}{3} & -\frac{1}{3} & \frac{2}{3} \\
\end{bmatrix}
$$
or (ii)
$$
\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1 \\
\end{bmatrix}
=
3
\begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\end{bmatrix}
$$
I hope this clarifies the problem.
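As a numerical sanity check, here is a small numpy sketch (my own, not from the textbook) that builds the decomposition of this M and confirms that the eigenvalue-0 terms contribute nothing, so forms (i) and (ii) give the same matrix:

```python
import numpy as np

M = np.ones((3, 3))

# For a symmetric matrix, eigh returns orthonormal eigenvectors as columns of Q.
eigvals, Q = np.linalg.eigh(M)

# Sum of lambda_i * u_i u_i^T over all eigenvalues (form (i))...
full_sum = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))

# ...and over only the nonzero eigenvalue(s) (form (ii)).
nonzero_sum = sum(lam * np.outer(q, q)
                  for lam, q in zip(eigvals, Q.T) if abs(lam) > 1e-12)

print(np.allclose(full_sum, M))      # both sums reconstruct M
print(np.allclose(nonzero_sum, M))   # dropping the 0-eigenvalue terms changes nothing
```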

EDIT 2: Maybe the terminology in my textbook is incorrect, but the form I am talking about looks like this: if the matrix M was orthogonally diagonalized by
$$
P =
\begin{bmatrix}
\bar{u_1} & \bar{u_2} & \bar{u_3} \\
\end{bmatrix}
$$
and
$$
D =
\begin{bmatrix}
\lambda_1 & 0 & 0 \\
0 & \lambda_2 & 0 \\
0 & 0 & \lambda_3 \\
\end{bmatrix}
$$
where $\bar{u_1}$, $\bar{u_2}$, and $\bar{u_3}$ are unit eigenvectors of M, then M could be rewritten as
$$M = \lambda_1\bar{u_1}\bar{u_1}^T + \lambda_2\bar{u_2}\bar{u_2}^T + \lambda_3\bar{u_3}\bar{u_3}^T$$

If the answerer happens to know the name of this form, I'd be much obliged.

Best Answer

What your book calls a spectral decomposition is just the equation you obtain after carrying out the matrix product in what is usually called a spectral decomposition. This seems odd to me, because the term *decomposition* usually refers to a way of writing a matrix as a product of other matrices, not as a sum.

Anyway, a spectral decomposition of a matrix $M$ is an equation of the form $$ M = Q \Delta Q^{-1} \label{eq:1} \tag{1} $$ where $\Delta$ is the diagonal matrix whose diagonal entries $\lambda_1,\dotsc,\lambda_n$ are the eigenvalues of $M$, counted with multiplicity, and $Q$ is a matrix whose columns $q_1,\dotsc,q_n$ form a basis of eigenvectors of $M$ such that $q_i$ corresponds to $\lambda_i$.

Now, if you choose an orthonormal basis of eigenvectors (which exists whenever $M$ is symmetric, as in your example), then $Q$ is an orthogonal matrix and $Q^{-1} = Q^T$. Thus $\eqref{eq:1}$ becomes $$ \begin{align} M &= \left( \begin{array}{c | c | c} q_1 & \dotsc & q_n \end{array} \right) \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \left( \begin{array}{c} q_1^T \\ \hline \vdots \\ \hline q_n^T \end{array} \right) \\ &= \lambda_1 q_1 q_1^T + \dotsc + \lambda_n q_n q_n^T \end{align} $$
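The equivalence of the product form and the sum of rank-one terms is easy to check numerically; here is a short numpy sketch of my own (the random symmetric matrix is just an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = A + A.T  # symmetric, so an orthonormal eigenbasis exists

eigvals, Q = np.linalg.eigh(S)  # Q is orthogonal: Q^{-1} = Q^T

# Product form Q Delta Q^T ...
product_form = Q @ np.diag(eigvals) @ Q.T

# ... equals the sum of rank-one terms lambda_i q_i q_i^T.
sum_form = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))

print(np.allclose(product_form, S))  # True
print(np.allclose(sum_form, S))      # True
```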


TL;DR: No, you don't need to include the summands with eigenvalue $0$.
