Linear Algebra – Visualization of Singular Value Decomposition of a Symmetric Matrix


The Singular Value Decomposition of a matrix $\mathbf A$ satisfies

$\mathbf A = \mathbf U \mathbf \Sigma \mathbf V^\top$
The visualization of it would look like:

[Figure: block-diagram visualization of the factorization $\mathbf A = \mathbf U \mathbf \Sigma \mathbf V^\top$]

But when $\mathbf A$ is symmetric we can write:

$\begin{align*}
\mathbf A\mathbf A^\top&=(\mathbf U\mathbf \Sigma\mathbf V^\top)(\mathbf U\mathbf \Sigma\mathbf V^\top)^\top\\
\mathbf A\mathbf A^\top&=(\mathbf U\mathbf \Sigma\mathbf V^\top)(\mathbf V\mathbf \Sigma\mathbf U^\top)
\end{align*}$

Since $\mathbf V$ is an orthogonal matrix ($\mathbf V^\top \mathbf V=\mathbf I$), we have:

$\mathbf A\mathbf A^\top=\mathbf U\mathbf \Sigma^2 \mathbf U^\top$
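
A quick numerical sanity check of this identity (a sketch in NumPy; the symmetric test matrix is an arbitrary example):

```python
import numpy as np

# Arbitrary symmetric test matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                                  # symmetric by construction

U, s, Vt = np.linalg.svd(A)                  # A = U @ diag(s) @ Vt

# A A^T should equal U Sigma^2 U^T, with no V appearing.
print(np.allclose(A @ A.T, U @ np.diag(s**2) @ U.T))   # True
```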

I have two questions:

  • Is the above statement correct? That is, when the matrix $\mathbf A$ is symmetric and we compute the SVD, do we get $\mathbf A\mathbf A^\top = \mathbf U\mathbf \Sigma^2 \mathbf U^\top$?

  • How would the decomposition look for a symmetric matrix, given that we obtain the eigenvectors and the squared eigenvalues in the matrices $\mathbf U$ and $\mathbf \Sigma^2$?

Best Answer

Singular value decomposition

Start with a matrix with $m$ rows, $n$ columns, and rank $\rho$, $$ \mathbf{A}\in\mathbb{C}^{m\times n}_{\rho} $$ which has the singular value decomposition $$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right]^{*} % $$ where the color denotes $\color{blue}{range}$ spaces and $\color{red}{null}$ spaces. The dimensions of the domain matrices are $$ % \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} \in \mathbb{C}^{m\times \rho}, \quad % \color{red}{\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \in \mathbb{C}^{m \times \left(m - \rho\right)}, \quad % \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} \in \mathbb{C}^{n\times \rho}, \quad % \color{red}{\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \in \mathbb{C}^{n\times \left(n - \rho\right)}. $$ The domain matrices are unitary: $$ \begin{align} \mathbf{U}\mathbf{U}^{*} &= \mathbf{U}^{*}\mathbf{U} = \mathbf{I}_{m} \\ \mathbf{V}\mathbf{V}^{*} &= \mathbf{V}^{*}\mathbf{V} = \mathbf{I}_{n} \end{align} $$
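
As a concrete illustration of this block structure, here is a short NumPy sketch (the rank-2 test matrix and the tolerance `1e-12` are arbitrary choices for the example):

```python
import numpy as np

# Arbitrary 4x3 matrix of rank 2, so the null-space blocks are nonempty.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))   # m = 4, n = 3, rho = 2

U, s, Vt = np.linalg.svd(A, full_matrices=True)
rho = int(np.sum(s > 1e-12))              # numerical rank

U_range = U[:, :rho]                      # columns span R(A)
U_null  = U[:, rho:]                      # columns span N(A*)
V_range = Vt[:rho, :].T                   # columns span R(A*)
V_null  = Vt[rho:, :].T                   # columns span N(A)

print(rho)                                        # 2
print(np.allclose(A @ V_null, 0))                 # True: V_null lies in N(A)
print(np.allclose(U_null.conj().T @ A, 0))        # True: U_null lies in N(A*)
print(np.allclose(U.conj().T @ U, np.eye(4)))     # True: U is unitary
print(np.allclose(Vt @ Vt.conj().T, np.eye(3)))   # True: V is unitary
```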

The dimensions of the singular value matrices are $$ % \Sigma \in \mathbb{R}^{m\times n}, \quad % \mathbf{S} \in \mathbb{R}^{\rho\times \rho}. $$

The hermitian conjugate is constructed according to $$ \mathbf{A}^{*} = \mathbf{V} \, \Sigma^{\mathrm{T}} \, \mathbf{U}^{*} = % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right]^{*} % $$ where $\Sigma^{\mathrm{T}}\in \mathbb{R}^{n\times m}$.

The Moore-Penrose pseudoinverse is constructed according to $$ \mathbf{A}^{\dagger} = \mathbf{V} \, \Sigma^{\dagger} \, \mathbf{U}^{*} = % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S}^{-1} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right]^{*} % $$ where $\Sigma^{\dagger}\in \mathbb{R}^{n\times m}$.
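
A sketch of this construction in NumPy, compared against `numpy.linalg.pinv` (the rank-2 test matrix is again an arbitrary example):

```python
import numpy as np

# Arbitrary 5x3 matrix of rank 2.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=True)
rho = int(np.sum(s > 1e-12))

# Sigma^dagger is n x m: invert the nonzero singular values, transpose the shape.
Sigma_pinv = np.zeros((A.shape[1], A.shape[0]))
Sigma_pinv[:rho, :rho] = np.diag(1.0 / s[:rho])

A_pinv = Vt.conj().T @ Sigma_pinv @ U.conj().T
print(np.allclose(A_pinv, np.linalg.pinv(A)))   # True
```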

The product matrix rules you stated always hold: $$ \begin{align} % \mathbf{A} \mathbf{A}^{*} &= \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right) \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right)^{*} = \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right) \left( \mathbf{V} \, \mathbf{\Sigma}^{\mathrm{T}} \, \mathbf{U}^{*} \right) = \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{\Sigma}^{\mathrm{T}} \, \mathbf{U}^{*} \\ % \mathbf{A}^{*} \mathbf{A} &= \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right)^{*} \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right) = \left( \mathbf{V} \, \mathbf{\Sigma}^{\mathrm{T}} \, \mathbf{U}^{*} \right) \left( \mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^{*} \right) = \mathbf{V} \, \mathbf{\Sigma}^{\mathrm{T}} \, \mathbf{\Sigma} \, \mathbf{V}^{*} % \end{align} $$ Examples follow.
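
Numerically, both Gram products share the squared singular values as their nonzero eigenvalues (a sketch; the 4x3 test matrix is arbitrary):

```python
import numpy as np

# Arbitrary 4x3 matrix (generically full column rank 3).
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)

eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 4 eigenvalues, one is ~0
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 3 eigenvalues

print(np.allclose(eig_AtA, s**2))           # True
print(np.allclose(eig_AAt[:3], s**2))       # True
print(np.allclose(eig_AAt[3], 0))           # True: the extra eigenvalue is zero
```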

Square, full rank $m = n = \rho$

$$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = % \left[ \begin{array}{c} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} \end{array} \right] % \left[ \mathbf{S} \right] % \left[ \begin{array}{c} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} \end{array} \right]^{*} % $$ The product matrices are $$ \begin{align} % \mathbf{A}^{*}\mathbf{A} &= \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}}^{*} \\ % \mathbf{A}\mathbf{A}^{*} &= \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}}^{*} % \end{align} $$
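
A minimal NumPy check of the square, full-rank case (the 3x3 test matrix is an arbitrary, generically invertible example):

```python
import numpy as np

# Arbitrary 3x3 matrix (generically invertible).
rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3))

U, s, Vt = np.linalg.svd(A)
print(np.allclose(A.T @ A, Vt.T @ np.diag(s**2) @ Vt))   # True: V S^2 V^*
print(np.allclose(A @ A.T, U @ np.diag(s**2) @ U.T))     # True: U S^2 U^*
```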

Tall, full column rank $n = \rho$, $m \ge n$

$$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right] % \left[ \begin{array}{c} \mathbf{S} \\ \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right]^{*} % $$ The product matrices are $$ \begin{align} % \mathbf{A}^{*}\mathbf{A} &= \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}}^{*} \\ % \mathbf{A}\mathbf{A}^{*} &= % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S}^{2} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} & \color{red} {\mathbf{U}_{\mathcal{N}\left(\mathbf{A}^{*}\right)}} \end{array} \right]^{*} % \end{align} $$
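
A NumPy sketch of the tall case (arbitrary 5x3 test matrix): $\mathbf{A}^{*}\mathbf{A}$ is $\rho\times\rho$ and invertible, while $\mathbf{A}\mathbf{A}^{*}$ is $m\times m$ with an $(m-\rho)$-dimensional null space.

```python
import numpy as np

# Arbitrary 5x3 matrix (generically full column rank, rho = 3).
rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# A^* A is 3x3 and invertible.
print(np.allclose(A.T @ A, Vt.T @ np.diag(s**2) @ Vt))        # True

# A A^* is 5x5 with a 2-dimensional null space.
S2_padded = np.zeros((5, 5))
S2_padded[:3, :3] = np.diag(s**2)
print(np.allclose(A @ A.T, U @ S2_padded @ U.T))              # True
print(np.linalg.matrix_rank(A @ A.T))                         # 3
```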

Wide, full row rank $m = \rho$, $n \ge m$

$$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = % \left[ \begin{array}{c} \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right]^{*} % $$ The product matrices are $$ \begin{align} % \mathbf{A}^{*}\mathbf{A} &= % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right] % \left[ \begin{array}{cc} \mathbf{S}^{2} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}\left(\mathbf{A}^{*}\right)}} & \color{red} {\mathbf{V}_{\mathcal{N}\left(\mathbf{A}\right)}} \end{array} \right]^{*} \\ % \mathbf{A}\mathbf{A}^{*} &= % \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{U}_{\mathcal{R}\left(\mathbf{A}\right)}}^{*} \\ % \end{align} $$
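
The mirror-image NumPy sketch for the wide case (arbitrary 3x5 test matrix): now $\mathbf{A}\mathbf{A}^{*}$ is the small invertible product and $\mathbf{A}^{*}\mathbf{A}$ carries the zero block.

```python
import numpy as np

# Arbitrary 3x5 matrix (generically full row rank, rho = 3).
rng = np.random.default_rng(6)
A = rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# A A^* is 3x3 and invertible.
print(np.allclose(A @ A.T, U @ np.diag(s**2) @ U.T))          # True

# A^* A is 5x5 with a 2-dimensional null space.
S2_padded = np.zeros((5, 5))
S2_padded[:3, :3] = np.diag(s**2)
print(np.allclose(A.T @ A, Vt.T @ S2_padded @ Vt))            # True
print(np.linalg.matrix_rank(A.T @ A))                         # 3
```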

For a hermitian matrix $\mathbf{A} = \mathbf{A}^{*}$ (a real symmetric matrix in your case), the matrix is square, so $\Sigma = \Sigma^{\mathrm{T}}$ and the two factorizations coincide: $$ \begin{align} \mathbf{A} &= \mathbf{A}^{*} \\ \mathbf{U} \, \Sigma \, \mathbf{V}^{*} &= \mathbf{V} \, \Sigma \, \mathbf{U}^{*}. \end{align} $$ The singular values are the absolute values of the eigenvalues, and each column of $\mathbf{V}$ equals the corresponding column of $\mathbf{U}$ up to the sign of its eigenvalue. In particular, $$ \mathbf{A}\mathbf{A}^{*} = \mathbf{A}^{2} = \mathbf{U} \, \Sigma^{2} \, \mathbf{U}^{*}, $$ which is exactly the relation you derived.
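
A short NumPy illustration of the real symmetric case (the test matrix is arbitrary; the sign bookkeeping assumes distinct singular values, which holds generically):

```python
import numpy as np

# Arbitrary real symmetric test matrix.
rng = np.random.default_rng(7)
B = rng.standard_normal((4, 4))
A = B + B.T

U, s, Vt = np.linalg.svd(A)
eigvals = np.linalg.eigvalsh(A)

# Singular values are the absolute values of the eigenvalues.
print(np.allclose(np.sort(s), np.sort(np.abs(eigvals))))    # True

# Each right singular vector equals the matching left singular vector
# up to the sign of its eigenvalue: v_i = sign(lambda_i) * u_i.
signs = np.sign(np.sum(U * Vt.T, axis=0))                   # +/- 1 per column
print(np.allclose(U * signs, Vt.T))                         # True

# And A A^T = A^2 = U Sigma^2 U^T, as asked in the question.
print(np.allclose(A @ A.T, U @ np.diag(s**2) @ U.T))        # True
```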