Linear Algebra – Least Squares Solution to Underdetermined Lyapunov Equation

linear algebra, matrix equations

I need to solve an underdetermined Lyapunov equation for unknown $n\times n$ matrix $X$.

$$AX + XA = B$$

The naive method is to vectorize, setting $x=\operatorname{vec}(X)$, and use a least-squares solver on the following equation to find the least-squares solution (the form of the second term uses the symmetry of $A$; in general it is $A^{\top}\otimes I$).

$$(I\otimes A + A\otimes I)x = \operatorname{vec}B$$
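For concreteness, here is a minimal NumPy sketch of this naive route (the function name `lyap_lstsq` is my own; it assumes $A$ is symmetric, so the coefficient matrix is $I\otimes A + A\otimes I$ under either stacking convention):

```python
import numpy as np

def lyap_lstsq(A, B):
    """Least-norm least-squares solution of A X + X A = B via vectorization.

    Assumes A is symmetric, so the vectorized coefficient matrix is
    I (x) A + A (x) I regardless of row- vs column-stacking.
    """
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(I, A) + np.kron(A, I)          # vectorized Lyapunov operator
    x, *_ = np.linalg.lstsq(M, B.reshape(-1), rcond=None)
    return x.reshape(n, n)
```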

Experimentally, I found that when $A$ and $B$ are positive semidefinite, I get the same solution by expressing $X$ in terms of $U$ and $s$, the eigenvectors and eigenvalues of $A$, as below (proof), with the pointwise (Hadamard) division modified to skip division by zero, the way pseudo-inverse implementations do.

$$X=U \left( \frac{U' BU}{s + s'} \right) U'$$
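A corresponding sketch of this spectral construction (again with names of my choosing; `tol` plays the role of the pseudo-inverse cutoff):

```python
import numpy as np

def lyap_spectral(A, B, tol=1e-12):
    """Spectral solution of A X + X A = B for symmetric PSD A,
    skipping division wherever s_i + s_j is (numerically) zero."""
    s, U = np.linalg.eigh(A)               # A = U diag(s) U'
    C = U.T @ B @ U                        # B in the eigenbasis of A
    D = s[:, None] + s[None, :]            # outer sum: D[i, j] = s_i + s_j
    Dplus = np.zeros_like(D)
    nz = np.abs(D) > tol
    Dplus[nz] = 1.0 / D[nz]                # Hadamard "pseudo-division"
    return U @ (C * Dplus) @ U.T
```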

Does this appear in the literature, or does someone see a way to prove that this recovers the least-squares solution?

Example

$$A=\left(
\begin{array}{ccc}
8 & -8 & -8 \\
-8 & 9 & 8 \\
-8 & 8 & 8 \\
\end{array}
\right)$$

$$B=\left(
\begin{array}{ccc}
5 & 5 & -5 \\
5 & 9 & -3 \\
-5 & -3 & 6 \\
\end{array}
\right)$$

This instance is underdetermined because $A$ is singular ($s_i+s_j=0$ for some $i,j$), so a standard Lyapunov solver fails. However, both the least-squares and the truncated spectral decomposition method succeed, with the same answer:

$$X=\frac{1}{640}\left(
\begin{array}{ccc}
1789 & 2928 & -1329 \\
2928 & 4672 & -1968 \\
-1329 & -1968 & 869 \\
\end{array}
\right)$$
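For what it's worth, a quick numerical check of this example, using the two hypothetical helpers sketched above:

```python
import numpy as np

A = np.array([[8., -8., -8.], [-8., 9., 8.], [-8., 8., 8.]])
B = np.array([[5., 5., -5.], [5., 9., -3.], [-5., -3., 6.]])

X1 = lyap_lstsq(A, B)
X2 = lyap_spectral(A, B)
print(np.allclose(X1, X2))   # True: both methods agree
print(640 * X1)              # approximately the integer matrix displayed above
```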

Notebook

Best Answer

To be frank, this is obvious. Let $USU^\ast$ be a unitary diagonalisation of $A$ and write $S=\operatorname{diag}(s_1,s_2,\ldots,s_n)$. Let $C=U^\ast BU$ and $Y=U^\ast XU$. Then, by unitary invariance of the Frobenius norm,
$$
\|AX+XA-B\|_F^2
=\|SY+YS-C\|_F^2
=\sum_{i,j}\big|(s_i+s_j)y_{ij}-c_{ij}\big|^2.\tag{1}
$$

As each term $\big|(s_i+s_j)y_{ij}-c_{ij}\big|$ contains a unique element of $Y$, minimising the sum is equivalent to minimising each residual individually. When $s_i+s_j>0$, the minimum value of $\big|(s_i+s_j)y_{ij}-c_{ij}\big|$ is zero, attained by $y_{ij}=(s_i+s_j)^{-1}c_{ij}$. When $s_i+s_j=0$, the residual is equal to the constant $|c_{ij}|$; hence $y_{ij}$ can assume any value, and the least-norm least-squares choice is $y_{ij}=0$.

So, in summary, the least-norm solution to the least-squares problem can be written as
$$
y_{ij}
=\begin{cases}
\dfrac{c_{ij}}{s_i+s_j}&\text{if } s_i+s_j\ne0,\\
0&\text{otherwise},
\end{cases}
$$
i.e., $y_{ij}=(s_i+s_j)^+c_{ij}$, where $(s_i+s_j)^+$ denotes the Moore-Penrose pseudo-inverse of the scalar $s_i+s_j$. Since $\|Y\|_F=\|UYU^\ast\|_F$, the matrix $X=UYU^\ast$ constructed from this $Y$ is also the least-norm solution to the least-squares problem of minimising $(1)$.
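To illustrate the case analysis numerically (a sketch reusing the example matrices from the question; the variable names and tolerance are my own): perturbing $Y$ only at positions where $s_i+s_j=0$ leaves the residual essentially unchanged but increases $\|X\|_F$, which is why zeroing those entries gives the least-norm solution.

```python
import numpy as np

A = np.array([[8., -8., -8.], [-8., 9., 8.], [-8., 8., 8.]])
B = np.array([[5., 5., -5.], [5., 9., -3.], [-5., -3., 6.]])

s, U = np.linalg.eigh(A)                   # A = U diag(s) U'
C = U.T @ B @ U
D = s[:, None] + s[None, :]                # pairwise sums s_i + s_j
mask = np.abs(D) > 1e-9

Splus = np.zeros_like(D)
Splus[mask] = 1.0 / D[mask]                # scalar pseudo-inverse (s_i + s_j)^+
Y = C * Splus                              # least-norm choice: y_ij = 0 where s_i + s_j = 0

# Perturb Y only where s_i + s_j = 0: the residual is (essentially)
# unchanged, but the Frobenius norm of X grows.
Yp = Y + np.where(mask, 0.0, 1.0)
for Z in (Y, Yp):
    X = U @ Z @ U.T
    print(np.linalg.norm(A @ X + X @ A - B), np.linalg.norm(X))
```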
