Linear Algebra – Is There a Unique Solution for This Quadratic Matrix Equation?

linear-algebra, matrices, matrix-equations

Here is the quadratic matrix equation I've been looking at lately:

$$ Q_{r,r}=A_{r,r}X_{r,r}^2+B_{r,r}X_{r,r}+C_{r,r}=0_{r,r} $$

Note that $A, B, C,$ and $X$ are $r \times r$ matrices; $A$, $B$, and $C$ have known elements, while $X$ contains the unknown elements being solved for. $0_{r,r}$ is the $r \times r$ null matrix.
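To fix notation, here is a minimal numpy sketch of what I mean by a solution (the helper name `residual` is just for illustration). For $r=1$ the equation reduces to the scalar quadratic, which of course has a closed form via the quadratic formula:

```python
import numpy as np

def residual(A, B, C, X):
    """Q(X) = A X^2 + B X + C; a solution X makes this the zero matrix."""
    return A @ X @ X + B @ X + C

# For r = 1 the equation collapses to the scalar quadratic a x^2 + b x + c = 0:
a, b, c = 2.0, -3.0, 1.0
x = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)     # one of the two scalar roots
A, B, C = np.array([[a]]), np.array([[b]]), np.array([[c]])
print(residual(A, B, C, np.array([[x]])))          # [[0.]]
```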

Is there any solution for $X$ in terms of $A, B,$ and $C$, without simplifying assumptions (e.g. that $X$ is diagonal, that $A=B=C$, or anything of that sort)?

I have tried to solve this and nothing has worked out. I attempted to solve it generally by writing out the matrices $A, B, C,$ and $X$ entrywise and looking for a unique solution for each element of $X$ in terms of the elements of $A, B,$ and $C$. That did not work out beyond the case of $r=1$.

Trying specific values of $r$ did not work out either; I ended up with hopelessly unwieldy equations already at $r=2$. I don't know exactly how to make this appealing to the denizens of math.stackexchange, but as far as I know it isn't a heavily studied problem.

It is quite possible that I've only been using elementary techniques and have missed something, so I hope someone can shed some light on this.

Best Answer

You could note that the matrix similarity
\begin{align}
&\pmatrix{I & \mathbf{0} \\ X & -I}\pmatrix{\mathbf{0} & I \\ -C & -B}\pmatrix{I & \mathbf{0} \\ X & -I} \\
&=\pmatrix{\mathbf{0} & I \\ C & X + B}\pmatrix{I & \mathbf{0} \\ X & -I} \\
&=\pmatrix{X & -I \\ X^2 + BX + C & -X - B}
\end{align}

gives your equation in $X$ as the lower-left block. (Here I take $A = I$; if $A$ is invertible, left-multiply your equation by $A^{-1}$ and rename $A^{-1}B$ and $A^{-1}C$. Note also that $\pmatrix{I & \mathbf{0} \\ X & -I}$ squares to the identity, so it is its own inverse and the above really is a similarity.) If the equation is solved, that block vanishes and the matrix $\pmatrix{\mathbf{0} & I \\ -C & -B}$ is block triangularized, so its spectrum splits into that of $X$ and that of $-X-B$. Also note that there is a closed-form way to bring (almost) any matrix into the form $\pmatrix{\mathbf{0} & I \\ -C & -B}$ through a similarity transform. I do not know what this form is called in the literature, but I like to call it the block companion form. Here is how to do it:

\begin{align}
&\pmatrix{G^{-1} & \mathbf{0} \\ G^{-1}M & I}\pmatrix{M & G \\ F & D}\pmatrix{G & \mathbf{0} \\ -G^{-1}MG & I} \\
&=\pmatrix{G^{-1}M & I \\ G^{-1}M^2 + F & G^{-1}MG + D}\pmatrix{G & \mathbf{0} \\ -G^{-1}MG & I} \\
&=\pmatrix{\mathbf{0} & I \\ G^{-1}M^2G + FG - G^{-1}M^2G - DG^{-1}MG & D + G^{-1}MG} \\
&=\pmatrix{\mathbf{0} & I \\ (F - DG^{-1}M)G & D + G^{-1}MG} \\
&=\pmatrix{\mathbf{0} & I \\ -C & -B}
\end{align}
where $G$ must be invertible; the two outer factors are inverses of each other, so this is again a similarity, and one reads off $C = -(F - DG^{-1}M)G$ and $B = -(D + G^{-1}MG)$.
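As a quick sanity check, here is a short numpy sketch verifying the reduction above on random matrices (variable names follow the derivation; $G$ is assumed invertible, which holds generically for a random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
r = 3
M, G, F, D = (rng.standard_normal((r, r)) for _ in range(4))

Gi = np.linalg.inv(G)
P     = np.block([[G,  np.zeros((r, r))], [-Gi @ M @ G, np.eye(r)]])
P_inv = np.block([[Gi, np.zeros((r, r))], [ Gi @ M,     np.eye(r)]])
print(np.linalg.norm(P_inv @ P - np.eye(2 * r)))   # ~ 0: the outer factors are inverses

T = np.block([[M, G], [F, D]])
result = P_inv @ T @ P

# Compare with the block companion form read off from the derivation:
B_ = -(D + Gi @ M @ G)                 # -B = D + G^{-1} M G
C_ = -(F - D @ Gi @ M) @ G             # -C = (F - D G^{-1} M) G
expected = np.block([[np.zeros((r, r)), np.eye(r)], [-C_, -B_]])
print(np.linalg.norm(result - expected))           # ~ 0
```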

If a closed-form solution to $X^2 + BX + C = \mathbf{0}$ were possible in general, it would split the eigenvalue problem of the $2r \times 2r$ companion matrix in half, could be applied recursively, and would thus give a closed-form solution to the eigenvalue problem itself. Since every monic polynomial is the characteristic polynomial of its companion matrix, that would amount to solving arbitrary polynomial equations in radicals, which the Abel-Ruffini theorem rules out for degree five and higher.
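To see both halves of this argument numerically, here is a hedged numpy sketch (the helper name `solvent_from_companion` is my own): it builds one solvent from $r$ eigenvectors of the block companion matrix, checks the residual, and confirms that the $2r$ eigenvalues split into those of $X$ and those of $-(X+B)$. It assumes the chosen eigenvectors have an invertible top block, which holds generically but not always.

```python
import numpy as np

def solvent_from_companion(B, C):
    """One solvent of X^2 + B X + C = 0, built from r eigenvectors of the
    block companion matrix [[0, I], [-C, -B]].  A sketch only: it assumes
    the chosen eigenvectors have an invertible top block, and the result
    may be complex even for real B and C."""
    r = B.shape[0]
    L = np.block([[np.zeros((r, r)), np.eye(r)], [-C, -B]])
    w, V = np.linalg.eig(L)
    idx = np.argsort(-np.abs(w))[:r]      # one choice: the r largest eigenvalues
    return V[r:, idx] @ np.linalg.inv(V[:r, idx])

rng = np.random.default_rng(0)
r = 3
B, C = rng.standard_normal((r, r)), rng.standard_normal((r, r))
X = solvent_from_companion(B, C)
print(np.linalg.norm(X @ X + B @ X + C))  # ~ 0: X is a solvent

# The similarity above block-triangularizes the companion matrix, so its
# 2r eigenvalues are those of X together with those of -(X + B).
L = np.block([[np.zeros((r, r)), np.eye(r)], [-C, -B]])
ev_split = np.concatenate([np.linalg.eigvals(X), np.linalg.eigvals(-(X + B))])
print(np.linalg.norm(np.sort_complex(np.linalg.eigvals(L))
                     - np.sort_complex(ev_split)))  # ~ 0 up to roundoff
```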

Since diagonalization is a hard problem with no closed-form solution in general, it is no wonder you had such difficulty finding a solution. Perhaps this is also why the equation is related to what is called the quadratic eigenvalue problem, as J.M. noted.