Difficult matrix equation $A(x-x_0) + \alpha\left(\frac{\Sigma x}{\mu^\top x}\right) = 0$

linear-algebra, matrices, matrix-equations, matrix-calculus, systems-of-equations

Let $x,x_0,\theta,\mu \in \mathbb{R}^n$ be vectors, $\alpha \in\mathbb{R}$ a scalar, and $\Sigma, A\in \mathbb{R}^{n\times n}$ positive semi-definite matrices. Moreover, $\mu$ and $\Sigma$ are the mean vector and covariance matrix of a random variable $Z$ (I don't know whether this information helps). Is there a way to solve the following matrix equation for $x$ in closed form?

\begin{equation}
A(x-x_0) + \alpha \left(\frac{\Sigma x}{\mu^\top x}\right) = 0.
\end{equation}

Main difficulty: I have trouble handling the denominator. If the equation were simply

$$A(x-x_0) + \alpha \Sigma x = 0,$$ the solution would be $x = (A+\alpha\Sigma)^{-1}Ax_0$. I am looking for a similar solution to the first equation, but the denominator complicates things and I don't know how to proceed.
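As a quick numerical sanity check of the closed form for the simpler equation (using made-up random positive definite matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical test data: random symmetric positive definite A and Sigma.
M = rng.standard_normal((n, n)); A = M @ M.T + n * np.eye(n)
M = rng.standard_normal((n, n)); Sigma = M @ M.T + n * np.eye(n)
x0 = rng.standard_normal(n)
alpha = 0.7

# Closed-form solution of A(x - x0) + alpha * Sigma x = 0.
x = np.linalg.solve(A + alpha * Sigma, A @ x0)

# Residual of the original equation should vanish.
residual = A @ (x - x0) + alpha * Sigma @ x
print(np.allclose(residual, 0))
```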

What I tried:

$$\begin{aligned}
& (\mu^\top x)\,A(x-x_0) + \alpha \Sigma x = 0
\\ & (\mu^\top x)\,Ax - (\mu^\top x)\,Ax_0 + \alpha \Sigma x = 0
\\ & (\mu^\top x)\,A(x-x_0) = -\alpha \Sigma x
\end{aligned}$$

Is this way of proceeding correct? If so, how can this matrix equation be solved for $x$? I am not an expert in solving matrix equations, and any help would be greatly appreciated. Thanks!

Best Answer

I can provide some hints for the case where $A,\Sigma$ are strictly positive definite. This assumption guarantees that both matrices are invertible; let us set henceforth $\lambda=-\frac{\alpha}{\mu^\top x}$ (the sign is chosen so that the formulas below come out cleanly). Multiplying on the left by $\Sigma^{-1}$ transforms the problem into the following one:

$$(B-\lambda I)x=B x_0~~,~~B=\Sigma^{-1}A$$

Provided $x_0\neq 0$, we have $B x_0\neq 0$, since $B$ is invertible: it is similar to the positive definite matrix $\Sigma^{-1/2}A\Sigma^{-1/2}$, so its eigenvalues are real and positive. It is now clear that $B-\lambda I$ can only be singular for $\lambda$ in the set of eigenvalues of $B$, in which case a solution generically does not exist. For all other values of the parameter a unique solution is guaranteed; using the eigenvalue decomposition $B=S\Lambda S^{-1}$, the solution is given by the relation

$$x=S(\Lambda-\lambda I)^{-1}S^{-1}Bx_0$$

This solution is, naturally, a function of $\lambda$ itself, and can be written in the concise form $$x=\sum_{k}\frac{v_k}{\lambda_k-\lambda},$$ where the sum runs over the distinct eigenvalues of $B$, with vectors $(v_k)_i=S_{ik}\sum_{l,m}S^{-1}_{kl}B_{lm}(x_0)_{m}$, whenever all eigenvalues have multiplicity $1$ (with straightforward modifications when a multiplicity is greater than $1$: take the limit of coinciding eigenvalues).
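For a fixed $\lambda$ outside the spectrum of $B$, the spectral formula and its partial-fraction form can be checked numerically against a direct linear solve (again with made-up random positive definite matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n)); A = M @ M.T + n * np.eye(n)
M = rng.standard_normal((n, n)); Sigma = M @ M.T + n * np.eye(n)
x0 = rng.standard_normal(n)

B = np.linalg.solve(Sigma, A)              # B = Sigma^{-1} A
lams, S = np.linalg.eig(B)                 # B = S Lambda S^{-1}
lams, S = lams.real, S.real                # spectrum is real: B is similar to an SPD matrix

lam = -0.5                                 # any fixed lambda outside the spectrum of B
x_direct = np.linalg.solve(B - lam * np.eye(n), B @ x0)
x_spec = S @ np.diag(1.0 / (lams - lam)) @ np.linalg.solve(S, B @ x0)

# Partial-fraction form: x = sum_k v_k / (lam_k - lam), with v_k = S[:, k] * (S^{-1} B x0)_k.
w = np.linalg.solve(S, B @ x0)
x_pf = sum(w[k] * S[:, k] / (lams[k] - lam) for k in range(n))

print(np.allclose(x_spec, x_direct), np.allclose(x_pf, x_direct))
```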

We are not done: $\lambda$ depends on $x$ itself, and that imposes a constraint on the possible solutions. Dotting from the left with the vector $\mu$ and using $\mu^\top x=-\alpha/\lambda$, we obtain the rational equation $$-\frac{\alpha}{\lambda}=\sum_{k}\frac{\mu^\top v_k}{\lambda_k-\lambda},$$ which, after clearing denominators, generically has $n$ complex roots. However, $\lambda$ must be real. Is a real solution guaranteed? Only for odd $n$, since the polynomial equation is then forced to have a real root. For even $n$ a counterexample is given by the function $$f(\lambda)=-\frac{1}{1-\lambda}+\frac{1}{2-\lambda}-\frac{3}{\lambda},$$ which has no real roots. It is true, however, that for large enough $|\alpha|$ a real solution exists, regardless of the values of $n$ and the projections $\mu^\top v_k$.
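The whole procedure can be sketched numerically: clear denominators in the scalar constraint to get a polynomial in $\lambda$, find its real roots, recover $x$ from each, and verify the original nonlinear equation. This is only a sketch under the assumptions above (strictly positive definite $A,\Sigma$, simple eigenvalues, random made-up data; $n$ odd so a real root is guaranteed):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3                                       # odd n: a real root of the constraint is guaranteed
M = rng.standard_normal((n, n)); A = M @ M.T + n * np.eye(n)
M = rng.standard_normal((n, n)); Sigma = M @ M.T + n * np.eye(n)
x0 = rng.standard_normal(n)
mu = rng.standard_normal(n)
alpha = 0.3

B = np.linalg.solve(Sigma, A)
lams, S = np.linalg.eig(B)
lams, S = lams.real, S.real                 # spectrum is real: B is similar to an SPD matrix
c = (mu @ S) * np.linalg.solve(S, B @ x0)   # c_k = mu^T v_k

# Clear denominators in  -alpha/lam = sum_k c_k / (lam_k - lam):
# p(lam) = -alpha * prod_j (lam_j - lam) - lam * sum_k c_k * prod_{j != k} (lam_j - lam)
p = -alpha * (-1.0) ** n * np.poly(lams)    # np.poly gives prod_j (lam - lam_j), highest power first
for k in range(n):
    Qk = (-1.0) ** (n - 1) * np.poly(np.delete(lams, k))
    p = p - c[k] * np.append(Qk, 0.0)       # appending 0 multiplies the polynomial by lam

real_lams = [r.real for r in np.roots(p) if abs(r.imag) < 1e-8]
solutions = []
for lam in real_lams:
    if np.min(np.abs(lams - lam)) < 1e-8:   # skip spurious roots sitting on the poles
        continue
    solutions.append(np.linalg.solve(B - lam * np.eye(n), B @ x0))

# Every recovered x should solve the original equation A(x - x0) + alpha*Sigma x/(mu^T x) = 0.
for x in solutions:
    residual = A @ (x - x0) + alpha * Sigma @ x / (mu @ x)
    print(np.allclose(residual, 0))
```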

EDIT: I believe this is algorithmically as close as one can get to a "closed form". The problem has certainly been reduced to a simpler one, but the impasse is clear: we cannot solve polynomial equations of arbitrary degree in closed form.