[Math] Fast algorithm for maximizing smallest eigenvalue of linear combination of Hermitian matrices

convex optimization, eigenvector, linear algebra, matrices, oc.optimization-and-control

I have an engineering background. Through my work, I came across this problem:
\begin{align}
&\max_{\lambda,\,y_i\in \mathbb{R}}~\lambda \\
\text{s.t.}~&\left(\mathbf{A}_0+\sum_{i=1}^{K}y_i\mathbf{A}_i\right)-\lambda\mathbf{I}\succeq 0
\end{align}
where the $\mathbf{A}_i$ are all Hermitian matrices, and we are seeking $\lambda$ and the $y_i$. I know that this is called a Linear Matrix Inequality (LMI) problem and can be solved by a general convex optimization package (e.g., CVX). To me, it seems like we are looking for the linear combination of the given Hermitian matrices whose smallest eigenvalue is as large as possible among all such combinations. I was wondering whether there are iterative algorithms for this problem that are simple to implement. Please point me to relevant references.
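For concreteness, here is a minimal sketch of how I would hand the problem to such a package, written in CVXPY (the Python analogue of CVX) with random Hermitian matrices standing in for my actual data; both the package choice and the data are just assumptions for illustration.

```python
import numpy as np
import cvxpy as cp

# Placeholder problem data (an assumption of this sketch): random Hermitian
# matrices standing in for the actual A_0, ..., A_K.
n, K = 4, 3
rng = np.random.default_rng(0)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A0 = random_hermitian(n)
A = [random_hermitian(n) for _ in range(K)]

y = cp.Variable(K)                        # real coefficients y_1, ..., y_K
lam = cp.Variable()                       # the lower bound lambda
S = cp.Variable((n, n), hermitian=True)   # slack matrix: S = A(y) - lambda*I

constraints = [
    S == A0 + sum(y[i] * A[i] for i in range(K)) - lam * np.eye(n),
    S >> 0,                               # the LMI: A(y) - lambda*I is PSD
]
prob = cp.Problem(cp.Maximize(lam), constraints)
prob.solve()                              # dispatched to an SDP-capable solver

print("optimal lambda:", lam.value)
print("optimal y:", y.value)
```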

Best Answer

Two iterative algorithms that solve LMI problems are the ellipsoid algorithm and interior-point methods. Both are described in Sections 2.3 and 2.4 of the book "Linear Matrix Inequalities in System and Control Theory" by Boyd et al. [1] and in the references therein.
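To give a feel for how simple the ellipsoid algorithm is to implement, here is a minimal sketch (assuming NumPy, random Hermitian placeholder data, and that an optimal $y$ lies in a ball of radius $R$ around the origin) applied to the equivalent concave problem $\max_{y}\lambda_{\min}\left(\mathbf{A}_0+\sum_{i}y_i\mathbf{A}_i\right)$. The cut at each iteration uses the standard supergradient of $\lambda_{\min}$: if $v$ is a unit eigenvector for the smallest eigenvalue of $\mathbf{A}(y)$, the vector with entries $v^{H}\mathbf{A}_i v$ is a supergradient with respect to $y$.

```python
import numpy as np

# Placeholder problem data (an assumption of this sketch): K random n x n
# Hermitian matrices A_1, ..., A_K plus A_0.
n, K = 4, 3
rng = np.random.default_rng(0)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A0 = random_hermitian(n)
A = [random_hermitian(n) for _ in range(K)]

def f_and_supergradient(y):
    """Return lambda_min(A(y)) and a supergradient of it with respect to y."""
    M = A0 + sum(yi * Ai for yi, Ai in zip(y, A))
    w, V = np.linalg.eigh(M)              # eigenvalues in ascending order
    v = V[:, 0]                           # unit eigenvector of the smallest one
    g = np.array([np.real(np.vdot(v, Ai @ v)) for Ai in A])
    return w[0], g

# Central-cut ellipsoid method for max_y lambda_min(A(y)), assuming an
# optimal y lies in the ball of radius R (requires K >= 2).
R, max_iter, tol = 10.0, 500, 1e-6
c = np.zeros(K)                           # ellipsoid center
P = R**2 * np.eye(K)                      # ellipsoid shape matrix
best_val, best_y = -np.inf, c.copy()

for _ in range(max_iter):
    val, g = f_and_supergradient(c)
    if val > best_val:
        best_val, best_y = val, c.copy()
    denom = np.sqrt(g @ P @ g)            # bounds the remaining suboptimality
    if denom <= tol:
        break
    # Any y with lambda_min(A(y)) >= val satisfies g @ (y - c) >= 0, so cut
    # away the other half-space (i.e. minimize -f with subgradient -g).
    gt = -g / denom
    Pgt = P @ gt
    c = c - Pgt / (K + 1)
    P = K**2 / (K**2 - 1.0) * (P - 2.0 / (K + 1) * np.outer(Pgt, Pgt))

print("best lambda_min found:", best_val)
print("corresponding y:", best_y)
```

In practice the ellipsoid method is reliable but slow, so for larger problems the interior-point solvers mentioned below are usually preferable.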

See also [2] for already-implemented solvers. In particular, if you use MATLAB, I recommend the SeDuMi solver together with the YALMIP parser [3], since the latter lets you state LMI programs to the solver in a more intuitive way.
