[Math] Matrix Projection onto the Positive Semidefinite Cone with Respect to the Spectral Norm

convex-optimization, least-squares, linear-algebra, positive-semidefinite, projection

A book that I am reading ("Convex Optimization" by S. Boyd and L. Vandenberghe, page 399) states that the projection of a symmetric $n\times n$ matrix $X_0$ onto the set of symmetric $n \times n$ positive-semidefinite matrices $S^n_+$ is found in the following way:

  • Find the spectral (eigenvalue) decomposition $X_0 = \sum_{i=1}^n \lambda_i v_i v_i^T$;
  • Define the projection $X = \sum_{i=1}^n \max\{\lambda_i, 0\} v_i v_i^T$.

It is proven in the book that $X$ is the solution of the problem

minimize $||X-X_0||_F^2$ subject to $X \succeq 0$.

Here $||A||_F^2 = \operatorname{tr}(A^T A)$ is the square of the Frobenius norm. In other words, $X$ is the projection of $X_0$ onto the cone of symmetric positive-semidefinite matrices with respect to the Frobenius norm.

I have also found a proof of this in this question.
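For concreteness, here is a small numerical sketch of this construction (my own illustration, not from the book), using NumPy: it builds the eigenvalue-clipped matrix and checks that randomly sampled positive-semidefinite matrices are never closer to $X_0$ in Frobenius norm. The test matrix and the random sampling are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary symmetric test matrix (my own choice, not from the book).
A = rng.standard_normal((4, 4))
X0 = (A + A.T) / 2

# Spectral decomposition X0 = sum_i lambda_i v_i v_i^T (eigh: ascending eigenvalues).
lam, V = np.linalg.eigh(X0)

# Frobenius-norm projection onto S^n_+: clip the negative eigenvalues to zero.
X = V @ np.diag(np.maximum(lam, 0.0)) @ V.T

# Sanity check: no randomly sampled PSD matrix is closer to X0 in Frobenius norm.
d_star = np.linalg.norm(X - X0, "fro")
for _ in range(1000):
    B = rng.standard_normal((4, 4))
    Y = B @ B.T                      # a random positive-semidefinite matrix
    assert np.linalg.norm(Y - X0, "fro") >= d_star - 1e-12
print("Frobenius distance of the clipped matrix:", d_star)
```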

Now the book also states (without a proof, or at least it is not obvious to me from the material) that $X$ is also a solution to the problem

minimize $||X-X_0||_2$ subject to $X \succeq 0$.

Here $||A||_2$ ($A$ being symmetric) is the spectral norm, $||A||_2 = \max_{i=1, \dots, n}|\lambda_i| = \max\{\lambda_1, -\lambda_n\}$ ($\lambda_i$ are the eigenvalues of $A$, $\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_n$).
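As a quick sanity check of this definition (my own snippet, not part of the question), one can verify numerically that for a symmetric matrix the spectral norm computed by NumPy equals $\max\{\lambda_1, -\lambda_n\}$; the test matrix here is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary symmetric test matrix (my own choice).
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2

lam = np.linalg.eigvalsh(A)              # eigenvalues in ascending order
spec_from_eigs = max(lam[-1], -lam[0])   # max{lambda_1, -lambda_n}
spec_norm = np.linalg.norm(A, 2)         # spectral norm (largest singular value)

assert np.isclose(spec_norm, spec_from_eigs)
print(spec_norm, spec_from_eigs)
```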

How to prove this?

Best Answer

Let $X_0 = \sum_{i=1}^n \lambda_i v_i v_i^T$ be the eigenvalue decomposition of $X_0$ with $\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_n$. Assume $\lambda_n < 0$; otherwise $X_0$ is already positive semidefinite and is its own projection. Since the $v_i$ form an orthonormal basis, we have
\begin{equation} v_i^T X_0 v_i = \sum_{j=1}^n \lambda_j \, v_i^T v_j v_j^T v_i = \lambda_i \end{equation}
and, because $X_0$ is symmetric,
\begin{equation} ||X_0||_2 = \max\{\lambda_1, -\lambda_n\} = \max\{v_1^T X_0 v_1, -v_n^T X_0 v_n\} = \max\Big\{\sup_{||v||_2=1} v^T X_0 v,\; -\inf_{||v||_2=1} v^T X_0 v\Big\}. \end{equation}

Now let $X$ be any symmetric positive semidefinite matrix. Since $X - X_0$ is symmetric and $v_n^T X v_n \geq 0$, we have
\begin{equation} ||X - X_0||_2 \geq \sup_{||v||_2 = 1} v^T (X - X_0) v \geq v_n^T (X - X_0) v_n = v_n^T X v_n - v_n^T X_0 v_n \geq -\lambda_n. \end{equation}
So every feasible $X$ is at spectral distance at least $-\lambda_n$ from $X_0$.

Now take $X = \sum_{i=1}^n \max\{\lambda_i, 0\} v_i v_i^T$. Then $X - X_0 = -\sum_{i:\, \lambda_i < 0} \lambda_i v_i v_i^T$, whose nonzero eigenvalues are $-\lambda_i$ for the indices with $\lambda_i < 0$, so $||X - X_0||_2 = -\lambda_n$. Therefore this $X$ attains the lower bound and is a solution of the spectral-norm projection problem.
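The argument can also be checked numerically. The following sketch (my own, using an arbitrary $2\times 2$ test matrix with one negative eigenvalue) verifies that the clipped matrix attains spectral distance $-\lambda_n$ and that randomly sampled positive-semidefinite matrices never beat this bound.

```python
import numpy as np

# An explicit symmetric matrix with one negative eigenvalue (my own test case).
X0 = np.array([[1.0, 2.0],
               [2.0, -3.0]])

lam, V = np.linalg.eigh(X0)              # ascending order, so lam[0] = lambda_n < 0
X = V @ np.diag(np.maximum(lam, 0.0)) @ V.T

# The clipped matrix attains the lower bound -lambda_n in spectral norm.
print(np.linalg.norm(X - X0, 2), -lam[0])

# No randomly sampled PSD matrix gets closer than -lambda_n.
rng = np.random.default_rng(0)
for _ in range(1000):
    B = rng.standard_normal((2, 2))
    Y = B @ B.T                          # a random positive-semidefinite matrix
    assert np.linalg.norm(Y - X0, 2) >= -lam[0] - 1e-12
```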
