[Math] Orthogonal Projection onto the Weighted $ {L}_{2} $ Norm Ball

Tags: convex optimization, MATLAB, nonlinear optimization, optimization

Projection onto the $\ell_2$ norm ball is well known. Let the norm ball be
$C = \left\{x \in \mathbb{R}^n : \left\| x - c \right\|_2^2 \leq d^2 \right\}$, where $c \in \mathbb{R}^n$ and $d \in \mathbb{R}$. Then the projection onto this set is given by
\begin{align}
\Pi_{C} \left ( y \right) = c + \frac{d}{\max \left\{\left\| y - c \right\|_2, d \right\}} \left( y - c \right) \ .
\end{align}
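
For reference, a direct MATLAB transcription of this formula could look like the sketch below (the function name `projL2Ball` is my own, for illustration only):

```matlab
% Projection onto the L2 ball C = {x : ||x - c||_2 <= d}.
% Direct transcription of the formula above (save as projL2Ball.m).
function x = projL2Ball(y, c, d)
    r = y - c;                                % center the point
    x = c + (d / max(norm(r, 2), d)) * r;     % shrink toward c only when y lies outside C
end
```

For example, `projL2Ball([3; 4], [0; 0], 1)` returns `[0.6; 0.8]`, which has unit norm.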

How does one obtain the projection onto the weighted $\ell_2$ norm ball, i.e., onto the set $C = \left\{x \in \mathbb{R}^n : \left\| x - c \right\|_W^2 \leq d^2 \right\}$, where $W$ is a positive definite matrix?


Note: There is an attempt at a weighted $\ell_2$ norm projection, but it does not use the same problem definition (or maybe I cannot yet figure out how to adapt that answer).


My partial attempt:

Following Brian's comments, I have made an attempt, but I am stuck and unable to compute the Lagrange multiplier.

The weighted $\ell_2$ projection problem can be expressed as
\begin{align}
\text{minimize}_{x \in \mathbb{R}^n} \quad & \left\|y-x\right\|_2^2\\
\text{subject to }\quad & \left\|x-c\right\|_W^2 \leq d^2
\end{align}

Forming the Lagrangian:
\begin{align*}
L\left(x, \lambda\right)
&= \left\|y-x\right\|_2^2 + \lambda \left( \left[ (x-c)^T W (x-c) \right]- d^2 \right) .
\end{align*}

Then, by the stationarity condition of the KKT conditions (dropping a common factor of $2$),
\begin{align*}
\frac{\partial L}{\partial x} = 0
&\Rightarrow -(y - x) + \lambda W (x-c) = 0 \\
&\Leftrightarrow x = \left( I + \lambda W \right)^{-1} \left(\lambda W c + y \right).
\end{align*}

Now I am stuck and do not know how to obtain $\lambda$ in closed form. I thought about using the complementary slackness part of the KKT conditions, but it becomes too involved for me.
Say $\lambda > 0$; then
\begin{align}
(x-c)^T W (x-c) = d^2 \ .
\end{align}

If I plug $x = \left( I + \lambda W \right)^{-1} \left(\lambda W c + y \right)$ into the above equation, I do not know how to solve for $\lambda$. If we cannot find it in closed form, how can it be obtained iteratively? And if $W$ is a diagonal matrix, can we then obtain $\lambda$ in closed form?
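
For what it is worth, writing the substitution out explicitly (unless I have made an algebra mistake): since $\left( I + \lambda W \right) c = c + \lambda W c$,
\begin{align}
x - c = \left( I + \lambda W \right)^{-1} \left(\lambda W c + y \right) - c = \left( I + \lambda W \right)^{-1} \left( y - c \right),
\end{align}
so the complementary slackness condition reduces to the single scalar equation
\begin{align}
\left( y - c \right)^T \left( I + \lambda W \right)^{-1} W \left( I + \lambda W \right)^{-1} \left( y - c \right) = d^2
\end{align}
in $\lambda$, whose left-hand side is monotonically decreasing for $\lambda \geq 0$. I still do not see a closed form for $\lambda$ from here, though.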


Further attempt: If $W$ is a diagonal matrix, then, following the link, the $i$th coordinate of $x$ can be written (with $W_i$ denoting the $i$th diagonal entry of $W$) as
\begin{align}
x_i = \frac{\lambda W_i c_i + y_i}{1 + \lambda W_i} .
\end{align}

Now, plugging this into the complementary slackness condition gives
\begin{align}
(x-c)^T W (x-c) &= \|W^{1/2} (x-c)\|^2 = d^2 \\
&\Leftrightarrow \sqrt{W_i} (x_i - c_i) = \pm d \\
&\Leftrightarrow \lambda = \frac{\pm 1}{d \sqrt{W_i}} \left(y_i - c_i \right) - \frac{1}{W_i} \ .
\end{align}

Is this correct in the case where $W$ is a diagonal matrix?

Question: Would $\lambda$ be different for different coordinates of the vector? Hmm, should it not be a constant for all coordinates?

Best Answer

The optimization problem is given by:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( x - c \right)}^{T} W \left( x - c \right) \leq {d}^{2} \end{align*}$$

Case I - Diagonal Matrix

In this case the matrix $ W $ is diagonal with $ {w}_{ii} \geq 0 $.

Let's assume we know how to solve this.

Remark
I could find a simple iterative method to find $ \lambda $ for this case, but not a closed-form solution. Though I would guess a closed form is doable, as one can change coordinates to turn the ellipsoid into a ball and then map back.
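
For illustration, one such iterative method is a bisection on $ \lambda $ applied to the scalar equation $ \sum_{i} {w}_{ii} {\left( \frac{ {y}_{i} - {c}_{i} }{ 1 + \lambda {w}_{ii} } \right)}^{2} = {d}^{2} $. The sketch below is my own illustration (assuming $ d > 0 $), not the code from the repository linked at the end:

```matlab
% Projection onto {x : sum_i w_i (x_i - c_i)^2 <= d^2} with diagonal weights w >= 0.
% Bisection on the Lagrange multiplier lambda (illustrative sketch only).
function x = projWeightedBallDiag(y, c, w, d)
    r = y - c;
    if sum(w .* r.^2) <= d^2
        x = y;                                           % y is already feasible
        return;
    end
    g = @(lam) sum(w .* (r ./ (1 + lam * w)).^2) - d^2;  % monotonically decreasing in lam
    lamHi = 1;
    while g(lamHi) > 0                                   % grow the upper bracket
        lamHi = 2 * lamHi;
    end
    lamLo = 0;                                           % g(0) > 0 since y is infeasible
    for ii = 1:100                                       % plain bisection
        lamMid = 0.5 * (lamLo + lamHi);
        if g(lamMid) > 0
            lamLo = lamMid;
        else
            lamHi = lamMid;
        end
    end
    lam = 0.5 * (lamLo + lamHi);
    x = c + r ./ (1 + lam * w);                          % x_i = c_i + (y_i - c_i) / (1 + lambda * w_i)
end
```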

Case II - Positive Definite Matrix

In this case the matrix $ W $ is a Positive Semi Definite (PSD) matrix, $ W \succeq 0 $.

Since $ W $ is a PSD matrix, it can be written via its eigendecomposition as:

$$ W = {P}^{T} D P $$

where $ P $ is a unitary (orthogonal) matrix and $ D $ is a diagonal matrix with $ {d}_{ii} \geq 0 $.

Then one could rewrite the problem as:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( x - c \right)}^{T} {P}^{T} D P \left( x - c \right) \leq {d}^{2} \end{align*}$$

Defining $ v = P x $, $ z = P y $ and $ e = P c $ one could rewrite the problem as:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2} \end{align*}$$

Now, since $ P $ is a unitary matrix, it preserves the $ {L}_{2} $ norm and is invertible; hence $ \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} = \frac{1}{2} {\left\| P \left( x - y \right) \right\|}_{2}^{2} = \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} $, and the whole problem becomes:

$$\begin{align*} \arg \min_{v} \quad & \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2} \end{align*}$$

where $ D $ is a diagonal matrix, so this is exactly Case I, which we assumed we know how to solve.
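
Putting the two cases together, a minimal sketch of this reduction could look as follows (reusing the hypothetical `projWeightedBallDiag` from the remark above; again, this is an illustration, not the repository code):

```matlab
% Projection onto {x : (x - c)' * W * (x - c) <= d^2} for a symmetric PSD matrix W.
% Reduces to the diagonal case via the eigendecomposition W = P' * D * P.
function x = projWeightedBall(y, c, W, d)
    [V, D] = eig((W + W') / 2);                   % W = V * D * V' (symmetrize for numerical safety)
    P = V';                                        % so that W = P' * D * P
    z = P * y;                                     % rotated data,   z = P y
    e = P * c;                                     % rotated center, e = P c
    v = projWeightedBallDiag(z, e, diag(D), d);    % Case I with weights diag(D)
    x = P' * v;                                    % rotate back,    x = P' v
end
```

Since $ P $ is orthogonal, rotating back with $ {P}^{T} $ undoes the change of variables, so the returned $ x $ is the projection in the original coordinates.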

Code

I created MATLAB code to solve the problem (the general case).
The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.