Since the above is the Orthogonal Projection onto the Unit Simplex, here is the solution using the Dual Function.
The Lagrangian in that case is given by:
$$ \begin{align}
L \left( x, \mu \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \mu \left( \boldsymbol{1}^{T} x - 1 \right) && \text{} \\
\end{align} $$
The trick is to leave the non negativity constraint implicit.
Hence the Dual Function is given by:
$$ \begin{align}
g \left( \mu \right) & = \inf_{x \succeq 0} L \left( x, \mu \right) && \text{} \\
& = \inf_{x \succeq 0} \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \mu {x}_{i} \right) - \mu && \text{Component wise form}
\end{align} $$
Again, taking advantage of the component wise form, the solution is given by:
$$ \begin{align}
{x}_{i}^{\ast} = { \left( {y}_{i} - \mu \right) }_{+}
\end{align} $$
Where the solution includes the non negativity constraint by projecting onto $ {\mathbb{R}}_{+} $.
Again, the solution is given by finding the $ \mu $ which satisfies the constraint (pay attention: since the above is an equality constraint, $ \mu $ can have any sign and is not limited to non negativity like $ \lambda $ above).
The objective function (From the KKT) is given by:
$$ \begin{align}
h \left( \mu \right) = \sum_{i = 1}^{n} {x}_{i}^{\ast} - 1 & = \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} - 1
\end{align} $$
The above is a piece wise linear function of $ \mu $ and its derivative is given by:
$$ \begin{align}
\frac{\mathrm{d} }{\mathrm{d} \mu} h \left( \mu \right) & = \frac{\mathrm{d} }{\mathrm{d} \mu} \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} \\
& = \sum_{i = 1}^{n} -{ \mathbf{1} }_{\left\{ {y}_{i} - \mu > 0 \right\}}
\end{align} $$
Hence it can be solved using Newton Iteration.
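A minimal sketch of that Newton iteration (in Python / NumPy rather than the MATLAB of the linked repository; the function name is mine):

```python
import numpy as np

def proj_unit_simplex(y, max_iter=100, tol=1e-12):
    """Project y onto the unit simplex {x : x >= 0, 1^T x = 1} by Newton
    iteration on the piecewise linear residual h(mu) = sum((y - mu)_+) - 1."""
    y = np.asarray(y, dtype=float)
    mu = y.min() - 1.0 / y.size      # start where every term y_i - mu is active
    for _ in range(max_iter):
        h = np.maximum(y - mu, 0.0).sum() - 1.0   # constraint residual h(mu)
        if abs(h) <= tol:
            break
        dh = -np.count_nonzero(y - mu > 0.0)      # derivative of h at mu
        mu -= h / dh                 # Newton step (exact on each linear piece)
    return np.maximum(y - mu, 0.0)
```

Since $ h $ is convex and piece wise linear, the iterates increase monotonically toward the root and the method terminates in finitely many steps.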
I wrote MATLAB code which implements them both; it is available in my Mathematics StackExchange Question 2338491 GitHub repository.
There is a test which compares the result to a reference calculated by CVX.
The objective function is given by:
$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} & \text{} \\
\text{subject to} \quad & {\left( x - c \right)}^{T} W \left( x - c \right) \leq d
\end{align*}$$
Case I - Diagonal Matrix
In this case the matrix $ W $ is diagonal with $ {w}_{ii} \geq 0 $.
Let's assume we know how to solve this.
Remark
I could find a simple iterative method to find $ \lambda $ for this case, but not a closed form solution. Though I'd guess one is doable, since we can change coordinates to turn the ellipsoid into a ball and then map back.
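A sketch of one such iterative method (bisection on the KKT multiplier $ \lambda $; Python / NumPy, names and the exact scheme are my illustration, not the author's code). The per coordinate update comes from stationarity of the Lagrangian of Case I:

```python
import numpy as np

def proj_diag_ellipsoid(y, w, c, d, tol=1e-10):
    """Project y onto {x : sum_i w_i (x_i - c_i)^2 <= d} with w_i >= 0.
    Stationarity of the Lagrangian gives, per coordinate,
        x_i = (y_i + 2 lam w_i c_i) / (1 + 2 lam w_i),
    and lam >= 0 is found by bisection on the decreasing residual g."""
    y, w, c = (np.asarray(a, dtype=float) for a in (y, w, c))
    r2 = w * (y - c) ** 2
    if r2.sum() <= d:                # y is already inside the ellipsoid
        return y.copy()
    g = lambda lam: np.sum(r2 / (1.0 + 2.0 * lam * w) ** 2) - d
    lo, hi = 0.0, 1.0
    while g(hi) > 0.0:               # grow the bracket until g(hi) <= 0
        hi *= 2.0
    while hi - lo > tol:             # plain bisection on lam
        lam = 0.5 * (lo + hi)
        lo, hi = (lam, hi) if g(lam) > 0.0 else (lo, lam)
    lam = 0.5 * (lo + hi)
    return (y + 2.0 * lam * w * c) / (1.0 + 2.0 * lam * w)
```

For example, with $ w = \boldsymbol{1} $, $ c = 0 $, $ d = 1 $ this reduces to projection onto the unit ball.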
Case II - Positive Definite Matrix
In this case the matrix $ W $ is a Positive Semi Definite (PSD) matrix - $ W \succeq 0 $.
Since $ W $ is a PSD matrix it can be written as (Eigen Decomposition):
$$ W = {P}^{T} D P $$
Where $ P $ is a unitary matrix and $ D $ is a diagonal matrix with $ {d}_{ii} \geq 0 $.
Then one could rewrite the problem as:
$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} & \text{} \\
\text{subject to} \quad & {\left( x - c \right)}^{T} {P}^{T} D P \left( x - c \right) \leq d
\end{align*}$$
Defining $ v = P x $, $ z = P y $ and $ e = P c $ one could rewrite the problem as:
$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} & \text{} \\
\text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq d
\end{align*}$$
Now, since $ P $ is a unitary matrix, it preserves the $ {L}_{2} $ norm and is invertible; hence $ \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} = \frac{1}{2} {\left\| P \left( x - y \right) \right\|}_{2}^{2} = \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} $ and the whole problem becomes:
$$\begin{align*}
\arg \min_{v} \quad & \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} & \text{} \\
\text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq d
\end{align*}$$
Where $ D $ is a diagonal matrix, which is exactly Case I, which we assumed we know how to solve.
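The reduction above can be sketched end to end (Python / NumPy; the diagonal Case I solver is inlined as a bisection on the KKT multiplier, which is my assumption since the text leaves Case I abstract):

```python
import numpy as np

def proj_ellipsoid(y, W, c, d, tol=1e-10):
    """Project y onto {x : (x-c)^T W (x-c) <= d} for PSD W: eigendecompose
    W = P^T D P, rotate, solve the diagonal problem (Case I) by bisection
    on the KKT multiplier, and rotate the solution back."""
    y = np.asarray(y, dtype=float)
    eigval, Q = np.linalg.eigh(W)    # W = Q diag(eigval) Q^T, hence P = Q^T
    P = Q.T
    z, e = P @ y, P @ np.asarray(c, dtype=float)
    r2 = eigval * (z - e) ** 2
    if r2.sum() <= d:                # y already satisfies the constraint
        return y.copy()
    g = lambda lam: np.sum(r2 / (1.0 + 2.0 * lam * eigval) ** 2) - d
    lo, hi = 0.0, 1.0
    while g(hi) > 0.0:               # bracket the root of g
        hi *= 2.0
    while hi - lo > tol:             # bisection (g is decreasing in lam)
        lam = 0.5 * (lo + hi)
        lo, hi = (lam, hi) if g(lam) > 0.0 else (lo, lam)
    lam = 0.5 * (lo + hi)
    v = (z + 2.0 * lam * eigval * e) / (1.0 + 2.0 * lam * eigval)
    return P.T @ v                   # back to original coordinates: x = P^T v
```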
Code
I created MATLAB code to solve the problem (the general case).
The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.
Best Answer
This is basically Projection onto the Simplex with some modifications.
The problem is given by:
$$ \begin{alignat*}{3} \arg \min_{x} & \quad & \frac{1}{2} \left\| x - y \right\|_{2}^{2} \\ \text{subject to} & \quad & 0 \leq {x}_{i} \leq \alpha \\ & \quad & \boldsymbol{1}^{T} x = 1 \end{alignat*} $$
The problem is valid for $ \alpha \geq \frac{1}{n} $ otherwise the constraint $ \boldsymbol{1}^{T} x = 1 $ isn't feasible.
For $ \alpha \geq 1 $ the problem matches the Projection onto the Simplex as the upper bound cannot be an active constraint (well, it can be for $ \alpha = 1 $, but then it is implied by the equality and non negativity constraints).
The Lagrangian in that case is given by:
$$ \begin{align} L \left( x, \mu \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \mu \left( \boldsymbol{1}^{T} x - 1 \right) && \text{} \\ \end{align} $$
The trick is to leave the box constraints $ 0 \leq {x}_{i} \leq \alpha $ implicit.
Hence the Dual Function is given by:
$$ \begin{align} g \left( \mu \right) & = \inf_{0 \leq {x}_{i} \leq \alpha} L \left( x, \mu \right) && \text{} \\ & = \inf_{0 \leq {x}_{i} \leq \alpha} \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \mu {x}_{i} \right) - \mu && \text{Component wise form} \end{align} $$
Taking advantage of the component wise form, the solution is given by:
$$ \begin{align} {x}_{i}^{\ast} = { \left( {y}_{i} - \mu \right) }_{0 \leq \cdot \leq \alpha} \end{align} $$
Where the solution enforces the inequality constraints by projecting onto the box $ \mathcal{B} = \left\{ x \mid 0 \leq {x}_{i} \leq \alpha \right\} $.
The solution is given by finding the $ \mu $ which satisfies the constraint (pay attention: since the above is an equality constraint, $ \mu $ can have any sign and is not limited to non negativity like $ \lambda $).
The objective function (From the KKT) is given by:
$$ \begin{align} 0 = h \left( \mu \right) = \sum_{i = 1}^{n} {x}_{i}^{\ast} - 1 & = \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{0 \leq \cdot \leq \alpha} - 1 \end{align} $$
The above is a Piece Wise linear function of $ \mu $.
Since the function is continuous yet not differentiable, due to its piece wise nature, a derivative free root finding method is appropriate. One could use the Bisection Method for instance.
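A sketch of that bisection based root finding for $ h \left( \mu \right) $ (Python / NumPy; the posted code is MATLAB, so the names here are mine):

```python
import numpy as np

def proj_capped_simplex(y, alpha, tol=1e-12):
    """Project y onto {x : 0 <= x_i <= alpha, 1^T x = 1} by bisection on
    mu, the multiplier of the equality constraint (assumes alpha >= 1/n)."""
    y = np.asarray(y, dtype=float)
    h = lambda mu: np.clip(y - mu, 0.0, alpha).sum() - 1.0
    lo = y.min() - alpha             # here every x_i clips to alpha: h >= 0
    hi = y.max()                     # here every x_i clips to 0:     h = -1
    while hi - lo > tol:             # h is continuous and non increasing
        mu = 0.5 * (lo + hi)
        lo, hi = (mu, hi) if h(mu) > 0.0 else (lo, mu)
    return np.clip(y - 0.5 * (lo + hi), 0.0, alpha)
```

Note that for $ \alpha \geq 1 $ the result coincides with the plain simplex projection, as stated above.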
MATLAB Code
I wrote MATLAB code which implements the method with Bisection root finding and verified the implementation vs. CVX. The MATLAB code is available in my StackExchange Mathematics Q3972913 GitHub Repository.