Dual Problem Solution
For the problem $ \min_{x} {x}^{T} x $ subject to $ x \preceq a $ and $ \boldsymbol{1}^{T} x = b $, the Lagrangian is given by:
$$ L \left( x, \lambda, \nu \right) = {x}^{T} x + {\lambda}^{T} \left( x - a \right) + \nu \left( \boldsymbol{1}^{T} x - b \right) = {x}^{T} x + \left(
\lambda + \nu \boldsymbol{1} \right)^{T} x -{\lambda}^{T} a - \nu b $$
The Dual Function is given by:
$$ g \left( \lambda, \nu \right) = \inf_{x} L \left( x, \lambda, \nu \right) $$
Looking at the term related to $ x $:
$$ \inf_{x} {x}^{T} x + \left( \lambda + \nu \boldsymbol{1} \right)^{T} x $$
This is a convex quadratic in $ x $, with its minimizer given by:
$$ {x}^{\ast} = -\frac{1}{2} \left(
\lambda + \nu \boldsymbol{1} \right) $$
Its minimum is given by
$$ {{x}^{\ast}}^{T} {x}^{\ast} + \left(
\lambda + \nu \boldsymbol{1} \right)^{T} {x}^{\ast} = -\frac{1}{4} \left(
\lambda + \nu \boldsymbol{1} \right)^{T} \left( \lambda + \nu \boldsymbol{1} \right) $$
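As a quick numeric sanity check of the closed forms above (a minimal Python sketch with made-up random data), the gradient of $ {x}^{T} x + \left( \lambda + \nu \boldsymbol{1} \right)^{T} x $ vanishes at $ {x}^{\ast} $ and the minimum matches $ -\frac{1}{4} \left\| \lambda + \nu \boldsymbol{1} \right\|_{2}^{2} $:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
lam = rng.random(n)            # hypothetical multipliers, lambda >= 0
nu = rng.standard_normal()
c = lam + nu * np.ones(n)      # c = lambda + nu * 1

x_star = -0.5 * c              # claimed minimizer
grad = 2 * x_star + c          # gradient of x^T x + c^T x at x_star
val = x_star @ x_star + c @ x_star

print(np.allclose(grad, 0), np.isclose(val, -0.25 * c @ c))  # → True True
```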
Hence the Dual Problem is given by:
\begin{align*}
\text{maximize} & \quad & -\frac{1}{4} \left(
\lambda + \nu \boldsymbol{1} \right)^{T} \left( \lambda + \nu \boldsymbol{1} \right) - {\lambda}^{T} a - \nu b \\
\text{subject to} & \quad & \lambda \succeq 0
\end{align*}
The objective is concave in $ \left( \lambda, \nu \right) $ and the only constraint is linear, hence the dual is a convex problem.
It can be cast as a Quadratic Program by writing:
$$ \left(
\lambda + \nu \boldsymbol{1} \right)^{T} \left( \lambda + \nu \boldsymbol{1} \right) = {\left\| E v \right\|}_{2}^{2} = {v}^{T} {E}^{T} E v = {v}^{T} H v $$
Where $ v = {\left[ \lambda, \nu \right]}^{T}, \; E = \left[ I, \boldsymbol{1} \right], \; H = {E}^{T} E $. Then the problem becomes:
$$
\begin{align*}
\text{minimize} & \quad & \frac{1}{4} {v}^{T} H v + {v}^{T} f \\
\text{subject to} & \quad & A v \preceq 0
\end{align*}
$$
Where $ A = - \left[ I, \boldsymbol{0} \right], \; f = {\left[ a, b \right]}^{T} $.
The above can be solved directly by MATLAB's quadprog(). The primal solution is then recovered as $ {x}^{\ast} = -0.5 E v $.
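As an illustration, here is a minimal sketch of the same recipe in Python with SciPy's general-purpose minimizers standing in for quadprog (the data $ a $, $ b $ are made up). It solves the dual QP, recovers $ {x}^{\ast} = -0.5 E v $, and cross-checks against a direct solve of the primal:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data for: minimize x^T x  s.t.  x <= a, 1^T x = b
a = np.array([1.0, 0.5, -0.3, 2.0])
b = -1.0
n = a.size

# Dual QP pieces: v = [lambda; nu], E = [I, 1], H = E^T E, f = [a; b]
E = np.hstack([np.eye(n), np.ones((n, 1))])
H = E.T @ E
f = np.concatenate([a, [b]])

# minimize (1/4) v^T H v + f^T v  s.t.  lambda >= 0 (nu is free)
res = minimize(lambda v: 0.25 * v @ H @ v + f @ v,
               np.zeros(n + 1),
               bounds=[(0, None)] * n + [(None, None)])
x_star = -0.5 * E @ res.x  # recover the primal solution

# Cross-check: solve the primal directly with SLSQP
cons = [{"type": "ineq", "fun": lambda x: a - x},        # x <= a
        {"type": "eq",   "fun": lambda x: x.sum() - b}]  # 1^T x = b
res_p = minimize(lambda x: x @ x, np.zeros(n),
                 constraints=cons, method="SLSQP")
print(x_star, res_p.x)  # the two solutions agree (strong duality)
```

Because the primal is convex with affine constraints, strong duality holds and the dual route reproduces the primal minimizer.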
Sure, in any problem with a single inequality constraint
$$\min_x\ f(x) \quad \mathrm{s.t.} \quad g(x) \leq 0,$$
whether convex or not, the solutions will be of two types:
1. unconstrained minimizers of $f(x)$ with the inequality constraint inactive, $g(x)<0$;
2. constrained minimizers with the inequality constraint active, $g(x)=0$.
The geometry here is fairly intuitive: if $g(x)<0$ and yet you are not at an unconstrained minimum of $f(x)$, you can descend along $-\nabla f$ until you either hit the boundary $g=0$ or reach a minimizer of $f(x)$. This argument can be made rigorous with enough regularity assumptions on $f$ and $g$.
If you can prove there are no solutions of type (1), then all solutions must be of type (2).
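To make the dichotomy concrete, a toy sketch (made-up problem $f(x)=x^2$, $g(x)=1-x$) in which type (1) is ruled out, so the minimizer must land on the boundary:

```python
from scipy.optimize import minimize

# Toy example: f(x) = x^2, g(x) = 1 - x <= 0.
# The unconstrained minimizer x = 0 is infeasible (type 1 is ruled out),
# so the solution must sit on the boundary g(x) = 0, i.e. x = 1 (type 2).
res = minimize(lambda x: x[0] ** 2, x0=[2.0],
               # SciPy's "ineq" convention is fun(x) >= 0,
               # so g(x) = 1 - x <= 0 becomes x - 1 >= 0
               constraints=[{"type": "ineq", "fun": lambda x: x[0] - 1.0}])
print(res.x)  # ≈ [1.0]
```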
Best Answer
To use a standard Lagrangian approach, assume everything is real; in the complex case $x^HMy$ is complex and cannot be maximized directly, and if it is replaced by $\operatorname{Re}(x^HMy)$, one may forget the complex structure and double the dimension from complex to real, ...
... the Lagrange function is
\begin{align*}
L(M,\Lambda) &= \left\| M A - B \right\|_F^2 - x^T M y - \operatorname{tr}\left( \Lambda \left( M^T M - I \right) \right) \\
&= \left\| A \right\|_F^2 + \left\| B \right\|_F^2 - 2 \operatorname{tr}\left( M A B^T \right) - \operatorname{tr}\left( y x^T M \right) - \operatorname{tr}\left( \Lambda \left( M^T M - I \right) \right)
\end{align*}
with derivative
$$ 0 = \frac{\partial L}{\partial M} = -2 A B^T - y x^T - \left( \Lambda + \Lambda^T \right) M^T $$
or
$$ M \left( \Lambda + \Lambda^T \right) = -2 B A^T - x y^T $$
There is no unique solution to this equation; one special solution can be found by noting that if $\Lambda + \Lambda^T$ is positive definite, then the left side is a polar decomposition of the right side, which in turn can be computed using the SVD of the right side.
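A sketch of that last step in Python (shapes and data are made up, and $M$ is taken square for simplicity): form the right-hand side $C = -2BA^T - xy^T$ and read off the orthogonal polar factor from its SVD, $C = U \Sigma V^T \Rightarrow M = U V^T$:

```python
import numpy as np

# Hypothetical data: M is n x n, A and B are n x k, x and y are length-n vectors.
rng = np.random.default_rng(1)
n, k = 3, 5
A = rng.standard_normal((n, k))
B = rng.standard_normal((n, k))
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Right-hand side of M (Lambda + Lambda^T) = -2 B A^T - x y^T
C = -2 * B @ A.T - np.outer(x, y)

# Polar decomposition via SVD: C = U S V^T = (U V^T)(V S V^T),
# so the orthogonal factor is M = U V^T and the PSD factor is V S V^T
U, S, Vt = np.linalg.svd(C)
M = U @ Vt

print(np.allclose(M.T @ M, np.eye(n)))  # → True: M is orthogonal
```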