Rigorously show that the maximum variance of a Hermitian matrix is $\left( \frac{h_{max}-h_{min}}{2} \right) ^2$

hermitian-matrices, lagrange-multiplier, optimization

Suppose we have a Hermitian matrix $H$ with eigenvalues $h_i$, and I want to find the maximum value of its variance over unit vectors $\vec{v}$, i.e. of $\vec{v}^{\dagger}H^2\vec{v}-\left( \vec{v}^{\dagger}H\vec{v} \right) ^2$. How can we rigorously show that this maximum equals $\left( \frac{h_{max}-h_{min}}{2} \right)^2$?

I tried Lagrange's multiplier method, which I think nearly succeeds. By the spectral decomposition we can write $H=\sum_i h_i|i\rangle\langle i|$ (let me use $|i\rangle$ to stand for column vectors, with $\langle i|\equiv|i\rangle^\dagger$). The problem then becomes maximizing $$\sum_i{{h_i}^2}\langle v|i\rangle \langle i|v\rangle -\left( \sum_i{h_i}\langle v|i\rangle \langle i|v\rangle \right) ^2$$ over unit vectors $|v\rangle$.
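
As a quick numerical sanity check of this reduction (not part of the argument; it assumes `numpy`, and the variable names are mine), the eigenbasis expression above agrees with the direct form $\langle v|H^2|v\rangle-\langle v|H|v\rangle^2$ for a random Hermitian matrix and a random unit vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Hermitian matrix H = (A + A^dagger)/2 and random unit vector v.
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2
v = rng.normal(size=n) + 1j * rng.normal(size=n)
v /= np.linalg.norm(v)

# Direct form: <v|H^2|v> - <v|H|v>^2.
direct = (v.conj() @ H @ H @ v - (v.conj() @ H @ v) ** 2).real

# Eigenbasis form: sum_i h_i^2 p_i - (sum_i h_i p_i)^2 with p_i = |<i|v>|^2.
h, U = np.linalg.eigh(H)          # columns of U are the eigenvectors |i>
p = np.abs(U.conj().T @ v) ** 2   # p_i = |<i|v>|^2, sums to 1
eigenbasis = np.sum(h**2 * p) - np.sum(h * p) ** 2

print(direct, eigenbasis)         # the two forms agree
assert np.isclose(direct, eigenbasis)
```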

Replacing $\langle v|i\rangle \langle i|v\rangle $ with $p_i$, we can finally state the problem as
$$
\underset{p_i}{\text{maximize}}\quad \sum_i{{h_i}^2}p_i-\left( \sum_i{h_i}p_i \right) ^2
\\
\text{s.t.}\quad \sum_i{p_i}=1
$$
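
As a rough numerical illustration of this reduced problem (a sketch, not a proof; it assumes `numpy`, and the eigenvalues chosen are arbitrary), sampling the probability simplex suggests the maximum is $\left(\frac{h_{max}-h_{min}}{2}\right)^2$:

```python
import numpy as np

rng = np.random.default_rng(3)

h = np.array([-1.0, 0.3, 0.7, 2.0])          # some fixed eigenvalues h_i

def objective(p):
    # sum_i h_i^2 p_i - (sum_i h_i p_i)^2
    return np.sum(h**2 * p) - np.sum(h * p) ** 2

# Sample many points p on the probability simplex and keep the best value found.
best = max(objective(p) for p in rng.dirichlet(np.ones(len(h)), size=200_000))

print(best)                                   # stays below, and approaches, the claimed maximum
print(((h.max() - h.min()) / 2) ** 2)         # ((h_max - h_min)/2)^2 = 2.25 here
```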

Now we can apply Lagrange's multiplier method. Introducing a multiplier $\lambda$, the Lagrangian is $\sum_i{{h_i}^2}p_i-\left( \sum_i{h_i}p_i \right) ^2-\lambda \left( \sum_i{p_i}-1 \right) $; setting its derivative w.r.t. each $p_k$ to zero gives the equations:
$$
{h_k}^2-2\left( \sum_i{h_ip_i} \right) h_k=\lambda ,\forall k
\\
\sum_i{p_i}=1
$$
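
A small symbolic check of these stationarity conditions (assuming `sympy` is available; purely a verification of the derivative, with my own variable names):

```python
import sympy as sp

# Symbolic check of the stationarity conditions for three eigenvalues.
h = sp.symbols('h1:4', real=True)            # h1, h2, h3
p = sp.symbols('p1:4', real=True)            # p1, p2, p3
lam = sp.Symbol('lambda', real=True)

mean = sum(hi * pi for hi, pi in zip(h, p))
L = sum(hi**2 * pi for hi, pi in zip(h, p)) - mean**2 - lam * (sum(p) - 1)

for hk, pk in zip(h, p):
    # dL/dp_k = 0  <=>  h_k^2 - 2*(sum_i h_i p_i)*h_k = lambda
    eq = sp.expand(sp.diff(L, pk))
    print(sp.simplify(eq - (hk**2 - 2 * mean * hk - lam)))   # prints 0 for each k
```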

We then solve these equations; perhaps the result simplifies once we notice that $\lambda$ is a single value that does not change with $h_k$.

For example, with three variables $p_1,\dots,p_3$, we can write the system in matrix form:
$$
\left( \begin{matrix}
1& 1& 1& 0\\
2h_1h_1& 2h_2h_1& 2h_3h_1& 1\\
2h_1h_2& 2h_2h_2& 2h_3h_2& 1\\
2h_1h_3& 2h_2h_3& 2h_3h_3& 1\\
\end{matrix} \right) \left( \begin{array}{c}
p_1\\
p_2\\
p_3\\
\lambda\\
\end{array} \right) =\left( \begin{array}{c}
1\\
{h_1}^2\\
{h_2}^2\\
{h_3}^2\\
\end{array} \right)
$$
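
As a quick check that this matrix form encodes the equations above (a throwaway numerical sketch; `numpy` assumed, all names mine), multiplying the matrix by an arbitrary $(p_1,p_2,p_3,\lambda)$ reproduces $\sum_i p_i$ and $2h_k\sum_i h_i p_i + \lambda$ row by row:

```python
import numpy as np

rng = np.random.default_rng(1)
h = rng.normal(size=3)      # arbitrary eigenvalues h_1, h_2, h_3
p = rng.normal(size=3)      # arbitrary p_1, p_2, p_3 (no constraints for this check)
lam = rng.normal()

# Matrix and right-hand side exactly as written above.
A = np.block([
    [np.ones((1, 3)),     np.zeros((1, 1))],
    [2 * np.outer(h, h),  np.ones((3, 1))],
])
b = np.concatenate(([1.0], h**2))

# Row 0 should give sum_i p_i; row k should give 2*h_k*(sum_i h_i p_i) + lambda.
x = np.concatenate((p, [lam]))
lhs = A @ x
expected = np.concatenate(([p.sum()], 2 * h * (h @ p) + lam))
assert np.allclose(lhs, expected)
print(lhs, b)               # the stationarity system asks for x with A x = b
```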

Is there some easier way to solve it? I can't directly see the answer, and Gaussian elimination seems too complicated once there are more variables.

Best Answer

Without loss of generality we may assume that $H$ is real diagonal. Writing $\lambda_j$ for its (real) eigenvalues and taking $v$ to be a unit vector, you have $$ F(v) = \sum_{j} \lambda_j^2 v_j^2 - \Big(\sum_j \lambda_j v_j^2\Big)^2 = \mathrm{Var}(X), $$ where $X$ is a random variable taking the value $\lambda_j$ with probability $v_j^2$. Evidently $X \in [\lambda_{\rm min}, \lambda_{\rm max}]$, so with $m = \frac{\lambda_{\rm min}+\lambda_{\rm max}}{2}$ we get $$ \mathrm{Var}(X) \leq \mathbb{E}\big[(X-m)^2\big] \leq \Big(\frac{\lambda_{\rm max} - \lambda_{\rm min}}{2}\Big)^2, $$ since the variance minimizes $\mathbb{E}[(X-c)^2]$ over $c$ and $|X-m| \leq \frac{\lambda_{\rm max}-\lambda_{\rm min}}{2}$ (this is Popoviciu's inequality). Therefore $F(v) \leq \Big(\frac{\lambda_{\rm max} - \lambda_{\rm min}}{2}\Big)^2$ for every unit vector $v$. The bound is attained by putting weight $v_j^2 = \tfrac12$ on one eigenvector of $\lambda_{\rm min}$ and one of $\lambda_{\rm max}$, so that $X$ takes the two extreme values with probability $\tfrac12$ each; hence the maximum variance is exactly $\Big(\frac{\lambda_{\rm max} - \lambda_{\rm min}}{2}\Big)^2$, as required.
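
A small numerical illustration of this answer (not a proof; it assumes `numpy`, and the helper `variance` and the construction of the extremal vector are my own): random unit vectors never exceed the bound, and an equal superposition of eigenvectors for $\lambda_{\rm min}$ and $\lambda_{\rm max}$ attains it.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random Hermitian matrix and its spectrum.
n = 5
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2
evals, evecs = np.linalg.eigh(H)
bound = ((evals.max() - evals.min()) / 2) ** 2

def variance(v):
    """<v|H^2|v> - <v|H|v>^2 for a unit vector v."""
    return ((v.conj() @ H @ H @ v) - (v.conj() @ H @ v) ** 2).real

# Random unit vectors stay below the bound ...
for _ in range(1000):
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    v /= np.linalg.norm(v)
    assert variance(v) <= bound + 1e-12

# ... and the equal superposition of the extreme eigenvectors attains it.
v_star = (evecs[:, 0] + evecs[:, -1]) / np.sqrt(2)   # eigh sorts eigenvalues ascending
print(variance(v_star), bound)                       # the two numbers agree
```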
