Condition for positive semidefinite operator in Hilbert space

hilbert-spaces, linear-algebra, positive-semidefinite, quantum-mechanics, quantum-computation

In Nielsen and Chuang exercise 2.64, the following problem is given:

Suppose Bob is given a quantum state chosen from a set $\{ \lvert \psi_1 \rangle, \ldots , \lvert \psi_m \rangle \}$ of linearly independent states [in some Hilbert space]. Construct a POVM $\{E_1, E_2, \ldots, E_{m+1} \}$ such that if outcome $E_i$ occurs, $1 \le i \le m$, then Bob knows with certainty that he was given the state $\lvert \psi_i \rangle$. (The POVM must be such that $\langle \psi_i | E_i | \psi_i\rangle > 0$ for each i.)

Here a POVM is a positive operator-valued measure, i.e. a set $\{E_i\}$ of positive (semidefinite) operators satisfying the completeness condition
$$ \sum_i E_i = I. $$
Note that in this problem $\lvert \psi_i \rangle$ are not assumed to be orthogonal.

My approach to this problem is to construct each $E_i$ for $i \le m$ as a scaled projector:
$$ E_i = \alpha_i P_i, $$
where $P_i$ is the projector onto the orthogonal complement of $\mathrm{Span}(\lvert \psi_1 \rangle, \ldots, \lvert \hat{\psi_i} \rangle, \ldots, \lvert \psi_m \rangle)$ in $\mathrm{Span}(\lvert \psi_1 \rangle, \ldots, \lvert \psi_m \rangle)$ satisfying $\langle \psi_i | P_i | \psi_i \rangle > 0$, and $\alpha_i$ is some positive real number.

Then we have the desired properties that each $E_i$ is positive, and $\langle \psi_i | E_j | \psi_i \rangle = \delta_{ij} \beta_i$ for some real positive $\beta_i$. Indeed, $E_i$ has eigenvalues $\alpha_i$ and $0$; $E_j | \psi_i \rangle = 0$ if $i \ne j$ as $P_j$ is a projector onto a space orthogonal to $\lvert \psi_i \rangle$; and
$$\langle \psi_i | E_i | \psi_i \rangle = \alpha_i \cos^2 \theta_i \langle \psi_i | \psi_i \rangle = \beta_i > 0, $$
where $\theta_i$ is the angle between $\lvert \psi_i \rangle$ and $P_i \lvert \psi_i \rangle$, so that $\langle \psi_i | P_i | \psi_i \rangle = \lVert P_i \lvert \psi_i \rangle \rVert^2 = \cos^2 \theta_i \, \langle \psi_i | \psi_i \rangle$.
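To make the construction concrete, here is a small numerical sketch (assuming NumPy; the two non-orthogonal qubit states are a hypothetical example, not taken from the exercise). Each $P_i$ is built as the projector onto the full span minus the projector onto the span of the other states:

```python
import numpy as np

def projector(vectors):
    """Orthogonal projector onto the span of the given (linearly independent) vectors."""
    q, _ = np.linalg.qr(np.column_stack(vectors))
    return q @ q.conj().T

# Hypothetical example states: |0> and (|0> + |1>)/sqrt(2), non-orthogonal
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
m = len(psi)
P_all = projector(psi)  # projector onto Span(psi_1, ..., psi_m)

P = []
for i in range(m):
    others = [psi[j] for j in range(m) if j != i]
    # complement of the others' span, taken inside the full span
    P.append(P_all - projector(others))

# P_i annihilates every psi_j with j != i, but has positive overlap with psi_i
for i in range(m):
    for j in range(m):
        print(i, j, round((psi[j].conj() @ P[i] @ psi[j]).real, 4))
```

For these two states each diagonal overlap $\langle \psi_i | P_i | \psi_i \rangle$ comes out to $1/2 = \cos^2(45^\circ)$, matching the angle formula above.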

Then for any choice of $\{\alpha_i\}$ we have constructed positive $E_i$ for $i \le m$, but this set of positive operators is not necessarily complete. To complete the set, let $E_{m+1} = I - \sum_{i \le m} E_i$. As long as we can ensure $E_{m+1}$ is positive, we have constructed the desired POVM. However, I am not sure how to assign $\{\alpha_i\}$ in order to do so.

Naively, I can't see why we don't have a great deal of freedom over $\alpha_i$. As a sum of Hermitian operators, $E_{m+1}$ is automatically Hermitian. And $\langle \psi_i | E_{m+1} | \psi_i \rangle = \langle \psi_i | \psi_i \rangle - \beta_i \ge 0$ so long as $\alpha_i \cos^2 \theta_i \le 1$. However, this so-called POVM would be able to distinguish the states $\lvert \psi_i \rangle$ perfectly if we were allowed to choose $\alpha_i = 1/\cos^2 \theta_i$, in violation of the impossibility of perfectly distinguishing non-orthogonal states (which would permit cloning, as shown in Nielsen–Chuang, exercise 1.2, with stronger bounds implicit in Box 2.3).

I feel I must be mistaken about the criterion for $E_{m+1}$ to be positive: it does not seem sufficient to ensure that $\langle \psi_i | E_{m+1} | \psi_i \rangle \ge 0$ for all $i$, even though the $\lvert \psi_i \rangle$ are linearly independent, since positivity requires $\langle \phi | E_{m+1} | \phi \rangle \ge 0$ for every $\lvert \phi \rangle$.

Is this a fruitful approach for this problem? If so, what conditions on $\{\alpha_i\}$ would ensure $E_{m+1}$ is positive? What have I missed?

Best Answer

Your approach seems perfectly fine and probably what the authors intended.

Note that there is no need for us to find the "best" (i.e. largest) $\alpha_i$ such that $I - \sum_i E_i$ is positive semidefinite. Observe that $\lambda$ is an eigenvalue of $M$ if and only if $1 - \lambda$ is an eigenvalue of $I - M$. Thus, for a Hermitian matrix $M$, $I - M$ is positive semidefinite if and only if all eigenvalues of $M$ are at most $1$. Moreover, if $M$ is positive semidefinite, then the largest eigenvalue of $M$ equals $\|M\|$, the "spectral norm" of $M$. Now, by the triangle inequality, $$ \left\|\sum_i P_i \right\| \leq \sum_{i} \|P_i\| = \sum_i 1 = m. $$ Thus, if we take $\alpha_i = \frac 1m$ for all $i$, we find that $$ \left\|\sum_i E_i \right\| = \left\|\frac 1m \sum_i P_i \right\| = \frac 1m \left\|\sum_i P_i \right\| \leq 1, $$ so that $I - \sum_i E_i$ is positive semidefinite, as desired.
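A quick numerical sanity check of the $\alpha_i = 1/m$ choice (a NumPy sketch; the two non-orthogonal states are a hypothetical example, not from the text):

```python
import numpy as np

def projector(vectors):
    """Orthogonal projector onto the span of the given vectors."""
    q, _ = np.linalg.qr(np.column_stack(vectors))
    return q @ q.conj().T

# Hypothetical non-orthogonal states |0> and (|0> + |1>)/sqrt(2)
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
m = len(psi)
P_all = projector(psi)
P = [P_all - projector([psi[j] for j in range(m) if j != i]) for i in range(m)]

E = [p / m for p in P]        # alpha_i = 1/m
E_last = np.eye(2) - sum(E)   # E_{m+1} = I - sum_i E_i

# All eigenvalues of E_{m+1} should be >= 0 (up to numerical noise)
print(np.linalg.eigvalsh(E_last))
```

The printed eigenvalues are all nonnegative, confirming that $E_{m+1}$ is positive semidefinite for this example.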


For another perspective, note that for a Hermitian matrix $M$, $I - M$ is positive semidefinite iff for all $|\phi\rangle$ we have $$ \langle \phi|I - M|\phi \rangle \geq 0 \iff \langle \phi |M|\phi \rangle \leq \langle \phi |\phi \rangle. $$ On the other hand, $\langle \phi |P_i |\phi\rangle \leq \langle \phi |\phi \rangle$ for each $i$, since a projector never lengthens a vector, so that $$ \left\langle \phi \left| \sum_i P_i \right| \phi \right\rangle = \sum_i \langle \phi |P_i|\phi \rangle \leq m \, \langle \phi|\phi \rangle. $$ By a similar argument to the previous approach, we can conclude that taking $\alpha_i = \frac 1m$ ensures that $I - \sum_i E_i$ is positive semidefinite.
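The quadratic-form bound $\langle \phi | \sum_i P_i | \phi \rangle \leq m \langle \phi | \phi \rangle$ can also be spot-checked on random vectors (a NumPy sketch with a hypothetical pair of non-orthogonal states; none of the names come from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

def projector(vectors):
    """Orthogonal projector onto the span of the given vectors."""
    q, _ = np.linalg.qr(np.column_stack(vectors))
    return q @ q.conj().T

# Hypothetical states |0> and (|0> + |1>)/sqrt(2)
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
m = len(psi)
P_all = projector(psi)
S = sum(P_all - projector([psi[j] for j in range(m) if j != i]) for i in range(m))

# <phi| S |phi> / <phi|phi> should never exceed m, for any phi
ratios = []
for _ in range(1000):
    phi = rng.normal(size=2) + 1j * rng.normal(size=2)
    ratios.append((phi.conj() @ S @ phi).real / (phi.conj() @ phi).real)
print(max(ratios))
```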


If you were interested in finding the maximal values of $\alpha_i$ (say, in the sense of maximizing $\sum_i \alpha_i$) such that $E_{m+1}$ is still positive semidefinite, you could do so via a dual semidefinite program. In particular, if $\alpha$ and $b$ are the column vectors $\alpha = (\alpha_1,\alpha_2,\dots,\alpha_m)$ and $b = (1,1,\dots,1)$, then we aim to solve the optimization problem $$ \max_{\alpha \in \Bbb R^m} b^\top \alpha \quad \text{subject to } \quad \sum_{i=1}^m \alpha_i P_i \preceq I. $$ Suffice it to say, such problems are highly non-trivial and typically solved with computer assistance.
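Short of a full SDP solver (e.g. CVXPY), the special case of a uniform weight $\alpha_1 = \dots = \alpha_m = \alpha$ is easy: the constraint $\alpha \sum_i P_i \preceq I$ holds exactly when $\alpha \leq 1/\lambda_{\max}\!\left(\sum_i P_i\right)$, which one eigendecomposition gives directly. A NumPy sketch with a hypothetical pair of non-orthogonal states:

```python
import numpy as np

def projector(vectors):
    """Orthogonal projector onto the span of the given vectors."""
    q, _ = np.linalg.qr(np.column_stack(vectors))
    return q @ q.conj().T

# Hypothetical states |0> and (|0> + |1>)/sqrt(2)
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
m = len(psi)
P_all = projector(psi)
S = sum(P_all - projector([psi[j] for j in range(m) if j != i]) for i in range(m))

# Largest uniform alpha keeping alpha * S <= I
alpha_max = 1.0 / np.linalg.eigvalsh(S).max()
print(alpha_max)
```

For this example $\alpha_{\max}$ comes out strictly larger than the always-safe choice $1/m$, showing that $1/m$ is sufficient but generally not optimal.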

This of course is much simpler in the case where the $|\psi_i\rangle$ are mutually orthogonal, which would imply that the $E_i$ are mutually orthogonal projections, which would imply that the constraint is equivalent to $\alpha_i \leq 1$ for all $i$.
