Once I fixed a crucial typo, some things became clearer. Since we're assuming $\Phi+\Psi$ is positive definite, we have $\langle (\Phi+\Psi)e_i,e_i\rangle > 0$ for $i=1,2$, so
$\langle\Phi(e_i),e_i\rangle > -\langle\Psi(e_i),e_i\rangle$ for $i=1,2$. "Therefore," so to speak,
$$\langle\Phi(e_1),e_1\rangle\langle\Phi(e_2),e_2\rangle>\langle\Psi(e_1),e_1\rangle\langle\Psi(e_2),e_2\rangle,$$
establishing inequality (1). This is the proof the authors intended, but of course it's wrong unless we are assuming $\Psi$ is negative definite here, so that both right-hand sides are positive and we can multiply the inequalities.
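To see concretely that the multiplication step really does fail without negative definiteness, here is a small numerical counterexample of my own (not from the original argument), with $\Phi$ and $\Psi$ both positive definite:

```python
import numpy as np

# Phi positive definite, Psi ALSO positive definite (not negative definite),
# so Phi + Psi is positive definite -- the lemma's stated hypothesis holds.
Phi = np.diag([1.0, 1.0])
Psi = np.diag([3.0, 3.0])

# <(Phi+Psi)e_i, e_i> > 0 for i = 1, 2, as required.
assert np.all(np.linalg.eigvalsh(Phi + Psi) > 0)

# Yet the "multiplied" inequality
#   <Phi e1,e1><Phi e2,e2> > <Psi e1,e1><Psi e2,e2>
# fails: 1*1 = 1 is not greater than 3*3 = 9.
lhs = Phi[0, 0] * Phi[1, 1]
rhs = Psi[0, 0] * Psi[1, 1]
print(lhs > rhs)  # False
```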
Equality (2) is just the computation of $\det\Psi$ using a matrix representation with respect to the basis $\{e_1,e_2\}$.
Now that we've narrowed things down to assuming that $\Phi$ is positive definite and $\Psi$ is negative definite, let's look at (3). This also follows from a determinant computation:
\begin{align*}
\det\Phi&=\langle \Phi(u_1),u_1\rangle \langle \Phi(u_2),u_2\rangle - \langle \Phi(u_1),u_2\rangle^2\\
\det\Psi &= \langle \Psi(u_1),u_1\rangle\langle \Psi(u_2),u_2\rangle -\langle \Psi(u_1),u_2\rangle^2.
\end{align*}
Since $\det\Phi=\det\Psi$, substituting the first two equalities, we get
$$\langle \Phi(u_1),u_1\rangle \langle \Phi(u_2),u_2\rangle - \langle \Phi(u_1),u_2\rangle^2 = -\langle \Psi(u_1),u_1\rangle \langle \Phi(u_2),u_2\rangle - \langle \Psi(u_1),u_2\rangle^2 = \langle \Psi(u_1),u_1\rangle\langle \Psi(u_2),u_2\rangle -\langle \Psi(u_1),u_2\rangle^2,$$
and so $\langle\Psi(u_2),u_2\rangle = -\langle \Phi(u_2),u_2\rangle$, as they claimed (here we cancel the factor $\langle\Psi(u_1),u_1\rangle$, which is nonzero since $\Psi$ is negative definite).
Let me reiterate that the lemma is false as stated. If we assume both $\Phi$ and $\Psi$ are positive definite, then we in fact should conclude that $\det(\Phi-\Psi)\le 0$ with equality holding iff $\Phi=\Psi$. If we assume (as apparently these authors meant to) that $\Phi$ is positive definite and $\Psi$ is negative definite, then we conclude that $\det(\Phi+\Psi)\le 0$ with equality holding iff $\Phi=-\Psi$.
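As a sanity check on the corrected statement, here is a randomized numerical experiment of my own. It assumes, as the surrounding discussion does, that $\det\Phi=\det\Psi$ (recall that the determinant of a $2\times2$ negative definite matrix is positive):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(rng):
    """Random 2x2 symmetric positive definite matrix."""
    M = rng.standard_normal((2, 2))
    return M @ M.T + 1e-3 * np.eye(2)

for _ in range(1000):
    Phi = random_spd(rng)
    Theta = random_spd(rng)
    # Rescale so det(Theta) = det(Phi); for 2x2, det(c*M) = c^2 det(M).
    Theta *= np.sqrt(np.linalg.det(Phi) / np.linalg.det(Theta))
    # Psi = -Theta is negative definite with det(Psi) = det(Theta) = det(Phi).
    Psi = -Theta
    # Corrected conclusion: det(Phi + Psi) <= 0.
    assert np.linalg.det(Phi + Psi) <= 1e-9
```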
If $A=USV^\ast$ is a singular value decomposition of a non-normal traceless $2\times2$ matrix $A$, then $V^\ast U$ must possess a zero diagonal.
Write $-\det(A)$ in polar form as $de^{i\theta}$. By dividing $A$ by $e^{i\theta/2}$ and by a change of orthonormal basis, we may assume that $-\det(A)=d\ge0$ and $V=I$. We want to show that $U$ has a zero diagonal.
Since $A$ has a zero trace, the Cayley–Hamilton theorem gives $A^2 = \operatorname{tr}(A)A - \det(A)I = dI$. Therefore $USUS=dI$.
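The identity $A^2=dI$ is quick to confirm numerically (a sketch of my own):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random traceless 2x2 complex matrix.
A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
A[1, 1] = -A[0, 0]  # force trace zero

# Cayley-Hamilton: A^2 - tr(A) A + det(A) I = 0; with tr(A) = 0,
# this reduces to A^2 = -det(A) I = d I.
d = -np.linalg.det(A)
assert np.allclose(A @ A, d * np.eye(2))
```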
If $A$ is singular, then $SUS=0$. Since $A$ is not normal, $A\ne0$, so $S=\operatorname{diag}(s,0)$ for some $s>0$. The equality $SUS=0$ thus implies that $u_{11}=0$. As the columns of $U$ are orthonormal, $u_{11}=0$ forces $|u_{21}|=1$, and orthogonality of the two columns then gives $u_{22}=0$. Hence $U$ has a zero diagonal.
If $A$ is nonsingular, then $d>0$. From $USUS=dI$, we get $(USU^\ast)U^2 = \left(dS^{-1}\right)(I)$. Both $USU^\ast$ and $dS^{-1}$ are positive definite, so by the uniqueness of the polar decomposition of a nonsingular matrix, $U^2=I$. As $U\ne\pm I$ (otherwise $A=\pm S$ would be normal), the spectrum of $U$ is $\{1,-1\}$, so $\operatorname{tr}(U)=u_{11}+u_{22}=0$. If the diagonal of $U$ were nonzero, then $0=\operatorname{tr}(A)=\operatorname{tr}(US)=u_{11}s_1+u_{22}s_2=u_{11}(s_1-s_2)$ would force $s_1=s_2$, so $S$ would be a scalar matrix and $A=US$ would be normal, a contradiction. Therefore $U$ has a zero diagonal.
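The lemma is easy to test numerically (my own sketch; note that `np.linalg.svd` returns $V^\ast$ as `Vh`, so $V^\ast U$ is `Vh @ U`):

```python
import numpy as np

rng = np.random.default_rng(2)

def vstar_u_diagonal(A):
    """Return the diagonal of V* U for an SVD A = U S V*."""
    U, s, Vh = np.linalg.svd(A)
    return np.diag(Vh @ U)

# Singular case from the proof: A = [[0, 1], [0, 0]] is traceless, non-normal.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
assert np.allclose(vstar_u_diagonal(A), 0)

# Random traceless complex matrices (almost surely nonsingular, non-normal).
for _ in range(100):
    A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    A[1, 1] = -A[0, 0]  # force trace zero
    if np.allclose(A @ A.conj().T, A.conj().T @ A):
        continue  # skip the (measure-zero) normal case
    assert np.allclose(vstar_u_diagonal(A), 0)
```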
The chain of inequalities makes sense once we add in the following two computations.
Proof: Noting that $x$ and $y$ are unit vectors, we compute $$ \|x - y\|^2 = (x-y)^*(x-y) = x^*x - x^*y - y^*x + y^*y \\ = 1 - \langle x, y\rangle - \langle y,x \rangle + 1 = 2(1 - \operatorname{Re}\langle x,y \rangle) $$ So, we have $\|x - y\| = \sqrt{2}\sqrt{1 - \operatorname{Re}\langle x,y \rangle}$.
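This identity checks out numerically (a sketch of my own; `np.vdot(x, y)` computes $x^\ast y = \langle x,y\rangle$ in the convention used here):

```python
import numpy as np

rng = np.random.default_rng(3)

for _ in range(100):
    # Random complex unit vectors.
    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    x /= np.linalg.norm(x)
    y /= np.linalg.norm(y)

    inner = np.vdot(x, y)  # <x, y> = x* y
    # ||x - y|| = sqrt(2) * sqrt(1 - Re<x, y>)
    assert np.isclose(np.linalg.norm(x - y),
                      np.sqrt(2) * np.sqrt(1 - inner.real))
```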
On the other hand, $A = xx^* - yy^*$ is a Hermitian matrix with trace zero and rank at most $2$, which means that its nonzero eigenvalues are $\pm \lambda$ for some $\lambda\ge0$. We compute $$ \begin{align*} 2\lambda^2 &= \lambda^2 + (-\lambda)^2 = \operatorname{Tr}(A^2) = \operatorname{Tr}[(xx^* - yy^*)^2] \\ & = \operatorname{Tr}[xx^*xx^* - xx^*yy^* - yy^*xx^* + yy^*yy^*] \\ & = \operatorname{Tr}[x^*xx^*x - x^*yy^*x - y^*xx^*y + y^*yy^*y] \\ &= \langle x,x\rangle^2 - 2|\langle x,y\rangle|^2 + \langle y,y \rangle^2 \\ & = 2 - 2|\langle x,y\rangle|^2 \end{align*} $$ We thereby conclude that $\lambda = \sqrt{1 - |\langle x,y \rangle|^2}$. Thus, we compute $$ \frac 12 \operatorname{Tr}|xx^* - yy^*| = \frac 12 (2 \lambda) = \sqrt{1 - |\langle x,y \rangle|^2}. $$
Thus, it suffices to show that $$ \sqrt{2}\sqrt{1 - \operatorname{Re}\langle x,y \rangle} \geq \sqrt{1 - |\langle x,y \rangle|^2}. $$ Both sides are nonnegative (by Cauchy–Schwarz, $|\langle x,y\rangle| \le 1$), so squaring is legitimate, and we have $$ \sqrt{2}\sqrt{1 - \operatorname{Re}\langle x,y \rangle} \geq \sqrt{1 - |\langle x,y \rangle|^2} \iff\\ 2(1 - \operatorname{Re}\langle x,y \rangle) \geq 1 - |\langle x, y \rangle|^2 \iff \\ 1 + |\langle x, y \rangle|^2 \geq 2 \operatorname{Re}\langle x,y \rangle \iff\\ |1 - \langle x, y \rangle|^2 \geq 0, $$ where the last equivalence uses $|1 - \langle x,y\rangle|^2 = 1 - 2\operatorname{Re}\langle x,y\rangle + |\langle x,y\rangle|^2$. The last inequality clearly holds, which completes the proof.
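Both the eigenvalue computation and the final inequality can be verified numerically (my own sketch; it uses the corrected value $\lambda=\sqrt{1-|\langle x,y\rangle|^2}$ derived above):

```python
import numpy as np

rng = np.random.default_rng(4)

for _ in range(100):
    # Random complex unit vectors.
    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    x /= np.linalg.norm(x)
    y /= np.linalg.norm(y)

    t = np.vdot(x, y)  # <x, y> = x* y
    A = np.outer(x, x.conj()) - np.outer(y, y.conj())  # xx* - yy*, Hermitian
    # (1/2) Tr|A| = half the sum of |eigenvalues| for Hermitian A.
    half_trace_norm = 0.5 * np.sum(np.abs(np.linalg.eigvalsh(A)))

    # Eigenvalue computation: (1/2) Tr|A| = sqrt(1 - |<x,y>|^2).
    assert np.isclose(half_trace_norm, np.sqrt(1 - abs(t) ** 2))
    # Final inequality: ||x - y|| >= (1/2) Tr|A|.
    assert np.linalg.norm(x - y) >= half_trace_norm - 1e-12
```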