A simple proof of this equality: $\det_{\mu\nu}\left(\frac 1 2 \text{Tr}\left[A\sigma_\mu A^\dagger \sigma_\nu \right]\right)=1$

linear-algebra, mathematical-physics, physics

Question

Let $A\in \text{SL}(2,\mathbb{C})$, so $\det(A)=1$. Define the following (Pauli) matrices:

$$\begin{align}
\sigma_0=\begin{pmatrix}-1 & 0 \\ 0 & -1 \end{pmatrix} & &\sigma_1=\begin{pmatrix}0 & 1 \\ 1 & 0 \end{pmatrix} & \\
&\\
\sigma_2=\begin{pmatrix}0 & -i \\ i & 0 \end{pmatrix} & &\sigma_3=\begin{pmatrix}1 & 0 \\ 0 & -1 \end{pmatrix} &
\end{align}
$$

Now define the following $4\times 4$ matrix:

$$L_{\mu\nu}\equiv \frac{1}{2}\text{Tr}\left[A\sigma_\mu A^\dagger \sigma_\nu\right]$$

What I am trying to prove is $\det(L)=1$; that's it. But I am having a lot of trouble with it. I have verified that it is true in Mathematica (code below). In fact, the following is known to hold exactly:

$$\det (L)=\left|\det(A)\right|^4=1$$

Actually, it would be sufficient for my current purposes to show $\det(L)\geq 0$, but even that I find very hard to prove.


Mathematica Code.

ClearAll[s0, s1, s2, s3, s, A];

(* Pauli matrices, with the sign convention used above, and a general 2x2 matrix A *)
s0 = {{-1, 0}, {0, -1}};
s1 = {{0, 1}, {1, 0}};
s2 = {{0, -I}, {I, 0}};
s3 = {{1, 0}, {0, -1}};
s = {s0, s1, s2, s3};
A = {{a, b}, {c, d}};

(* Build L entry by entry from L_{mu,nu} = 1/2 Tr[A . s_mu . A^dagger . s_nu] *)
L = Table[1/2 Tr[A.s[[mu]].ConjugateTranspose[A].s[[nu]]], {mu, 4}, {nu, 4}];

(* Evaluate the determinant explicitly and put it in legible form *)
TraditionalForm[FullSimplify[Det[L]]]

Best Answer

This can be proven via vectorization. We define the vectorization of a 2-by-2 matrix as the vector obtained by stacking its columns:

$$ \mathrm{vec}(A) = \left(\begin{matrix}A_{11}\\A_{21}\\A_{12}\\A_{22}\end{matrix}\right), $$

which maps any 2-by-2 complex matrix to a vector in $\mathbb{C}^4$. It also maps the Frobenius inner product into a dot product,

$$ \langle A, B\rangle = \mathrm{tr}(A^\dagger B) = \sum_{i,j=1}^2 A^*_{ij}B_{ij} = \mathrm{vec}(A)^\dagger \mathrm{vec}(B),$$

and there is a nice relationship between the vectorization of a product of matrices and the Kronecker product. For complex $n$-by-$n$ matrices $A,B,C$:

$$ \mathrm{vec}(ABC) = \left(C^\mathrm{T}\otimes A\right) \mathrm{vec}(B). $$
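Both facts are easy to spot-check numerically. The following is a small Mathematica sketch, where the helper vec and the random test matrices X, Y, Z are just illustrative names:

ClearAll[vec, X, Y, Z];
vec[m_] := Flatten[Transpose[m]];  (* column-stacking vectorization *)
{X, Y, Z} = RandomComplex[{-1 - I, 1 + I}, {3, 2, 2}];

(* Frobenius inner product tr(X^dagger Y) versus vec(X)^dagger . vec(Y) *)
Chop[Tr[ConjugateTranspose[X].Y] - Conjugate[vec[X]].vec[Y]]  (* 0 *)

(* vec(X.Y.Z) versus (Z^T ⊗ X) . vec(Y) *)
Chop[vec[X.Y.Z] - KroneckerProduct[Transpose[Z], X].vec[Y]]  (* {0, 0, 0, 0} *)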

Starting from the definition of $L$ (with $\tau_\mu = \sigma_\mu/\sqrt{2}$ such that $\langle \tau_\alpha, \tau_\beta\rangle = \delta_{\alpha\beta}$):

\begin{align*}
L_{\mu\nu} &= \mathrm{tr}(A\tau_\mu A^\dagger \tau_\nu)\\
&= \langle \tau_\mu, A^\dagger\tau_\nu A \rangle\\
&= \mathrm{vec}(\tau_\mu)^\dagger \mathrm{vec}\left(A^\dagger\tau_\nu A\right)\\
&= \mathrm{vec}(\tau_\mu)^\dagger\left(A^\mathrm{T}\otimes A^\dagger\right)\mathrm{vec}\left(\tau_\nu\right),
\end{align*}

where the second line uses the hermiticity of $\tau_\mu$ and the cyclicity of the trace, $\mathrm{tr}(A\tau_\mu A^\dagger\tau_\nu) = \mathrm{tr}(\tau_\mu^\dagger A^\dagger\tau_\nu A) = \langle \tau_\mu, A^\dagger\tau_\nu A\rangle$, and the last line applies the product identity above with $B=\tau_\nu$ and $C=A$.
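The last line of the chain can also be checked numerically. The sketch below (the helper vec, the random test matrix A, and the comparison array R are purely illustrative) rebuilds L from the question's definition and compares it entry by entry with $\mathrm{vec}(\tau_\mu)^\dagger\left(A^\mathrm{T}\otimes A^\dagger\right)\mathrm{vec}(\tau_\nu)$:

ClearAll[vec, s, tau, A, L, R];
vec[m_] := Flatten[Transpose[m]];  (* stack columns *)
s = {{{-1, 0}, {0, -1}}, {{0, 1}, {1, 0}}, {{0, -I}, {I, 0}}, {{1, 0}, {0, -1}}};
tau = s/Sqrt[2];  (* orthonormal basis tau_mu *)
A = RandomComplex[{-1 - I, 1 + I}, {2, 2}];

(* L as defined in the question *)
L = Table[1/2 Tr[A.s[[mu]].ConjugateTranspose[A].s[[nu]]], {mu, 4}, {nu, 4}];

(* vec(tau_mu)^dagger . (A^T ⊗ A^dagger) . vec(tau_nu) *)
R = Table[Conjugate[vec[tau[[mu]]]].KroneckerProduct[Transpose[A], ConjugateTranspose[A]].vec[tau[[nu]]], {mu, 4}, {nu, 4}];

Chop[L - R]  (* a 4x4 array of zeros *)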

The derivation shows that the $L_{\mu\nu}$ are precisely the matrix elements of $A^\mathrm{T}\otimes A^\dagger$ with respect to the orthonormal basis $\{\mathrm{vec}(\tau_\alpha)\}$. The determinant is therefore given by:

\begin{align*} \mathrm{det}(L) &= \mathrm{det}\left(A^\mathrm{T}\otimes A^\dagger\right) = \mathrm{det}(A^\mathrm{T})^2\mathrm{det}(A^\dagger)^2 = |\mathrm{det}(A)|^4. \end{align*}

Here we have used $\mathrm{det}(A\otimes B) = \mathrm{det}(A)^n\,\mathrm{det}(B)^m$ for an $m$-by-$m$ matrix $A$ and an $n$-by-$n$ matrix $B$. Since $A\in \mathrm{SL}(2,\mathbb{C})$ has $\det(A)=1$, this gives $\det(L)=1$.
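Both of these last steps are also easy to confirm numerically. In the sketch below, the random matrices X and Y (with $m=3$, $n=2$) illustrate the Kronecker determinant identity, and a random 2-by-2 matrix A illustrates $\det(L)=|\det(A)|^4$; all names are illustrative:

ClearAll[s, A, L, X, Y];
s = {{{-1, 0}, {0, -1}}, {{0, 1}, {1, 0}}, {{0, -I}, {I, 0}}, {{1, 0}, {0, -1}}};
A = RandomComplex[{-1 - I, 1 + I}, {2, 2}];
L = Table[1/2 Tr[A.s[[mu]].ConjugateTranspose[A].s[[nu]]], {mu, 4}, {nu, 4}];

(* det(X ⊗ Y) = det(X)^n det(Y)^m for X m-by-m and Y n-by-n (here m = 3, n = 2) *)
X = RandomComplex[{-1 - I, 1 + I}, {3, 3}];
Y = RandomComplex[{-1 - I, 1 + I}, {2, 2}];
Chop[Det[KroneckerProduct[X, Y]] - Det[X]^2 Det[Y]^3]  (* 0 *)

(* det(L) = det(A^T ⊗ A^dagger) = |det A|^4 for a generic A *)
Chop[{Det[L] - Abs[Det[A]]^4,
      Det[KroneckerProduct[Transpose[A], ConjugateTranspose[A]]] - Abs[Det[A]]^4}]  (* {0, 0} *)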