[Physics] Kraus operator rank

homework-and-exercises, linear-algebra, operators, quantum-information

All quantum operations $\mathcal{E}$ on a system of Hilbert space dimension $d$ can
be generated by an operator-sum representation containing at most $d^2$ elements. More generally, an operation from a space of dimension $m$ to a space of dimension $n$ has an operator-sum representation with at most $mn$ Kraus operators. Refer: http://en.wikipedia.org/wiki/Quantum_operation#Kraus_operators
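This can be checked numerically. The sketch below (illustrative only; the variable names and the random construction are my own, not from the exercise) deliberately builds a trace-preserving channel on a $d$-dimensional system from $N > d^2$ Kraus operators, then extracts an equivalent representation with at most $d^2$ operators by diagonalizing the channel's Choi-style matrix:

```python
import numpy as np

# Illustrative sketch: a channel built from N > d^2 Kraus operators always
# admits an equivalent representation with at most d^2 of them.
rng = np.random.default_rng(0)
d, N = 2, 7

# Random operators, normalised so that sum_k E_k^† E_k = I (trace preserving).
A = [rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)) for _ in range(N)]
S = sum(a.conj().T @ a for a in A)
w, V = np.linalg.eigh(S)
S_inv_sqrt = V @ np.diag(w**-0.5) @ V.conj().T
E = [a @ S_inv_sqrt for a in A]

# J = sum_k vec(E_k) vec(E_k)^† is only d^2 x d^2, so its eigendecomposition
# yields at most d^2 Kraus operators implementing the same channel.
J = sum(np.outer(e.reshape(-1), e.reshape(-1).conj()) for e in E)
lam, U = np.linalg.eigh(J)
F = [np.sqrt(l) * U[:, i].reshape(d, d) for i, l in enumerate(lam) if l > 1e-12]

# Both Kraus sets act identically on a test density matrix.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
out_E = sum(e @ rho @ e.conj().T for e in E)
out_F = sum(f @ rho @ f.conj().T for f in F)
assert len(F) <= d**2
assert np.allclose(out_E, out_F)
```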

The follow-up proof is set as an exercise in Nielsen and Chuang's book, Exercise 8.10:

Proving the first part is easy enough: $W_{jk} = W_{kj}^{*}$ follows by expanding the definition of $W_{jk}$ and using the properties of the conjugate transpose, so $W$ is in fact Hermitian. What I'm not able to prove is that its rank is at most $d^2$.
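The Hermiticity claim is quick to verify numerically. Here is a sketch with random matrices standing in for the operation elements $E_j$ (illustrative only; Hermiticity of $W$ does not depend on trace preservation):

```python
import numpy as np

# Random E_j stand in for actual operation elements.
rng = np.random.default_rng(1)
d, N = 3, 5
E = [rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)) for _ in range(N)]

# W_{jk} = tr(E_j^† E_k): the Gram matrix of the E_j under the trace inner product.
W = np.array([[np.trace(Ej.conj().T @ Ek) for Ek in E] for Ej in E])

assert np.allclose(W, W.conj().T)  # W_{jk} = W_{kj}^*, i.e. W is Hermitian
```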

I read on Wikipedia about the properties of the rank of a matrix, and it says that a matrix of rank $M$ can be expressed as a sum of $M$ rank-1 matrices. In that case I would need to prove that the individual terms in the sum:

$$ \sum_{j,k} \operatorname{tr}(E_j^{\dagger}E_k)\,|j\rangle \langle k| $$

are of rank 1. Since $\operatorname{tr}(E_j^{\dagger}E_k)$ is a scalar, and $|j\rangle$ and $\langle k|$ are basis vectors of the input and output spaces, there can be $d$ choices each of $|j\rangle$ and $|k\rangle$, and each outer product $|j\rangle\langle k|$ is a rank-1 matrix.

Hence there would be at most $d \cdot d = d^2$ such terms, if all the coefficients $\operatorname{tr}(E_j^{\dagger}E_k)$ are non-zero.

Is that the correct proof? Am I making a mistake somewhere?

The number of such terms is also called the Kraus rank.

Best Answer

Since there still isn't an answer but the question has attracted a few upvotes, let me elaborate on my comment. This is more maths than physics, but anyway.

Writing $W = \sum_{jk} \operatorname{tr}(E_j^{\dagger}E_k)|j\rangle\langle k|$ doesn't give you anything. This is indeed a rank-one decomposition, but the theorem does not tell you that ANY rank-1 decomposition has at most $d^2$ terms. This would be true if $|j\rangle$ were a basis of $\mathbb{C}^d$ - but it is not. $W$ is an $N\times N$ matrix, hence $|j\rangle$ ranges over a basis of $\mathbb{C}^N$, and $N$ could be much larger than $d^2$.

However, what is true is that $E_j\in\mathbb{C}^{d\times d}$. The key observation is that at most $d^2$ of the $E_j$ can therefore be linearly independent, and this entails that $W$ can only be of rank at most $d^2$. Here is a proof (albeit not that pretty):

There is a basis of $\mathbb{C}^{d\times d}$ with $d^2$ elements, call it $\{F_i\}$, which is orthonormal with respect to the trace inner product. Now, since the $F_i$ form a basis, for every $E_j$ we have:

$$ E_j=\sum_i a_i^{(j)} F_i \qquad a_i^{(j)}\in\mathbb{C}\quad \forall\,j\in \{1,\ldots,N\}$$
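As an aside, one concrete choice of such a basis (my choice here, not fixed by the argument) is the set of matrix units $|a\rangle\langle b|$, which are orthonormal under $\langle X, Y\rangle = \operatorname{tr}(X^{\dagger} Y)$:

```python
import numpy as np

d = 2
# The d^2 matrix units |a><b| form an orthonormal basis of C^{d x d}
# with respect to the trace inner product <X, Y> = tr(X^† Y).
F = []
for a in range(d):
    for b in range(d):
        M = np.zeros((d, d), dtype=complex)
        M[a, b] = 1.0
        F.append(M)

gram = np.array([[np.trace(Fi.conj().T @ Fj) for Fj in F] for Fi in F])
assert np.allclose(gram, np.eye(d**2))  # Gram matrix is the identity
```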

So the $E_j$ are linear combinations of the $F_i$. But then we can re-express the entries of $W$ in terms of the coefficients $a_i^{(j)}$ and obtain:

$$ \operatorname{tr}(E_i^{\dagger}E_j)=\sum_{k=1}^{d^2} {a_k^{(i)}}^* a_k^{(j)} $$

By definition of a basis, at most $d^2$ of the $E_i$ can be linearly independent. Without loss of generality, suppose the first $d^2$ of the $E_i$ are linearly independent. Let's then have a look at the $(d^2+1)$-th column of $W$. Since $E_{d^2+1}$ is linearly dependent on the first $d^2$, its coefficients $a_k^{(d^2+1)}$ are linear combinations of the other $a_k^{(j)}$, say

$$ a^{(d^2+1)}_k=\sum_{j=1}^{d^2} b_ja_k^{(j)}$$

But then, we can see that

$$ \operatorname{tr}(E_i^{\dagger}E_{d^2+1})=\sum_j b_j \operatorname{tr}(E_i^{\dagger}E_j) $$

hence the whole column is a linear combination of the preceding columns, and the same holds for every later column. Putting everything together, $W$ can have at most $d^2$ linearly independent columns, i.e. rank at most $d^2$. We can then diagonalize $W$, since it is Hermitian, and proceed as indicated.
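The conclusion is easy to sanity-check numerically: with $N > d^2$ random operators $E_j$, the Gram matrix $W$ is $N \times N$, yet its rank never exceeds $d^2$ (a sketch with random matrices, illustrative only):

```python
import numpy as np

# N = 9 operators in C^{d x d} with d^2 = 4: at most 4 can be linearly
# independent, so rank(W) <= d^2 even though W is 9 x 9.
rng = np.random.default_rng(2)
d, N = 2, 9
E = [rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d)) for _ in range(N)]

W = np.array([[np.trace(Ej.conj().T @ Ek) for Ek in E] for Ej in E])

assert W.shape == (N, N)
assert np.linalg.matrix_rank(W) <= d**2
```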
