Conditional expectation on a crossed product von Neumann algebra

von-neumann-algebras

Let $M$ be a von Neumann algebra on a Hilbert space $\mathcal{H}$, and let $(M,G,\alpha)$ be a $W^*$-dynamical system with $G$ discrete. The crossed product von Neumann algebra $M\rtimes_\alpha G$ is then represented on the Hilbert space $l^2(G,\mathcal{H})$. Since $G$ is discrete we can write $l^2(G,\mathcal{H})=\oplus_{g \in G} \mathcal{H}\otimes \epsilon_g$, where $\epsilon_g\in l^2(G)$ is defined by $\epsilon_g(h)=\delta_{h,g}$. Using this identification we can write any element of $\mathcal{B}(l^2(G,\mathcal{H}))$ as a $G\times G$ matrix of operators on $\mathcal{H}$.
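Just to spell out this identification (the notation $T_{g,h}$ is only for concreteness here): for $T\in\mathcal{B}(l^2(G,\mathcal{H}))$ one can define the $(g,h)$ entry $T_{g,h}\in\mathcal{B}(\mathcal{H})$ as the component of $T(\xi\otimes\epsilon_h)$ in the summand $\mathcal{H}\otimes\epsilon_g$, so that $$ T(\xi\otimes\epsilon_h)=\sum_{g\in G}(T_{g,h}\,\xi)\otimes\epsilon_g,\qquad \xi\in\mathcal{H}, $$ and composition of operators corresponds to multiplication of the matrices $[T_{g,h}]$.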

Now let $P_g$ be the orthogonal projection onto the subspace $\mathcal{H}\otimes \epsilon_g$ of $l^2(G,\mathcal{H})$. I saw a result (links given below) stating that the map $\phi: M\rtimes_\alpha G \rightarrow M$ defined by $\phi(T)=\sum_g P_gTP_g$, where the sum converges in the strong operator topology (SOT), is a conditional expectation. In terms of the matrix representation of $M\rtimes_\alpha G \subset \mathcal{B}(l^2(G,\mathcal{H}))$, I guess that the map $\phi$ sends a matrix to its diagonal part. Is that true? How does one prove it? And why does the sum in fact belong to $M$?
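As a toy case just to fix ideas (not taken from the linked papers): for $G=\mathbb{Z}/2\mathbb{Z}=\{e,g\}$ we have $l^2(G,\mathcal{H})\cong(\mathcal{H}\otimes\epsilon_e)\oplus(\mathcal{H}\otimes\epsilon_g)$, an operator $T$ becomes a $2\times 2$ block matrix, and $$ \sum_{h\in G}P_hTP_h = P_eTP_e+P_gTP_g = \begin{pmatrix} T_{e,e} & 0\\ 0 & T_{g,g}\end{pmatrix}, \qquad T=\begin{pmatrix} T_{e,e} & T_{e,g}\\ T_{g,e} & T_{g,g}\end{pmatrix}, $$ which is what I mean by "its diagonal".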

https://www.jstor.org/stable/pdf/2373237.pdf

https://projecteuclid.org/download/pdf_1/euclid.tmj/1178241528

Best Answer

You never mention how you define $M\rtimes_\alpha G$. In the direct sum picture, one defines it using the matrix units $\{E_{gh}:\ g,h\in G\}$ coming from the decomposition $l^2(G,\mathcal{H})=\oplus_{g\in G}\mathcal{H}\otimes\epsilon_g$ (so $E_{gh}$ carries $\mathcal{H}\otimes\epsilon_h$ identically onto $\mathcal{H}\otimes\epsilon_g$ and kills the other summands): $$ M\rtimes_\alpha G=\Bigg[\Big\{\sum_{g\in G}\alpha_{g^{-1}}(x)\,E_{gg}:\ x\in M\Big\}\cup\Big\{\sum_{h\in G}E_{h,g^{-1}h}:\ g\in G\Big\}\Bigg]'', $$ where $\hat x=\sum_{g\in G}\alpha_{g^{-1}}(x)\,E_{gg}$ is how we see $M$ inside $M\rtimes_\alpha G$. The elements $u_g=\sum_{h\in G}E_{h,g^{-1}h}$ are unitaries satisfying $u_gu_h=u_{gh}$, and the algebra $\hat M$ of elements $\sum_{g\in G}\alpha_{g^{-1}}(x)\,E_{gg}$ is invariant under conjugation by each $u_g$; in fact $u_g\,\hat x\,u_g^*=\widehat{\alpha_g(x)}$, the covariance relation (see the computation below).
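For completeness, here is the computation behind that last claim; it uses nothing beyond the multiplication rule $E_{ab}E_{cd}=\delta_{b,c}\,E_{ad}$ for the matrix units and the notation above. With $u_g^*=\sum_{k\in G}E_{k,gk}$, \begin{align} u_g\,\hat x\,u_g^* &=\sum_{h,s,k}\alpha_{s^{-1}}(x)\,E_{h,g^{-1}h}\,E_{ss}\,E_{k,gk} =\sum_{h,k}\alpha_{h^{-1}g}(x)\,E_{h,g^{-1}h}\,E_{k,gk}\\[0.2cm] &=\sum_{h}\alpha_{h^{-1}g}(x)\,E_{hh} =\sum_{h}\alpha_{h^{-1}}\big(\alpha_g(x)\big)\,E_{hh} =\widehat{\alpha_g(x)}, \end{align} since only the terms with $s=g^{-1}h$ and $k=g^{-1}h$ survive. The same rule gives $u_gu_h=u_{gh}$ and $u_gu_g^*=u_g^*u_g=1$.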

The expectation is indeed "compression to the diagonal" in a block sense, and the "diagonal" is $\hat M$. In this notation the projections $P_g$ are exactly the $E_{gg}$. For a typical finite combination of the generators, $$ \sum_{g\in G} E_{gg}\Big(\sum_{s\in G}\alpha_{s^{-1}}(x)\,E_{ss}+\sum_{r\in G\setminus\{e\}} c_{r}\sum_{t\in G}E_{r^{-1}t,t}\Big)E_{gg}=\sum_{s\in G}\alpha_{s^{-1}}(x)\,E_{ss}=\hat x\in\hat M, $$ because $E_{gg}E_{r^{-1}t,t}E_{gg}=0$ unless $r^{-1}t=t=g$, which forces $r=e$; so the compression kills every off-diagonal term and leaves the diagonal one untouched.
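For what it's worth, one can also verify directly the $\hat M$-bimodule property that a conditional expectation should satisfy (same notation as above): every diagonal element $\hat a\in\hat M$ commutes with each $E_{gg}$, since both $\hat a\,E_{gg}$ and $E_{gg}\,\hat a$ equal $\alpha_{g^{-1}}(a)\,E_{gg}$. Hence, for $\hat a,\hat b\in\hat M$ and $T\in M\rtimes_\alpha G$, $$ \phi(\hat a\,T\,\hat b)=\sum_{g\in G}E_{gg}\,\hat a\,T\,\hat b\,E_{gg} =\hat a\Big(\sum_{g\in G}E_{gg}\,T\,E_{gg}\Big)\hat b =\hat a\,\phi(T)\,\hat b, $$ while $\phi(\hat x)=\hat x$, $\phi(1)=1$, and $\phi$ is positive because each compression $T\mapsto E_{gg}TE_{gg}$ is positive.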


To see that $\phi$ is normal, suppose that $T_i\nearrow T$ is an increasing net of selfadjoint elements. Given a vector $x=\sum_g x_g$, where $x_g=P_g x$, \begin{align} \|\phi(T-T_i)x\|^2 &=\Big\|\sum_gP_g(T-T_i)P_g\sum_hx_h\Big\|^2\\[0.2cm] &=\Big\|\sum_gP_g(T-T_i)P_g x_g\Big\|^2\\[0.2cm] &=\sum_g\|P_g(T-T_i) x_g\|^2\\[0.2cm] &\leq\sum_g\|(T-T_i) x_g\|^2, \end{align} where the third equality holds because the vectors $P_g(T-T_i)x_g$ are pairwise orthogonal. Now because the net $T-T_i$ is uniformly bounded and $\sum_g\|x_g\|^2<\infty$, given $\varepsilon>0$ there exists a finite set $F\subset G$ such that $\sum_{g\in G\setminus F}\|(T-T_i)x_g\|^2<\varepsilon$ for every $i$. Then $$ \limsup_i\|\phi(T-T_i)x\|^2 \leq \varepsilon+\limsup_i\sum_{g\in F}\|(T-T_i)x_g\|^2=\varepsilon. $$ This shows that $\phi(T_i)\to\phi(T)$ in SOT, and so $\phi$ is normal. The same estimate also answers your last question: $\phi$ is SOT-continuous on bounded sets, and by the Kaplansky density theorem every $T\in M\rtimes_\alpha G$ is the SOT-limit of a bounded net of finite combinations of the generators; their images under $\phi$ lie in $\hat M$ by the computation above, and $\hat M$ is SOT-closed, hence $\phi(T)\in\hat M$.
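As a side remark (the same orthogonality trick, nothing new beyond the notation already introduced), this estimate also shows that $\phi$ is contractive, so it is a norm-one projection of $M\rtimes_\alpha G$ onto $\hat M$: $$ \|\phi(T)x\|^2=\Big\|\sum_g P_gTx_g\Big\|^2=\sum_g\|P_gT x_g\|^2\leq\|T\|^2\sum_g\|x_g\|^2=\|T\|^2\,\|x\|^2 . $$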
