[Math] The ring of upper triangular matrices is isomorphic to the ring of lower triangular matrices.

abstract-algebra, matrices, ring-theory

Let $R=M(2,\mathbb Q)$, the ring of all $2\times 2$ matrices with rational entries.

I have a function $f:A \rightarrow B$, where $A$ is the subring of upper triangular matrices and $B$ is the subring of lower triangular matrices.

So $A$ is the set of all $2\times 2$ upper triangular matrices, i.e. $$x = \begin{pmatrix} a & b \\ 0 & c \\ \end{pmatrix}$$
and $B$ is the set of all $2\times 2$ lower triangular matrices, i.e. $$y = \begin{pmatrix} d & 0 \\ e & f \\ \end{pmatrix}.$$

I need to show that $f$ is a ring homomorphism and that it is bijective, but I am stuck: in particular, I cannot get $f(xy)=f(x)f(y)$ to work out.

Any help would be great.

Best Answer

The point is, you have not given your intended map $f$.

Here is one that works: $$ f \left( \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \right) = \begin{bmatrix} b & 0 \\ c & a \\ \end{bmatrix}. $$ Now just compute directly to check that $f(x y) = f(x) f(y)$.
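If you want to sanity-check the claim before doing the hand computation, here is a quick symbolic verification with SymPy (a sketch, not part of the proof itself) that conjugation by the swap matrix is additive, multiplicative, and lands in the lower triangular matrices:

```python
from sympy import symbols, Matrix, simplify

a, b, c, d, e, f_ = symbols('a b c d e f')

# J is the permutation matrix swapping the two coordinates; note J**2 = I.
J = Matrix([[0, 1], [1, 0]])

def f(X):
    """The candidate isomorphism: conjugation by J."""
    return J * X * J

X = Matrix([[a, c], [0, b]])   # an arbitrary upper triangular matrix
Y = Matrix([[d, e], [0, f_]])  # another upper triangular matrix

# f is multiplicative and additive on upper triangular matrices:
assert simplify(f(X * Y) - f(X) * f(Y)) == Matrix.zeros(2, 2)
assert simplify(f(X + Y) - (f(X) + f(Y))) == Matrix.zeros(2, 2)

# The image is lower triangular, as required:
assert f(X).is_lower
```

Since the entries here are symbolic, the assertions verify the identities for all rational (indeed all commutative-ring) entries at once.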

Alternatively, save some time and effort by noting that $$ f \left( \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \right) = \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \cdot \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \cdot \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} $$ and $$ \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix}^{2} = I. $$


Explicitly, $$\begin{aligned} f \left( \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \right) f \left( \begin{bmatrix} d & e \\ 0 & f \\ \end{bmatrix} \right) &= \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \cdot \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \begin{bmatrix} d & e \\ 0 & f \\ \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \\ &= \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \begin{bmatrix} d & e \\ 0 & f \\ \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix} = f \left( \begin{bmatrix} a & c \\ 0 & b \\ \end{bmatrix} \begin{bmatrix} d & e \\ 0 & f \\ \end{bmatrix} \right). \end{aligned}$$

Additivity is immediate since matrix multiplication distributes over addition, and bijectivity follows from $\begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix}^{2} = I$: this gives $f \circ f = \mathrm{id}$, so $f$ is its own inverse.