How do you prove that the adjoint of a linear operator on an abstract inner product space is represented by the conjugate transpose of its matrix?

adjoint-operators · dual-spaces · linear-algebra · transpose

If you define the inner product $\langle u,v\rangle=u^Tv$, then the identity $\langle Au,v\rangle=\langle u,A^Tv\rangle$, where $A$ is a matrix over $\Bbb{R}$, is easily verified. Likewise, if you define a complex inner product by $\langle u,v\rangle=u^*v$, then $\langle Au,v\rangle=\langle u,A^*v\rangle$ is also easily verified, again regarding $A$ as a matrix and $^*$ as the conjugate transpose (interchange the rows and columns, then conjugate the entries).
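As a quick numerical sanity check of these two matrix identities (a sketch, assuming NumPy; `np.vdot` conjugates its first argument, matching the convention $\langle u,v\rangle=u^*v$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Complex inner product <u, v> = u* v (conjugate the first slot).
def inner(u, v):
    return np.vdot(u, v)  # np.vdot conjugates its first argument

A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u = rng.normal(size=3) + 1j * rng.normal(size=3)
v = rng.normal(size=3) + 1j * rng.normal(size=3)

# <Au, v> = (Au)* v = u* A* v = <u, A* v>, with A* the conjugate transpose.
lhs = inner(A @ u, v)
rhs = inner(u, A.conj().T @ v)
assert np.isclose(lhs, rhs)
```

The real case is the same computation with `A.T` in place of `A.conj().T` and all imaginary parts set to zero.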

I know that in the abstract theory of linear operators between inner product spaces, $T:V\to W$ satisfies $\langle Tu,w\rangle_W=\langle u,T^*w\rangle_V$ for all $u\in V$ and $w\in W$, where $T^*:W\to V$ is the adjoint operator, uniquely determined by this identity. The transpose $T^T:W^*\to V^*$ is defined by $T^T(w)=w\circ T$ for all $w\in W^*$. I use $W^*,V^*$ to denote the dual spaces.

I know that there is a way to prove that if $V,W$ are finite dimensional and $T$ is represented by some matrix $A$ (with respect to chosen bases), then $T^T$ is represented, with respect to the dual bases, by precisely $A^T$, where $A^T$ is the much more familiar (to me) operation of interchanging rows and columns. I also know that for a real matrix, $A^T$ and $A^*$ are identical.
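The fact that $T^T$ is represented by $A^T$ can be checked numerically too (a sketch, assuming NumPy, and assuming we represent a functional $w\in(\Bbb{R}^m)^*$ by the vector of its values on the standard basis, acting as $v\mapsto w\cdot v$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 3))   # matrix of T : R^3 -> R^4

# A functional w in (R^4)* acts as v |-> w @ v.
w = rng.normal(size=4)
v = rng.normal(size=3)

# (T^T w)(v) = w(Tv); in coordinates, w @ (A @ v) = (A.T @ w) @ v,
# so the matrix of T^T with respect to the dual bases is A.T.
assert np.isclose(w @ (A @ v), (A.T @ w) @ v)
```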

My question:

How does one reconcile the concrete, row-swapping definitions of $A^T$ and $A^*$ with the abstract definitions of $T^T$ and $T^*$ (where $A$ is the matrix representing $T$)? In particular, if $T$ is defined between vector spaces over $\Bbb{R}$, does it follow that the abstract transpose and adjoint agree, $T^T=T^*$, even for a general abstract linear operator (such as one on a vector space of polynomials) $T:V(\Bbb{R})\to W(\Bbb{R})$? Put another way: I know it is possible to show that $T^T$ is represented by $A^T$, but how does one show that $T^*$ is represented by $A^*$?

Best Answer

Assume $V$ and $W$ are finite-dimensional inner product spaces over $\mathbb{C}$ (or $\mathbb{R}$, but I will use $\mathbb{C}$ for definiteness) and $T \in L(V, W)$. We want to find the matrix representation of $T^* \in L(W, V)$.

Choose arbitrary orthonormal bases $B_1 = \{v_1, \dots, v_n\}$ of $V$ and $B_2 = \{w_1, \dots, w_m\}$ of $W$, and let $A = M_{B_1}^{B_2}(T)$ be the matrix representation of $T$ with respect to the bases $B_1, B_2$. With $(\cdot, \cdot)$ denoting the inner products, we have
$$Tv_j = \sum_{i = 1}^{m}(Tv_j, w_i)w_i$$
since $B_2$ is an orthonormal basis of $W$. Thus $a_{ij} = (Tv_j, w_i)$.

Similarly, let $B = M_{B_2}^{B_1}(T^*)$ denote the matrix representation of $T^*$ with respect to the bases $B_2$, $B_1$. Since $B_1$ is an orthonormal basis of $V$, we have
$$T^*w_j = \sum_{i = 1}^{n}(T^*w_j, v_i)v_i,$$
so $b_{ij} = (T^*w_j, v_i)$. Therefore
$$b_{ij} = (T^*w_j, v_i) = (w_j, Tv_i) = \overline{(Tv_i, w_j)} = \overline{a_{ji}}.$$

So $B$ is the conjugate transpose of $A$, provided $B_1$ and $B_2$ are orthonormal. If $B_1$ and $B_2$ are not orthonormal, then $B$ need not be the conjugate transpose of $A$. The same argument also shows that the adjoint exists.
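The computation above can be confirmed numerically (a sketch, assuming NumPy): build random orthonormal bases from a QR factorization, form $A$ and $B$ entrywise from the inner-product formulas $a_{ij} = (Tv_j, w_i)$ and $b_{ij} = (T^*w_j, v_i)$, and check $B = \overline{A}^T$. Here $T$ acts by a matrix $M$ in standard coordinates, so its adjoint acts by $M^*$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 4

def orthonormal_basis(k):
    # Columns of Q from a QR factorization form an orthonormal basis of C^k.
    Q, _ = np.linalg.qr(rng.normal(size=(k, k)) + 1j * rng.normal(size=(k, k)))
    return Q

Q1, Q2 = orthonormal_basis(n), orthonormal_basis(m)  # bases B1 of V, B2 of W
M = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))  # T in standard coords

# Inner product (x, y) = y* x: linear in the first slot, as in the answer.
inner = lambda x, y: np.vdot(y, x)

# a_ij = (T v_j, w_i); b_ij = (T* w_j, v_i), with T* = M* in standard coords.
A = np.array([[inner(M @ Q1[:, j], Q2[:, i]) for j in range(n)] for i in range(m)])
B = np.array([[inner(M.conj().T @ Q2[:, j], Q1[:, i]) for j in range(m)] for i in range(n)])

# B is the conjugate transpose of A.
assert np.allclose(B, A.conj().T)
```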

Edit: Every finite-dimensional inner product space $V$ has an orthonormal basis. By definition of finite-dimensional, $V$ has a basis $\{v_1, \dots, v_n\}$, which can be turned into an orthonormal basis $\{e_1, \dots, e_n\}$ using the Gram-Schmidt procedure. Infinite-dimensional vector spaces are those that are not finite-dimensional, i.e. they have no finite basis. The study of infinite-dimensional spaces requires significantly more analysis, and not all results from finite dimensions carry over.
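The Gram-Schmidt procedure mentioned here can be sketched in a few lines (assuming NumPy; `gram_schmidt` is an illustrative name, not a library function):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: turn a basis into an orthonormal basis."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors already orthonormalized.
        w = v - sum(np.vdot(e, v) * e for e in basis)
        basis.append(w / np.linalg.norm(w))  # normalize
    return basis

# Example: orthonormalize a basis of R^3.
vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
es = gram_schmidt(vs)

# The Gram matrix of the result is the identity: the e_i are orthonormal.
G = np.array([[np.vdot(a, b) for b in es] for a in es])
assert np.allclose(G, np.eye(3))
```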