[Math] Show that a unitary operator is of the form exp(iA)

operator-theory

This is an exercise from chapter 2 in Conway's "A Course in Operator Theory":

Show that every unitary operator on a Hilbert space can be written as $U=\exp(iA)$ for some Hermitian $A$.

I tried to use $A=(U+U^*)/2$ but that didn't seem to be going anywhere.
I'd appreciate any help.

Best Answer

Hint: apply the spectral theorem. This will allow you to show that a logarithm of $U$ exists and that, when diagonalized, it is purely imaginary.
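In finite dimensions the hint is easy to sanity-check numerically. Here is a minimal sketch (assuming `numpy` and `scipy` are available): take a random unitary $U$, observe that its eigenvalues lie on the unit circle, and recover a Hermitian $A$ with $U=e^{iA}$ from the principal matrix logarithm.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(0)

# A random unitary, obtained as the Q factor in the QR
# factorization of a random complex matrix.
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(X)

# Its eigenvalues lie on the unit circle...
eigvals = np.linalg.eigvals(U)
assert np.allclose(np.abs(eigvals), 1.0)

# ...so the principal logarithm of U has purely imaginary
# eigenvalues, and A = -i log(U) is Hermitian with U = exp(iA).
A = -1j * logm(U)
assert np.allclose(A, A.conj().T)
assert np.allclose(expm(1j * A), U)
```

Of course this only illustrates the finite-dimensional (matrix) case; the point of the exercise is to make the same diagonalization argument work via the spectral theorem on an arbitrary Hilbert space.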


In response to a comment, here is how the spectral theorem allows us to show that a logarithm of $U$ exists.

There are two ways to define the logarithm of an operator $A$: either by a power series (in which case it is only defined when $\|I-A\|<1$, but is defined uniquely), or simply as any operator $B$ such that $e^{B}=A$. Even on a one-dimensional Hilbert space there is no $\log 0$, and under the second definition $\log$ is not a single-valued function where it exists. In higher dimensions it is easy to find the log of a diagonalizable operator: as long as all of the eigenvalues of $A$ are non-zero, we define $\log A$ on eigenvectors by $Ax=\lambda x \Rightarrow (\log A) x = (\log \lambda)x$. This is what I meant in my hint by being able to show that $\log$ exists. The reason I mention needing existence is that things can get more complicated.
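The eigenvector definition can be sketched concretely (again assuming `numpy`/`scipy`): for a diagonalizable $A$ with non-zero eigenvalues, set $(\log A)x = (\log\lambda)x$ on each eigenvector, i.e. $\log A = V \operatorname{diag}(\log\lambda)\, V^{-1}$.

```python
import numpy as np
from scipy.linalg import expm

# A diagonalizable (non-normal) matrix with eigenvalues 6 and 1.
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

# Define log A on eigenvectors: log A = V diag(log lam) V^{-1}.
lam, V = np.linalg.eig(A)
logA = V @ np.diag(np.log(lam.astype(complex))) @ np.linalg.inv(V)

# exp undoes this log, as it should.
assert np.allclose(expm(logA), A)
```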

In the finite-dimensional case, we can actually compute $\log A$ for every $A\in GL_n(\mathbb C)$ as follows. First, observe that the matrix exponential commutes with conjugation and that the exponential of a block-diagonal matrix is still block diagonal, so it suffices to take the logarithm of Jordan blocks. Writing an invertible Jordan block as $\lambda (I+N)$ where $N$ is nilpotent, we can take $(\log \lambda)I + \log(I+N)$, where the second term in the sum exists because the power series defining $\log(I+N)$ is a finite sum.

However, in the infinite-dimensional case it isn't immediately clear that invertibility is enough. Even if we could write an operator as a direct sum of operators of the form $\lambda I + N$ where $N$ is locally nilpotent, so that the above argument shows $\log(I+N)$ exists as a linear map, it's not clear to me that it would necessarily be bounded. Furthermore, we can get more complicated operators than that.

As an example of the weird behavior you can get, let us try to find the log of $I-L$ and $I-R$, where $L$ and $R$ are the left and right shift maps on sequences. If we use power series, we are trying to compute $-\sum \frac{L^n}{n}$ and $-\sum \frac{R^n}{n}$. To get a feel for what is going on, we can consider these as maps on $\mathbb R^{\mathbb N}$, on $\ell^{\infty}$, and finally on $\ell^2$, the case we care about. On $\mathbb R^{\mathbb N}$, the sum for $\log(I-L)$ need not converge when evaluated on a sequence, and on $\ell^{\infty}$, neither $\log(I-L)$ nor $\log(I-R)$ is bounded. Do they become bounded on $\ell^2$? I honestly don't know (though if someone does, please comment, especially if there is an easy argument). I can only imagine that the general case is even more complicated.
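One can at least experiment with finite truncations (a sketch, `numpy` assumed; a finite computation of course settles nothing either way): on the $n\times n$ truncated right shift, compute the partial sums of $-\sum_k R^k/k$ and watch how their operator norms behave as $n$ grows.

```python
import numpy as np

def truncated_log_norm(n):
    """Operator norm of -sum_{k<n} R^k / k for the n x n truncated shift."""
    R = np.diag(np.ones(n - 1), -1)  # right shift on R^n: e_j -> e_{j+1}
    S = np.zeros((n, n))
    P = np.eye(n)
    for k in range(1, n):            # R^n = 0 on the truncation
        P = P @ R
        S -= P / k
    return np.linalg.norm(S, 2)      # spectral (operator) norm

norms = [truncated_log_norm(n) for n in (8, 32, 128)]
# The norms keep growing with n (slowly), which at least does not
# contradict the worry about unboundedness on l^2.
assert norms[0] < norms[1] < norms[2]
```

(Each truncation embeds in the next as a corner block, so the norms are monotone; the question is whether they stay bounded, which this experiment cannot decide.)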
