A few years ago, I did research in quantum mechanics, specifically dealing with generalized displacement operators. In such musings, the Baker-Campbell-Hausdorff (BCH) formula lights the way (or gets in it, depending on your viewpoint). A question that struck me today was: does $\exp(A+B) = \exp(A)\exp(B)$ hold if and only if $A$ and $B$ commute? Clearly, if they do commute, this is true, but I have not seen anything detailing the opposite direction. Given the complexity of BCH, I suspect the converse, if true, is not simple to prove. I've thought about it on my own but haven't been able to come to a conclusion one way or the other.
Operator Theory – Understanding $\exp(A+B)$ and Baker-Campbell-Hausdorff
operator-theory
Related Solutions
Using the series definition of exponential:
$$ e^{iG\lambda}A e^{-iG\lambda} = \sum_{p=0}^\infty\frac{(iG\lambda)^p}{p!}A\sum_{q=0}^\infty\frac{(-iG\lambda)^q}{q!} = \sum_{p=0}^\infty\sum_{q=0}^\infty(-1)^q\frac{(i\lambda)^{p+q}}{p!q!}G^pAG^q=\\ \sum_{s=0}^\infty\sum_{d=0}^s(-1)^d\frac{(i\lambda)^s}{d!(s-d)!}G^{s-d}AG^d=\\ A+i\lambda[G,A]+\frac{(i\lambda)^2}{2!}[G,[G,A]]+\ldots+\frac{(i\lambda)^n}{n!}\sum_{k=0}^n(-1)^k \binom{n}{k}G^{n-k}AG^k+\ldots $$ (in the second step we set $s=p+q$ and $d=q$). So we are left with the following relation to verify, which would prove the statement: $$ \mathscr{F}(n): \sum_{k=0}^n(-1)^k \binom{n}{k}G^{n-k}AG^k=\underbrace{[G,[G,[G,\ldots[G}_{n\ times},A]]]\ldots]. $$ Proceeding by induction: the first terms shown above agree with the formula, so it remains to show that if $\mathscr{F}(n)$ holds, then $\mathscr{F}(n+1)$ holds as well.
To do this we exploit: $$ \underbrace{[G,[G,[G,\ldots[G}_{n+1\ times},A]]]]\ldots] = \underbrace{[G,[G,[G,\ldots[G}_{n\ times},[G,A]]]\ldots] $$
Then substituting $\mathscr{F}(n)$ yields: $$ \underbrace{[G,[G,[G,\ldots[G}_{n+1\ times},A]]]]\ldots]= \sum_{k=0}^n(-1)^k \binom{n}{k}G^{n-k}(GA-AG)G^k =\\ \sum_{k=0}^n(-1)^k \binom{n}{k}G^{n+1-k}AG^{k}-\sum_{k=0}^n(-1)^k \binom{n}{k}G^{n-k}AG^{k+1}=\\ G^{n+1}A+\sum_{k=1}^n(-1)^k \binom{n}{k}G^{n+1-k}AG^{k}-\sum_{k'=1}^{n}(-1)^{k'-1} \binom{n}{k'-1}G^{n+1-k'}AG^{k'}+(-1)^{n+1}AG^{n+1} $$ where in the last step we shifted the summation index in the second sum, and pulled out the first term of the first sum and the last term of the second. Now, by Pascal's rule: $$ \binom{n}{k}+\binom{n}{k-1} = \binom{n+1}{k}, $$ which gives $$ \ldots=G^{n+1}A + \sum_{k=1}^n(-1)^k \binom{n+1}{k}G^{n+1-k}AG^{k} + (-1)^{n+1}AG^{n+1}= \sum_{k=0}^{n+1}(-1)^k \binom{n+1}{k}G^{n+1-k}AG^{k}.$$
And therefore $\mathscr{F}(n+1)$ holds, completing the induction.
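The identity $\mathscr{F}(n)$ can also be spot-checked numerically; here is a minimal sketch with NumPy and random matrices (the helper names `nested_commutator` and `binomial_form` are mine, chosen for illustration):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
# Random 4x4 complex matrices standing in for G and A (purely illustrative).
G = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

def nested_commutator(G, A, n):
    """[G,[G,...,[G, A]...]] with n applications of ad_G."""
    for _ in range(n):
        A = G @ A - A @ G
    return A

def binomial_form(G, A, n):
    """sum_{k=0}^{n} (-1)^k C(n,k) G^(n-k) A G^k."""
    return sum((-1) ** k * comb(n, k)
               * np.linalg.matrix_power(G, n - k) @ A @ np.linalg.matrix_power(G, k)
               for k in range(n + 1))

# The two sides of F(n) agree for every n tested.
for n in range(6):
    assert np.allclose(nested_commutator(G, A, n), binomial_form(G, A, n))
```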
Hint: apply the spectral theorem. This will allow you to show that the logarithm exists and that, when diagonalized, it is purely imaginary.
In response to a comment, here is how the spectral theorem allows us to show that a logarithm of $U$ exists.
There are two ways to define the logarithm of an operator $A$: either with a power series (in which case it is defined only when $\|I-A\|<1$, but is defined uniquely), or simply as an operator $B$ such that $e^{B}=A$. Even for the one-dimensional Hilbert space, there is no $\log 0$, and under the second definition, $\log$ is not a single-valued function where it exists. In higher dimensions, it is easy to find the log of a diagonalized operator: as long as all of the eigenvalues of $A$ are non-zero, we define $\log A$ on eigenvectors by $Ax=\lambda x \Rightarrow (\log A) x = (\log \lambda)x$. This is what I meant by being able to show that $\log$ exists in my hint. The reason I stress existence is that things can get more complicated.
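A quick numerical sketch of this eigenvector recipe (the function name is mine; it assumes a diagonalizable input with nonzero eigenvalues), using a rotation matrix to illustrate the pure-imaginary spectrum mentioned in the hint:

```python
import numpy as np

def log_via_diagonalization(A):
    """A logarithm of a diagonalizable matrix with nonzero eigenvalues:
    write A = V diag(lam) V^{-1} and take the (principal) log of each eigenvalue."""
    eigvals, V = np.linalg.eig(A)
    return V @ np.diag(np.log(eigvals.astype(complex))) @ np.linalg.inv(V)

# A unitary example: the log of a rotation by theta is the skew-symmetric
# generator [[0, -theta], [theta, 0]], whose eigenvalues +/- i*theta are
# purely imaginary.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = log_via_diagonalization(U)
assert np.allclose(B, np.array([[0.0, -theta], [theta, 0.0]]))
```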
In the finite-dimensional case, we can actually compute $\log A$ for all $A\in GL_n(\mathbb C)$ as follows. First, we observe that matrix exponentials commute with conjugation, and the exponential of a block diagonal matrix is still block diagonal, so it suffices to take the logarithm of Jordan blocks. Writing an invertible Jordan block as $\lambda (I+N)$ where $N$ is nilpotent, we can take $(\log \lambda)I + \log(I+N)$, where the second term exists because the power series defining $\log$ becomes a finite sum.
However, in the infinite-dimensional case, it isn't immediately clear that invertibility is enough. Even if we could write an operator as a direct sum of operators of the form $\lambda I + N$ where $N$ is locally nilpotent, so that the above argument shows $\log (I+N)$ exists as a linear map, it's not clear to me that it would necessarily be bounded. Furthermore, we can get more complicated maps than that.
As an example of the weird behavior you can get, let us try to find the log of $I-L$ and $I-R$ where $L$ and $R$ are the left and right shift maps on sequences. If we use power series, we are trying to compute $\sum \frac{L^n}{n}$ and $\sum \frac{R^n}{n}$ (up to an overall sign). To get a feel for what is going on, we can consider these as maps on $\mathbb R^{\mathbb N}$, $\ell^{\infty}$, and then finally on $\ell^2$, the case we care about. On $\mathbb R^{\mathbb N}$, the sum for $\log(I-L)$ need not converge when evaluated on a sequence, and on $\ell^{\infty}$, neither $\log(I-L)$ nor $\log(I-R)$ is bounded. Do they become bounded on $\ell^2$? I honestly don't know (though if someone does, please comment, especially if there is an easy argument). I can only imagine that the general case is even more complicated.
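For what it's worth, one crude numerical probe (not an answer to the question): truncate $\sum_{n\ge 1} R^n/n$ to a $K\times K$ matrix and track its largest singular value as $K$ grows. The truncations are nested principal submatrices, so their norms are non-decreasing, and each is a lower bound for the operator norm on $\ell^2$ (if that norm is finite at all). The helper name below is mine:

```python
import numpy as np

def truncated_log_norm(K):
    """Largest singular value of the K x K truncation of sum_{n>=1} R^n / n,
    where R is the right shift, so R^n puts 1/n on the n-th subdiagonal."""
    M = np.zeros((K, K))
    for n in range(1, K):
        M += np.diag(np.full(K - n, 1.0 / n), -n)
    return np.linalg.norm(M, 2)

# Nested principal submatrices: the truncated norms can only increase with K.
norms = [truncated_log_norm(K) for K in (25, 100, 400)]
assert norms[0] <= norms[1] <= norms[2]
```

Any growth here is slow, so an experiment like this can only suggest unboundedness, not settle it.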
Best Answer
Take $A=\begin{pmatrix}i\pi & 0 \\ 0 & - i\pi\end{pmatrix}$ and $B=\begin{pmatrix}i\pi & 1 \\ 0 & - i\pi\end{pmatrix}$. Then $A$ and $B$ do not commute ($[A,B]=\begin{pmatrix}0 & 2i\pi \\ 0 & 0\end{pmatrix}$), yet $\exp(A)\,\exp(B)=\exp(A+B)$: indeed $\exp(A)=\exp(B)=-I$ and $\exp(A+B)=I$.
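The counterexample is easy to check numerically with SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1j * np.pi, 0], [0, -1j * np.pi]])
B = np.array([[1j * np.pi, 1], [0, -1j * np.pi]])

assert not np.allclose(A @ B, B @ A)                  # A and B do not commute...
assert np.allclose(expm(A) @ expm(B), expm(A + B))    # ...yet the exponentials agree
assert np.allclose(expm(A), -np.eye(2))               # exp(A) = -I
assert np.allclose(expm(B), -np.eye(2))               # exp(B) = -I
assert np.allclose(expm(A + B), np.eye(2))            # exp(A+B) = I
```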