Proof using Taylor series

abstract-algebra, linear-algebra, quantum-mechanics, taylor-expansion

I am currently trying to solve a quantum mechanics problem in which I need to prove that $e^A e^{-A} = 1$, where $A$ is an operator and the exponential is defined by its Taylor series. However, I am having trouble proving this fact using Taylor series, because I don't think it would be valid to simply add the exponents and get $1$. I started by trying to prove $e^x e^{-x} = 1$ using the Taylor expansion rather than exponent properties, but I am stuck at $\sum_{n=0}^{\infty} \sum_{m=0}^{\infty}\frac{x^{m+n}}{m!\,n!} (-1)^m$. How do I proceed from here?

Best Answer

Actually the holomorphic functional calculus guarantees that there's no trouble with just adding the exponents, at least as long as $A$ is a bounded operator; more generally, if $f$ is any holomorphic function, we can make sense of $f(A)$ for $A$ a bounded operator (by applying the power series), and we have $f(A) g(A) = h(A)$ whenever $f(z) g(z) = h(z)$ as holomorphic functions.
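Before either argument, here is a quick numerical sanity check (not a proof), as a minimal sketch in Python assuming NumPy is available; the matrix size, random seed, and truncation order are arbitrary illustrative choices, with a random $4 \times 4$ matrix standing in for the bounded operator $A$:

```python
import numpy as np

def exp_series(A, terms=40):
    """Approximate e^A by a partial sum of its defining power series."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for n in range(1, terms):
        term = term @ A / n          # term is now A^n / n!
        result = result + term
    return result

# Illustrative stand-in for the operator A: a small random matrix.
rng = np.random.default_rng(seed=0)
A = rng.standard_normal((4, 4))

product = exp_series(A) @ exp_series(-A)
print(np.allclose(product, np.eye(4)))   # True, up to truncation and rounding error
```

Of course this only illustrates the identity for one finite-dimensional example; the arguments below are what prove it in general.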

Without using the functional calculus, we can argue as follows, again assuming $A$ is bounded. Consider

$$f(t) = e^{tA} e^{-tA}$$

where $t \in \mathbb{R}$ is a real parameter. It's not hard to show that differentiable functions from $\mathbb{R}$ to a Banach algebra obey the usual rules of calculus, such as the product rule (taking care with noncommutativity), and that the power series definition gives $\frac{d}{dt} e^{tA} = A e^{tA}$; this property, together with the initial condition $e^0 = 1$, in fact uniquely characterizes $e^{tA}$. Hence

$$\frac{df}{dt} = A e^{tA} e^{-tA} + e^{tA} (-A) e^{-tA} = 0$$

so $f$ is a constant, and $f(0) = 1$ gives $f(t) = 1$ identically. This argument can be generalized to show that $e^A e^B = e^{A+B}$ whenever $A, B$ commute, by considering the derivative of $e^{tA} e^{tB} e^{-t(A+B)}$.
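To spell out the two derivative computations being used (both are standard facts about convergent power series in a Banach algebra): differentiating the defining series term by term, which is justified because it converges absolutely in operator norm, uniformly for $t$ in bounded intervals, gives

$$\frac{d}{dt} e^{tA} = \frac{d}{dt} \sum_{n=0}^{\infty} \frac{t^n A^n}{n!} = \sum_{n=1}^{\infty} \frac{t^{n-1} A^n}{(n-1)!} = A \sum_{m=0}^{\infty} \frac{t^m A^m}{m!} = A e^{tA}.$$

For the commuting case, write $g(t) = e^{tA} e^{tB} e^{-t(A+B)}$; since $B$ commutes with $e^{tA}$ and $A + B$ commutes with both $e^{tA}$ and $e^{tB}$, the product rule gives

$$\frac{dg}{dt} = A\, g(t) + B\, g(t) - (A+B)\, g(t) = 0,$$

so $g(t) = g(0) = 1$ identically. Taking $t = 1$ and multiplying on the right by $e^{A+B}$, using $e^{-(A+B)} e^{A+B} = 1$ (the identity already proved, applied to $-(A+B)$), yields $e^A e^B = e^{A+B}$.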

If you really want to do this just using power series, the identity you want is equivalent to proving that for every $k$ we have

$$\sum_{m+n=k} {k \choose m} (-1)^m = \begin{cases} 0 \text{ if } k \ge 1 \\ 1 \text{ if } k = 0 \end{cases}$$

which is an easy combinatorial identity, following for example from inclusion-exclusion, or from the binomial theorem applied to $(1 - 1)^k$.
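To connect this with the double sum you were stuck at: the double series converges absolutely, so its terms may be regrouped by total degree $k = m + n$, after which the identity above finishes the computation:

$$\sum_{n=0}^{\infty} \sum_{m=0}^{\infty} \frac{(-1)^m x^{m+n}}{m!\, n!} = \sum_{k=0}^{\infty} \frac{x^k}{k!} \sum_{m=0}^{k} \binom{k}{m} (-1)^m = \sum_{k=0}^{\infty} \frac{(1-1)^k}{k!}\, x^k = 1,$$

since only the $k = 0$ term survives. The same regrouping works verbatim with $x$ replaced by the bounded operator $A$, because absolute convergence in operator norm again justifies the rearrangement.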
