Commutativity in Binomial Theorem

binomial theorem, linear algebra, matrices, matrix exponential

A question on a math exam asks us to prove that $e^{A+B}=e^Ae^B$ for matrices $A$ and $B$ that commute. Our professor has been emphatic about "having to show exactly where commutativity is required". I want to double-check that I have given the right answer.

I found a way to prove $e^{A+B}=e^Ae^B$ using the power series of $e^{A+B}$ and getting the binomial theorem out of it, i.e. you end up with $$e^{A+B}=\sum_{n=0}^\infty \frac{1}{n!}(A+B)^n=\sum_{n=0}^\infty \frac{1}{n!}\sum_{k=0}^n \binom{n}{k}A^kB^{n-k},$$ and that inner sum is just the binomial theorem. Expanding $(A+B)^n$ gives products of $A$'s and $B$'s in every possible order, and collecting them into the terms $\binom{n}{k}A^kB^{n-k}$ (the counting argument behind the binomial coefficients) requires reordering the factors, so we have to be able to move $A$'s past $B$'s freely. That is where commutativity is required. I don't need to explain the rest of the proof, I know it is correct, but I'm pretty sure this step in particular is what my professor is referring to as "where commutativity is required".
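If it helps to see that reordering step concretely, here is a minimal sketch (assuming SymPy is available) that expands $(A+B)^2$ with non-commutative symbols. The cross terms $AB$ and $BA$ stay separate, and the binomial form $A^2+2AB+B^2$ is only recovered if $BA$ may be replaced by $AB$:

```python
import sympy as sp

# Non-commutative symbols stand in for square matrices of the same size.
A, B = sp.symbols('A B', commutative=False)

# Expanding (A + B)**2 without commutativity keeps the cross terms separate.
expansion = sp.expand((A + B)**2)
print(expansion)  # A**2 + A*B + B*A + B**2

# The binomial-theorem form needs B*A collapsed into A*B,
# which is only legitimate when A and B commute.
difference = sp.expand(expansion - (A**2 + 2*A*B + B**2))
print(difference)  # B*A - A*B, which vanishes only if A and B commute
```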

Best Answer

Hint: By the definition of matrix multiplication and its properties, one gets $(A+B)^2=A^2+AB+BA+B^2$, which is not equal to $A^2+2AB+B^2$ unless $AB=BA$. Correspondingly, $e^{A+B}\neq e^Ae^B$ in general; if $AB=BA$, then $e^{A+B}=e^Ae^B$.

You can verify this easily with $2\times 2$ matrices chosen at random.
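For instance, a quick numerical check along those lines (a sketch assuming NumPy and SciPy, using `scipy.linalg.expm` for the matrix exponential):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Two random 2x2 matrices almost surely do not commute,
# and then exp(A+B) generally differs from exp(A) exp(B).
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))
print(np.allclose(A @ B, B @ A))                     # typically False
print(np.allclose(expm(A + B), expm(A) @ expm(B)))   # typically False

# If C commutes with A (here C is a polynomial in A), equality holds.
C = 2 * A + 3 * A @ A
print(np.allclose(A @ C, C @ A))                     # True
print(np.allclose(expm(A + C), expm(A) @ expm(C)))   # True
```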
