Hint $ $ Work universally, i.e. consider the matrix entries as indeterminates $\:\!\rm a_{\:\!ij},b_{\:\!ij}.\,$ Adjoin them to $\,\Bbb Z\,$ to get the polynomial ring $\rm R = \mathbb Z[a_{\:\!ij},b_{\:\!ij}].\, $ In this polynomial ring $\rm R,$ compute the determinant of $\rm\, (1+A B)\, A = A\, (1+BA)\,$ then cancel the nonzero polynomial $\rm\, det(A)\, $ (valid since $\rm R$ is a domain). $ $ Extend to non-square matrices by padding appropriately with $0$'s and $1$'s to get square matrices. Note that the proof is purely algebraic: it does not require any topological notions (e.g. density).
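The identity $\det(1+AB)=\det(1+BA)$ is also easy to sanity-check numerically, including the rectangular case; a minimal sketch using NumPy (the sizes $m,n$ and the random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A is m x n and B is n x m, so AB is m x m while BA is n x n.
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# det(1 + AB) = det(1 + BA) even though the two sides involve
# identity matrices of different sizes.
lhs = np.linalg.det(np.eye(m) + A @ B)
rhs = np.linalg.det(np.eye(n) + B @ A)
assert np.isclose(lhs, rhs)
```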
Alternatively, we may employ the Schur decomposition as follows:
$$\rm\left[ \begin{array}{cc}
1 & A \\
B & 1 \end{array} \right]\, =\, \left[ \begin{array}{cc}
1 & 0 \\
B & 1 \end{array} \right]\ \left[ \begin{array}{cc}
1 & 0 \\
0 & 1\!-\!BA \end{array} \right]\ \left[ \begin{array}{cc}
1 & A \\
0 & 1 \end{array} \right]\qquad$$
$$\rm\phantom{\left[ \begin{array}{cc}
1 & B \\
A & 1 \end{array} \right]}\, =\, \left[ \begin{array}{cc}
1 & A \\
0 & 1 \end{array} \right]\ \left[ \begin{array}{cc}
1\!-\!AB & 0 \\
0 & 1 \end{array} \right]\ \left[ \begin{array}{cc}
1 & 0 \\
B & 1 \end{array} \right]\qquad$$
See this answer for more on universality of polynomial identities, universal cancellation (before evaluation), and closely related topics, and see also this sci.math thread of 9 Nov 2007.
I recently encountered a way to distinguish these that I thought was rather elegant:
direct product/sum - $A \otimes B$, $A \oplus B$
Kronecker product/sum - $A \hat{\otimes} B$, $A \hat{\oplus} B$
So, for example, $I_3 \hat{\otimes} B = B \oplus B \oplus B$ clearly shows this specific Kronecker product is expressed in terms of direct sums.
However, the hat on the Kronecker product is not strictly necessary; it serves only to keep the notation consistent with the way the Kronecker and direct sums are distinguished in this scheme.
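The example $I_3 \hat{\otimes} B = B \oplus B \oplus B$ above can be verified directly; a small sketch, using NumPy's `kron` for the Kronecker product and SciPy's `block_diag` for the direct sum:

```python
import numpy as np
from scipy.linalg import block_diag

B = np.arange(1, 5).reshape(2, 2)

# Kronecker product with an identity on the left replicates B along
# the block diagonal, i.e. it equals the direct sum B + B + B.
assert np.allclose(np.kron(np.eye(3), B), block_diag(B, B, B))
```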
Reference:
Chirikjian, G. *Stochastic Models, Information Theory, and Lie Groups, Volume 1: Classical Results and Geometric Methods*. Applied and Numerical Harmonic Analysis. 2009.
I found this reference on Google Books when searching for "kronecker sum" vs "direct sum". I had not seen this notation, or anything like it, in the dozen or so other references I've encountered, so it caught my attention. In most references, the adjacent text simply explains which sum is meant. (Not that you asked, but to be complete: the products are less important to distinguish, since it is clear from the definitions of the symbols which product is meant. If the symbols represent matrices, the product is the Kronecker product; if they represent vector spaces, it is the direct product.)
What is to be proved is the following: $$ e^{A \otimes I_b + I_a \otimes B} = e^A \otimes e^B~$$ where $I_a, A \in M_n$ and $I_b, B \in M_m$.
This is true because $$ A \otimes I_b~~~~\text{and}~~~~ I_a \otimes B$$ commute, which can be shown using the so-called mixed-product property of the Kronecker product: $$ (A \otimes B)\cdot (C \otimes D) = (A\cdot C) \otimes (B\cdot D)~,$$ where $\cdot$ denotes the ordinary matrix product. Indeed, applying this property in both orders gives $(A \otimes I_b)(I_a \otimes B) = A \otimes B = (I_a \otimes B)(A \otimes I_b)$, and for commuting matrices $X,Y$ we have $e^{X+Y} = e^X e^Y$.
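Both the mixed-product property and the resulting commutativity can be confirmed numerically; a sketch with NumPy (dimensions and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2
A, C = rng.standard_normal((2, n, n))
B, D = rng.standard_normal((2, m, m))
Ia, Ib = np.eye(n), np.eye(m)

# Mixed-product property: (A x B)(C x D) = (AC) x (BD).
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

# Hence A x I_b and I_a x B commute: both products equal A x B.
X, Y = np.kron(A, Ib), np.kron(Ia, B)
assert np.allclose(X @ Y, np.kron(A, B))
assert np.allclose(X @ Y, Y @ X)
```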
One can also show that, for any matrix function $f$ defined by a power series, $$f(A\otimes I_b) = f(A)\otimes I_b~~~~\text{and}~~~~ f(I_b \otimes A) = I_b \otimes f(A)~.$$ Together with the commutativity noted above, this proves the result: $$e^{A \otimes I_b + I_a \otimes B} = e^{A\otimes I_b}\, e^{I_a \otimes B} = (e^A \otimes I_b)(I_a \otimes e^B) = e^A \otimes e^B~.$$
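Finally, the identity $e^{A \otimes I_b + I_a \otimes B} = e^A \otimes e^B$ itself can be checked with SciPy's `expm`; a small sketch on random matrices:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))

# Kronecker sum: A x I_b + I_a x B.
K = np.kron(A, np.eye(m)) + np.kron(np.eye(n), B)

# exp of the Kronecker sum equals the Kronecker product of the exps.
assert np.allclose(expm(K), np.kron(expm(A), expm(B)))
```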