[Math] Properties of the per-element exponential (Hadamard exponential) for matrices

eigenvalues-eigenvectors, exponential-function, linear-algebra

I'm asking this question mostly out of curiosity, though I do also have a potential application.

In linear algebra we usually define the matrix exponential as $e^A = I + A + \frac{1}{2}A^2 + \frac{1}{6}A^3 + \dots$, which has lots of nice properties. However, we could also define a different kind of "matrix exponentiation", which I'll write $e^{\circ A}$, where $(e^{\circ A})_{ij} = e^{A_{ij}}$, i.e. we just apply the exponential function to each element independently.
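To make the distinction concrete, here is a minimal NumPy sketch contrasting the two operations on a small nilpotent matrix (the example matrix is my own choice):

```python
import numpy as np
from scipy.linalg import expm  # the usual matrix exponential

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])     # nilpotent: A @ A = 0

# Usual matrix exponential: I + A + A^2/2 + ... = I + A here
usual = expm(A)                # [[1, 1], [0, 1]]

# Hadamard (entrywise) exponential: exp applied to each entry
hadamard = np.exp(A)           # [[1, e], [1, 1]]

print(usual)
print(hadamard)
```

Note that the two already disagree at the zero entries: the Hadamard exponential sends every $0$ entry to $1$, whereas $e^A$ of a nilpotent matrix stays sparse off the diagonal.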

After writing this question I guessed that the name of this operation would be "Hadamard exponential." An internet search revealed that it's mentioned by this name in a few textbooks and research papers, but in general I can find very little written about its properties from a linear algebra point of view. (I've edited this post to use what seems to be standard notation for the Hadamard exponential.)

One obvious thing is that it inherits all the usual properties of exponentiation, as long as we use the Hadamard product $(\circ)$ (i.e. per-element multiplication) instead of the usual matrix product. In particular $e^{\circ (A+B)} = e^{\circ A} \circ e^{\circ B}$, so the Schur product theorem immediately gives that if $e^{\circ A}$ and $e^{\circ B}$ are both positive definite then so is $e^{\circ (A+B)}$. Another obvious property is that for real matrices, the elements of $e^{\circ A}$ are all positive, and hence the Perron-Frobenius theorem applies.
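The Perron-Frobenius remark can be illustrated numerically (this is a sanity check on random data, not a proof): for any real $A$, the entries of $e^{\circ A}$ are strictly positive, so its spectral radius is attained at a real positive eigenvalue with an entrywise-positive eigenvector.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # arbitrary real matrix
H = np.exp(A)                        # Hadamard exponential: entrywise exp

w, V = np.linalg.eig(H)
i = np.argmax(np.abs(w))             # index of eigenvalue of largest modulus
perron = w[i].real                   # Perron root: real and positive
v = V[:, i].real
v = v / v[np.argmax(np.abs(v))]      # fix the overall sign

print(perron)                        # equals the spectral radius of H
print(v)                             # all entries positive
```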

However, what I would particularly like to know is whether anything can be said about the eigenvalues and eigenvectors of $e^{\circ A}$ in terms of the eigendecomposition of $A$. I suspect that there is no straightforward relationship in general, but I would expect there to be inequality constraints.
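A quick numerical experiment supports the suspicion that there is no simple relationship: the eigenvalues of $e^{\circ A}$ are not the exponentials of the eigenvalues of $A$, even in the symmetric $2 \times 2$ case (the example matrix is my own).

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])           # symmetric, eigenvalues 3 and -1

# Eigenvalues of the Hadamard exponential: e +/- e^2
entrywise = np.sort(np.linalg.eigvalsh(np.exp(A)))

# Exponentials of A's eigenvalues: e^{-1} and e^{3}
naive = np.sort(np.exp(np.linalg.eigvalsh(A)))

print(entrywise)
print(naive)
```

Note that $e^{\circ A}$ here even has a negative eigenvalue ($e - e^2 < 0$) despite all its entries being positive, so no simple exponential map of the spectrum can hold.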

In short, my question is, has the operation I've written $e^{\circ A}$ been studied in linear algebra, and what is known about its properties?

Best Answer

There are some wonderful theorems regarding entrywise functions of matrices, especially regarding positive definite matrices. In particular:

  • If $A,B$ are positive semidefinite, then so is $A \circ B$ (the Schur product theorem).
  • If $A$ is positive semidefinite, then so is $e^{\circ A}$.
  • Define $f[A]$ to be an entrywise function of (real) square matrices (of arbitrary size). Then $f$ takes positive semidefinite matrices to positive semidefinite matrices if and only if it is an analytic function whose power series has only non-negative coefficients.
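A small sanity check of these bullets, with example functions of my own choosing: entrywise $\exp$ (all power-series coefficients non-negative) preserves positive semidefiniteness, while entrywise $\sin$ (alternating signs in its series) can destroy it.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 4.0]])               # PSD: eigenvalues 5 and 3

# Entrywise exp: eigenvalues e^4 +/- e, both positive, so still PSD
exp_eigs = np.linalg.eigvalsh(np.exp(A))

# Entrywise sin: eigenvalues sin(4) +/- sin(1), and sin(4) < 0,
# so the result is no longer PSD
sin_eigs = np.linalg.eigvalsh(np.sin(A))

print(exp_eigs)
print(sin_eigs)
```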

These results are apparently important in the context of numerical analysis, especially when it comes to thresholding (i.e. rounding values to zero while keeping the resulting error to within certain bounds). See also this question on MO.

Another quick result: the Frobenius norm $\|\cdot\|$ is submultiplicative with respect to the Hadamard product, $$ \|A\circ B\| \leq \|A\| \cdot \|B\|. $$ Applying this term by term to the entrywise power series $e^{\circ A} = J + A + \frac{1}{2}A^{\circ 2} + \dots$, where $J$ is the all-ones matrix (with $\|J\| = n$ for an $n \times n$ matrix), gives $$ \left\| e^{\circ A}\right\| \leq n - 1 + e^{\|A\|}. $$ One last quick and useful result: if $u,v$ are column vectors, then $$ A \circ (uv^T) = \operatorname{diag}(u)\, A \operatorname{diag}(v). $$
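Both the submultiplicativity of the Frobenius norm under the Hadamard product and the rank-one identity $A \circ (uv^T) = \operatorname{diag}(u)\,A\,\operatorname{diag}(v)$ are easy to check numerically on random data (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)

fro = lambda M: np.linalg.norm(M, "fro")

# Submultiplicativity: ||A o B|| <= ||A|| ||B||
lhs = fro(A * B)                     # '*' is the Hadamard product in NumPy
rhs = fro(A) * fro(B)

# Rank-one identity: A o (u v^T) = diag(u) A diag(v)
left = A * np.outer(u, v)
right = np.diag(u) @ A @ np.diag(v)

print(lhs <= rhs)
print(np.allclose(left, right))
```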
