[Math] Summing infinitely many infinitesimally small variables makes sense in algebra

ac.commutative-algebra, positive-characteristic, reference-request

There is an identity $e^x=\lim_{n\to \infty} (1+x/n)^n$, and I always thought of it as a purely analytic statement. But then I discovered a curious interpretation of it in pure algebra:

Consider the ring of formal infinite sums of monomials in infinitely many variables $\varepsilon_1, \varepsilon_2,\ldots$ satisfying $\varepsilon_i^2=0$:

$$
R=\mathbb{Q}[\![\varepsilon_1, \varepsilon_2, \ldots]\!]/(\varepsilon_i^2: i=1,2,\ldots).
$$

Then the sum $x=\sum_{i=1}^\infty \varepsilon_i$ makes sense and is not infinitesimally small; in fact, we have
$$
x^n = n! \sum_{1\leq i_1<i_2<\ldots<i_n} \varepsilon_{i_1} \cdots \varepsilon_{i_n}.
$$
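
Here is a quick SymPy check of this in a finite truncation, with $N$ nilpotent variables standing in for the $\varepsilon_i$ (the cutoff $N=6$ and the helper `reduce_squares` are just choices made for this illustration):

```python
from itertools import combinations
from math import factorial
import sympy as sp

N = 6
eps = sp.symbols(f'e1:{N+1}')           # nilpotent variables e1, ..., eN standing in for eps_1, ..., eps_N

def reduce_squares(expr):
    """Reduce modulo (e_i^2): keep only the squarefree monomials of an expanded polynomial."""
    poly = sp.Poly(sp.expand(expr), *eps)
    kept = sp.Integer(0)
    for monom, coeff in poly.terms():
        if all(k <= 1 for k in monom):
            kept += coeff * sp.prod(v**k for v, k in zip(eps, monom))
    return sp.expand(kept)

x = sum(eps)                            # x = e_1 + ... + e_N
for n in range(N + 1):
    lhs = reduce_squares(x**n)
    rhs = factorial(n) * sum(sp.prod(c) for c in combinations(eps, n))
    assert sp.expand(lhs - rhs) == 0
print("x^n = n! * e_n(e_1,...,e_N) holds for n = 0, ...,", N)
```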

So the ring of polynomials $\mathbb{Q}[x]$ embeds into $R$. Moreover, in $R$ we have the identity
$$
\prod_{i=1}^\infty(1+\varepsilon_i) = \sum_{n=0}^\infty \frac{x^n}{n!}.
$$

So somehow we multiplied infinitely many elements, each infinitely close to $1$, managed to get away from $1$, and obtained the right answer.
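
The product identity can be checked in the same finite truncation; here is a self-contained sketch that represents an element of $\mathbb{Q}[\varepsilon_1,\ldots,\varepsilon_N]/(\varepsilon_i^2)$ as a dictionary keyed by squarefree index sets (again, the names and the cutoff $N=6$ are arbitrary choices for the illustration):

```python
from fractions import Fraction
from math import factorial

N = 6                                   # indices 0, ..., N-1 stand for eps_1, ..., eps_N

def mul(a, b):
    """Multiply two elements; any product reusing an index vanishes since eps_i^2 = 0."""
    out = {}
    for s, c in a.items():
        for u, d in b.items():
            if s & u:
                continue
            key = s | u
            out[key] = out.get(key, Fraction(0)) + c * d
    return out

one = {frozenset(): Fraction(1)}
x = {frozenset([i]): Fraction(1) for i in range(N)}          # x = eps_1 + ... + eps_N

# Left-hand side: the product over i of (1 + eps_i).
lhs = one
for i in range(N):
    lhs = mul(lhs, {frozenset(): Fraction(1), frozenset([i]): Fraction(1)})

# Right-hand side: sum_{n=0}^{N} x^n / n!  (higher powers of x vanish in the truncation).
rhs, xn = {}, one
for n in range(N + 1):
    for s, c in xn.items():
        rhs[s] = rhs.get(s, Fraction(0)) + c / factorial(n)
    xn = mul(xn, x)

assert lhs == rhs
print("prod(1 + eps_i) == sum x^n/n! verified for N =", N)
```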

I was wondering if this is well-known and if there are applications of this idea. For instance, one can probably use it to recover the formal neighborhood of $1$ in an algebraic group from the Lie algebra.

In positive characteristic the right-hand side doesn't make sense, but the left-hand side still does. In fact, the symmetric functions in $\{\varepsilon_i\}$ form a ring with a divided power structure. Can one build $p$-adic cohomology theories based on this idea instead of on divided power structures?
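
(To spell out the divided powers: writing $e_n(\varepsilon)=\sum_{i_1<\cdots<i_n}\varepsilon_{i_1}\cdots\varepsilon_{i_n}$ for the elementary symmetric function in the $\varepsilon_i$, one has
$$
e_m(\varepsilon)\, e_n(\varepsilon) = \binom{m+n}{m}\, e_{m+n}(\varepsilon),
$$
since each squarefree monomial of degree $m+n$ arises from exactly $\binom{m+n}{m}$ ways of splitting its index set into a degree-$m$ part and a degree-$n$ part. So $e_n(\varepsilon)$ plays the role of the divided power $\gamma_n(x)=x^n/n!$ even when $n!$ is not invertible.)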

Best Answer

$\DeclareMathOperator{\ex}{ex}$ One way to look at this is through symmetric functions. To be consistent with standard notation, I'll take infinitely many variables $x_1, x_2,\dots$ (instead of $\varepsilon_1, \varepsilon_2,\ldots$). I will follow the presentation in Richard Stanley's *Enumerative Combinatorics*, Vol. 2, page 304.

There is a homomorphism $\ex$ from the ring of symmetric functions (of unbounded degree) to formal power series in $t$, defined by $\ex(p_1) = t$ and $\ex(p_n)=0$ for $n>1$, where $p_n$ is the power sum symmetric function $x_1^n+x_2^n+\cdots$. Then $\ex$ is the restriction to symmetric functions of the homomorphism on all formal power series in $x_1,x_2,\dots$ that sends each $x_i^2$ to $0$ (where $t$ is the image of $p_1$, i.e., the element called $x$ in the question). It has the property that for any symmetric function $f$,
$$\ex(f) = \sum_{n=0}^\infty \bigl([x_1x_2\cdots x_n]\, f\bigr) \frac{t^n}{n!},$$
where $[x_1x_2\cdots x_n]\, f$ denotes the coefficient of $x_1x_2\cdots x_n$ in $f$. In particular, $\ex(h_n) = \ex(e_n) = t^n/n!$, where $h_n$ and $e_n$ are the complete homogeneous and elementary symmetric functions.
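
Here is a small finite-variable sketch of this specialization (my own illustration; the cutoff of five variables and the helper names are arbitrary choices), checking $\ex(e_n)=\ex(h_n)=t^n/n!$ together with the defining values on the power sums:

```python
from itertools import combinations, combinations_with_replacement
from math import factorial
import sympy as sp

N = 5                                   # number of variables kept; enough for degrees up to 5
xs = sp.symbols(f'x1:{N+1}')
t = sp.Symbol('t')

def ex(f):
    """Exponential specialization via ex(f) = sum_n [x_1...x_n] f * t^n/n!, finitely truncated."""
    coeffs = sp.Poly(sp.expand(f), *xs).as_dict()
    return sum(coeffs.get(tuple([1]*n + [0]*(N - n)), 0) * t**n / factorial(n)
               for n in range(N + 1))

def e(n):                               # elementary symmetric function e_n
    return sum(sp.prod(c) for c in combinations(xs, n))

def h(n):                               # complete homogeneous symmetric function h_n
    return sum(sp.prod(c) for c in combinations_with_replacement(xs, n))

def p(n):                               # power sum p_n
    return sum(v**n for v in xs)

for n in range(1, N + 1):
    assert ex(e(n)) == t**n / factorial(n)
    assert ex(h(n)) == t**n / factorial(n)
assert ex(p(1)**3) == t**3              # consistent with ex(p_1) = t and ex being a homomorphism
assert ex(p(2)) == 0                    # ex(p_n) = 0 for n > 1
print("ex(e_n) = ex(h_n) = t^n/n! checked for n = 1, ...,", N)
```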

This idea is well known and is very useful in enumerative combinatorics. It allows one to derive exponential generating functions for objects with distinct labels (e.g., permutations or standard Young tableaux) from symmetric function generating functions for objects with repeated labels (e.g., words or semistandard tableaux). There are related homomorphisms that preserve more information; see, for example, section 7.8 of Stanley.
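
As a small illustration of that principle: the Schur function $s_\lambda$ is the generating function for semistandard tableaux, and the coefficient formula above gives $\ex(s_\lambda)=f^\lambda\, t^n/n!$, where $f^\lambda$ is the number of standard Young tableaux of shape $\lambda$, since $[x_1\cdots x_n]\, s_\lambda = f^\lambda$. The sketch below computes $s_{(2,1)}$ in three variables via the bialternant formula (used here only as a convenient way to get $s_{(2,1)}$ explicitly) and reads off $f^{(2,1)}=2$:

```python
import sympy as sp

xs = sp.symbols('x1:4')                  # three variables x1, x2, x3
lam = (2, 1, 0)                          # the partition (2,1), padded with a zero

# Bialternant formula: s_lambda = det(x_i^(lambda_j + n - j)) / det(x_i^(n - j)), with n = 3.
num = sp.Matrix(3, 3, lambda i, j: xs[i]**(lam[j] + 2 - j))
den = sp.Matrix(3, 3, lambda i, j: xs[i]**(2 - j))
s21 = sp.expand(sp.cancel(num.det() / den.det()))

poly = sp.Poly(s21, *xs)
print(poly.as_expr())                    # each monomial records the content of a semistandard tableau
print(poly.as_dict().get((1, 1, 1), 0))  # 2 = f^(2,1), the number of standard tableaux of shape (2,1)
```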
