Combinatorics – Stirling Numbers and Inverse Matrices

Tags: combinatorics, matrices, stirling-numbers

Let $s(m,n)$ be the Stirling numbers of the first kind and $S(m,n)$ the Stirling numbers of the second kind.

The matrices $$\mathcal{S}_N := (S(m,n))_{0 \leq m,n \leq N} \text{ and } \mathcal{s}_N := (s(m,n))_{0 \leq m,n \leq N}$$
are inverse to each other.

This is stated in our lecture notes, but why does it hold?

Wikipedia says:

$$\sum_{n=0}^{\max\{j,k\}} (-1)^{n-k} \left[{n\atop j}\right] \left\{{k\atop n}\right\} = \delta_{jk} = \sum_{n=0}^{\max\{j,k\}} (-1)^{n-k} \left\{{n\atop j}\right\} \left[{k\atop n}\right]$$

where $\delta_{jk}$ is the Kronecker delta. But this is not an explanation I understand, and our lecture did not cover the Kronecker delta.

Could anyone please explain this to me?

Thank you in advance!

Best Answer

By definition (at least the one I'm used to), the Stirling numbers of the first kind $S_1(n,k)$ satisfy $$ (x)_n = \sum_{k=0}^n S_1(n,k) x^k, $$ and the Stirling numbers of the second kind $S_2(n,k)$ satisfy $$ x^n = \sum_{k=0}^n S_2(n,k) (x)_k, $$ where $(x)_n = x(x-1)\cdots(x-n+1)$ is the falling factorial (with $(x)_0 = 1$). Substituting the first expansion into the second yields $$ x^n = \sum_{k=0}^n \sum_{l=0}^k S_2(n,k) S_1(k,l) x^l = \sum_{l=0}^n x^l \left( \sum_{k=l}^n S_2(n,k) S_1(k,l) \right). $$ Comparing coefficients of $x^l$ on both sides, we see that $$ \sum_{k=l}^n S_2(n,k) S_1(k,l) = \begin{cases} 1 & \text{if } l = n, \\ 0 & \text{if } l \neq n, \end{cases} $$ which is exactly $\delta_{ln}$.

If we define $S_{\nu}(a,b)=0$ for $a<b$, then this sum is just the product of the $n^{th}$ row of $S_N$ with the $l^{th}$ column of $s_N$; i.e., the $(n,l)^{th}$ entry of the matrix product $S_N s_N$ is $\delta_{nl}$. Thus their product is the identity matrix, and hence they're matrix inverses of each other.
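The matrix computation above can be checked numerically for small $N$. Here is a sketch of my own (not from the answer), building both matrices from the standard recurrences $S(m+1,n) = n\,S(m,n) + S(m,n-1)$ and, for the signed first kind, $s(m+1,n) = s(m,n-1) - m\,s(m,n)$:

```python
N = 6

# Stirling numbers of the second kind: S(m+1, n) = n*S(m, n) + S(m, n-1).
S = [[0] * (N + 1) for _ in range(N + 1)]
S[0][0] = 1
for m in range(N):
    for n in range(1, N + 1):
        S[m + 1][n] = n * S[m][n] + S[m][n - 1]

# Signed Stirling numbers of the first kind:
# s(m+1, n) = s(m, n-1) - m*s(m, n).
s = [[0] * (N + 1) for _ in range(N + 1)]
s[0][0] = 1
for m in range(N):
    for n in range(1, N + 1):
        s[m + 1][n] = s[m][n - 1] - m * s[m][n]

# The (m, l) entry of S_N * s_N should be the Kronecker delta delta_{ml}.
product = [
    [sum(S[m][k] * s[k][l] for k in range(N + 1)) for l in range(N + 1)]
    for m in range(N + 1)
]
identity = [[int(m == l) for l in range(N + 1)] for m in range(N + 1)]
assert product == identity
```

Note that entries below the diagonal are never written and stay $0$, which is exactly the convention $S_\nu(a,b)=0$ for $a<b$ used above.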

[Edit] Another way (less computation, slightly more hand-wavy) to see this is that $S_N$ is the change of basis matrix (for the space of polynomials) from $\{1,x,x^2,\dots\}$ to $\{(x)_0, (x)_1, \dots\}$, and $s_N$ is the change of basis matrix going the other way. Hence the linear transformation $S_Ns_N$ takes the coefficients in terms of $\{x^i\}$ to coefficients in terms of $\{(x)_i\}$ and then back to $\{x^i\}$, i.e. it's the identity.
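The round trip through the two bases can also be sketched concretely. The following Python snippet (helper names are mine, not from the answer) converts the monomial coefficients of $p(x) = x^3 + 2x$ into falling-factorial coefficients via $S_2$, then back via the signed $S_1$, recovering the original coefficients:

```python
N = 5

def stirling_matrices(N):
    # Build the (N+1)x(N+1) matrices of Stirling numbers of the second
    # kind S(m, n) and the signed first kind s(m, n) via their recurrences.
    S = [[0] * (N + 1) for _ in range(N + 1)]
    s = [[0] * (N + 1) for _ in range(N + 1)]
    S[0][0] = s[0][0] = 1
    for m in range(N):
        for n in range(1, N + 1):
            S[m + 1][n] = n * S[m][n] + S[m][n - 1]
            s[m + 1][n] = s[m][n - 1] - m * s[m][n]
    return S, s

S, s = stirling_matrices(N)

# p(x) = x^3 + 2x as monomial coefficients a[n] (index n = power of x).
a = [0, 2, 0, 1, 0, 0]

# Falling-factorial coefficients: b[k] = sum_n a[n] * S(n, k),
# since x^n = sum_k S(n, k) (x)_k.
b = [sum(a[n] * S[n][k] for n in range(N + 1)) for k in range(N + 1)]

# Back to monomial coefficients via the first kind: recovers a exactly.
back = [sum(b[k] * s[k][l] for k in range(N + 1)) for l in range(N + 1)]
assert back == a
```

For example, $x^3 = (x)_3 + 3(x)_2 + (x)_1$, so the falling-factorial coefficients come out as $b = (0, 3, 3, 1, 0, 0)$, and applying the first-kind matrix returns $(0, 2, 0, 1, 0, 0)$, illustrating that the two transformations compose to the identity.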
