MacMahon Master Theorem – Non-Matching Coefficients

co.combinatorics polynomials reference-request

Let $A$ be a complex $n \times n$ matrix and let $x_1, \dots, x_n$ be commuting variables. Let $X_i = \sum_j a_{ij}x_j$. MacMahon's Master Theorem (MMT) states that
\begin{align}
[x_1^{p_1} \dots x_n^{p_n}] X_1^{p_1} \dots X_n^{p_n} = [s_1^{p_1} \dots s_n^{p_n}] \det(I- SA)^{-1}
\end{align}

where $[m]P$ means the coefficient of the monomial $m$ in the polynomial or formal power series $P$ and $ S = \text{Diag}(s_1,\dots, s_n)$. Is there any similar result in the literature for
\begin{align}
[x_1^{p_1} \dots x_n^{p_n}] X_1^{q_1} \dots X_n^{q_n}
\end{align}

where $\sum_i p_i = \sum_i q_i$?
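(For concreteness, here is a small sympy sanity check of the classical statement for $n = 2$ and $(p_1, p_2) = (2, 1)$; the symbolic $2\times 2$ matrix and these exponents are just an illustrative choice.)

```python
import sympy as sp

# Sanity check of the MMT for n = 2 and (p1, p2) = (2, 1).

a11, a12, a21, a22 = sp.symbols('a11 a12 a21 a22')
x1, x2, s1, s2 = sp.symbols('x1 x2 s1 s2')

A = sp.Matrix([[a11, a12], [a21, a22]])
X1 = a11 * x1 + a12 * x2            # X_i = sum_j a_ij x_j
X2 = a21 * x1 + a22 * x2

# Left-hand side: [x1^2 x2] X1^2 X2.
lhs = sp.expand(X1**2 * X2).coeff(x1, 2).coeff(x2, 1)

# Right-hand side: [s1^2 s2] det(I - S A)^{-1} with S = diag(s1, s2).
# The wanted coefficient has total degree 3 in the s's and u = 1 - det(I - SA)
# has no constant term, so truncating the geometric series at u^3 is exact.
S = sp.diag(s1, s2)
u = sp.expand(1 - sp.det(sp.eye(2) - S * A))
series = sp.expand(sum(u**k for k in range(4)))
rhs = series.coeff(s1, 2).coeff(s2, 1)

print(sp.expand(lhs - rhs))          # prints 0
```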

There are numerous generalizations of the MMT; however, I have been unable to find any result for this particular case.

Best Answer

The reason I asked this question is that I found such a generalization of the MMT and did not know whether it already exists in the literature. The proof makes extensive use of operator calculus: differential operators are manipulated as if they were numbers. The MMT and its generalization are immediate consequences of the following result.

Lemma. Let $S, A$ be $n\times n$ matrices of commuting variables. Then \begin{align} \exp ( \partial_x^T S \partial_y ) \exp (y^T A x)\Big|_{x=y=0} = \det(I-S A)^{-1}, \end{align} where both sides are interpreted as formal power series in $(s_{ij}), (a_{ij})$.
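Before using the lemma, here is a quick order-by-order sympy check: scale $S$ by a bookkeeping parameter $t$ and compare the Taylor coefficients in $t$ of the two sides. The choices $n = 2$, the truncation order, and the rational test matrices are mine, purely for illustration.

```python
import sympy as sp

# Order-by-order check of the lemma: expand both sides in a parameter t
# scaling S, up to order N, and compare.

n, N = 2, 6
t = sp.symbols('t')
x = sp.symbols(f'x1:{n + 1}')
y = sp.symbols(f'y1:{n + 1}')

S = sp.Matrix([[sp.Rational(1, 3), sp.Rational(1, 5)],
               [sp.Rational(2, 7), sp.Rational(1, 2)]])
A = sp.Matrix([[sp.Rational(1, 4), sp.Rational(1, 6)],
               [sp.Rational(1, 9), sp.Rational(2, 5)]])

# exp(y^T A x) truncated at total degree N: higher terms cannot survive at
# x = y = 0 after at most N applications of D = d_x^T S d_y.
yAx = sum(y[i] * A[i, j] * x[j] for i in range(n) for j in range(n))
f = sp.expand(sum(yAx**k / sp.factorial(k) for k in range(N + 1)))

def D(expr):
    """One application of the operator d_x^T S d_y."""
    return sp.expand(sum(S[i, j] * sp.diff(expr, x[i], y[j])
                         for i in range(n) for j in range(n)))

# Left-hand side: sum_{k <= N} t^k / k! * D^k exp(y^T A x) |_{x=y=0}.
lhs, g = sp.Integer(0), f
zero = {v: 0 for v in x + y}
for k in range(N + 1):
    lhs += t**k / sp.factorial(k) * g.subs(zero)
    g = D(g)

# Right-hand side: det(I - t S A)^{-1}, expanded to the same order in t.
rhs = sp.series(sp.det(sp.eye(n) - t * S * A)**(-1), t, 0, N + 1).removeO()

print(sp.expand(lhs - rhs))  # prints 0: the sides agree through order t^N
```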

Proof. Let $C, M$ be symmetric matrices of the same size and let $X \sim \mathcal N(0, C)$ (formally). Using $\mathbb E \exp(v^T X) = \exp(\tfrac{1}{2} v^T C v)$ with $v = \partial_u$, together with the fact that $\exp(\partial_u^T X)$ acts as the shift $u \mapsto u + X$, we get \begin{align*} \exp \left( \frac{1}{2}\partial_u^T C \partial_u \right) \exp \left(-\frac{1}{2}u^T M u \right) \Big|_{u=0} &= \mathbb E \exp(\partial_u^T X) \exp \left( -\frac{1}{2}u^T M u \right) \Big|_{u=0} \\ &= \mathbb E \exp \left( -\frac{1}{2}X^T M X \right) \\ &= \det(I+CM)^{-1/2}. \end{align*} The lemma is obtained by choosing \begin{align*} C = \begin{pmatrix} O & S \\ S^T & O \end{pmatrix}, \quad M=-\begin{pmatrix} O & A^T \\ A & O\end{pmatrix}, \quad u=\begin{pmatrix} x \\ y\end{pmatrix}, \end{align*} for which $\tfrac12 \partial_u^T C \partial_u = \partial_x^T S \partial_y$, $-\tfrac12 u^T M u = y^T A x$, and $\det(I+CM) = \det(I-SA)\det(I-AS) = \det(I-SA)^2$. $\blacksquare$
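Here is a small sympy check (with $n = 2$ and fully symbolic entries, my own illustrative choice) of that last block computation.

```python
import sympy as sp

# Check that det(I + C M) = det(I - S A)^2 for the block choice of C and M
# used in the proof, so that det(I + C M)^{-1/2} = det(I - S A)^{-1}.

n = 2
S = sp.Matrix(n, n, lambda i, j: sp.Symbol(f's{i + 1}{j + 1}'))
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i + 1}{j + 1}'))
O = sp.zeros(n, n)

C = sp.BlockMatrix([[O, S], [S.T, O]]).as_explicit()
M = -sp.BlockMatrix([[O, A.T], [A, O]]).as_explicit()

lhs = sp.det(sp.eye(2 * n) + C * M)
rhs = sp.det(sp.eye(n) - S * A)**2
print(sp.expand(lhs - rhs))  # prints 0
```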

Now let $E$ be any subset of $[n]^2$ and let $S$ be an $n \times n$ matrix supported on $E$, i.e. $s_{ij}=0$ if $(i,j) \notin E$. By the lemma,
\begin{align} \det(I-S A)^{-1} &= \exp \left( \sum_{(i,j)\in E} s_{ij} \partial_{x_i} \partial_{y_j} \right) \exp(y^T A x)\Big|_{x=y=0}, \end{align}
which gives
\begin{align} \left[ \prod_{(i,j)\in E} s_{ij}^{k_{ij}} \right] \det(I-SA)^{-1} &= \left( \prod_{(i,j) \in E} \frac{\partial_{x_i}^{k_{ij}} \partial_{y_j}^{k_{ij}} }{k_{ij}!} \right) \exp(y^T A x)\Big|_{x=y=0}\\ &= \frac{p_1!\dots p_n!}{\prod_{(i,j)\in E} k_{ij}!} \, [x_1^{p_1} \dots x_n^{p_n}] X_1^{q_1} \dots X_n^{q_n},
\end{align}
where
\begin{align} p_i = \sum_j k_{ij}, \quad q_j = \sum_i k_{ij}. \tag{1} \end{align}
Therefore
\begin{align} [x_1^{p_1} \dots x_n^{p_n}] X_1^{q_1} \dots X_n^{q_n} = \frac{\prod_{(i,j)\in E} k_{ij}!}{p_1!\dots p_n!} \left[\prod_{(i,j)\in E} s_{ij}^{k_{ij}} \right] \det(I - SA)^{-1}. \tag{2} \end{align}
When $E$ is the diagonal (so $k_{ii} = p_i = q_i$), this is exactly the MMT. For general $p_1, \dots, p_n, q_1, \dots, q_n$ with $\sum_i p_i = \sum_i q_i$ we can find an $E \subseteq [n]^2$ with no more than $2n+1$ elements and nonnegative integers $k_{ij}$, $(i,j)\in E$, such that $(1)$ holds, and then $(2)$ gives a generalization of the MMT.
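As a concrete illustration of $(2)$, take $n = 2$, $p = (2,1)$, $q = (1,2)$ and the (non-unique) choice $E = \{(1,1),(1,2),(2,2)\}$ with $k_{11} = k_{12} = k_{22} = 1$, which satisfies $(1)$. A sympy check of the resulting identity:

```python
import sympy as sp

# Check formula (2) for n = 2, p = (2, 1), q = (1, 2),
# E = {(1,1), (1,2), (2,2)}, k_11 = k_12 = k_22 = 1.

a11, a12, a21, a22 = sp.symbols('a11 a12 a21 a22')
s11, s12, s22 = sp.symbols('s11 s12 s22')
x1, x2 = sp.symbols('x1 x2')

A = sp.Matrix([[a11, a12], [a21, a22]])
S = sp.Matrix([[s11, s12], [0, s22]])   # supported on E, i.e. s21 = 0

# Left-hand side: [x1^2 x2] X1^1 X2^2 with X_i = sum_j a_ij x_j.
X1 = a11 * x1 + a12 * x2
X2 = a21 * x1 + a22 * x2
lhs = sp.expand(X1 * X2**2).coeff(x1, 2).coeff(x2, 1)

# Right-hand side: (1!1!1!)/(2!1!) * [s11 s12 s22] det(I - S A)^{-1}.
# The wanted coefficient has total degree 3 in the s's, so truncating the
# geometric series 1/(1-u) = 1 + u + u^2 + u^3 + ... at u^3 is exact.
u = sp.expand(1 - sp.det(sp.eye(2) - S * A))
series = sp.expand(sum(u**k for k in range(4)))
coeff = series.coeff(s11, 1).coeff(s12, 1).coeff(s22, 1)
rhs = sp.Rational(1, 2) * coeff

print(sp.expand(lhs - rhs))             # prints 0
```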
