[Math] Direct evaluation of Multivariate Complex Gaussian Integral without using analytic continuation

calculus, complex-analysis, contour-integration, gaussian-integral

Consider the integral

$$
\int_{\mathbb{R}^n}dx\,e^{-\frac12 x^TAx}=\frac{(2\pi)^{n/2}}{\sqrt{\det A}}
$$

where $A=A^T$ is a symmetric $n\times n$ complex matrix with positive definite real part.

Question: can we explicitly calculate this integral (for complex $A$) without using analytic continuation?

Motivation: the standard proof of the above result starts off with a real $A$ and uses the Cholesky decomposition to decouple the integral into $n$ one-dimensional Gaussians. (Diagonalizing $A$ with an orthogonal matrix, whose Jacobian is $J=1$, essentially does the same.) One then argues that, as long as the real part of $A$ remains positive definite, both sides are holomorphic in $A$, so by analytic continuation the integral must equal the right-hand side even for complex $A$ (see a good discussion of this here).
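For example, a quick numerical check of the real case (with an arbitrarily chosen $2\times 2$ matrix) confirms the Cholesky decoupling and the closed form:

```python
import numpy as np
from scipy import integrate

# A real symmetric positive definite 2x2 matrix (entries chosen arbitrarily).
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Cholesky decomposition A = L L^T decouples the quadratic form:
# x^T A x = ||L^T x||^2, and det A = (det L)^2.
L = np.linalg.cholesky(A)
assert np.isclose(np.linalg.det(L)**2, np.linalg.det(A))

# Brute-force quadrature vs. the closed form (2*pi)^{n/2}/sqrt(det A).
f = lambda y, x: np.exp(-0.5 * np.array([x, y]) @ A @ np.array([x, y]))
val, _ = integrate.dblquad(f, -10, 10, -10, 10)
print(val, 2*np.pi / np.sqrt(np.linalg.det(A)))
```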

My question is motivated by the observation that for $n=1$ everything is scalar, $\det A=A$, and one can prove the above result for $\Re A>0$ using Cauchy's theorem and contour integration with the complex Jacobian $J=\sqrt{A}$, $\arg\sqrt{A}\in(-\pi/4,\pi/4)$. There is no need for analytic continuation (unless of course you want to go to $\Re A<0$); see the proof here.
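The $n=1$ claim itself is easy to verify numerically (the value of $a$ below is arbitrary, subject to $\Re a>0$):

```python
import numpy as np
from scipy import integrate

a = 1.0 + 2.0j   # arbitrary, Re(a) > 0

# Integrate real and imaginary parts separately; the Gaussian decay of
# |exp(-a x^2 / 2)| makes finite bounds sufficient.
re, _ = integrate.quad(lambda x: np.exp(-0.5*a*x**2).real, -10, 10)
im, _ = integrate.quad(lambda x: np.exp(-0.5*a*x**2).imag, -10, 10)

# Principal square root: arg(sqrt(2*pi/a)) lies in (-pi/4, pi/4).
print(re + 1j*im, np.sqrt(2*np.pi/a))
```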

So I wonder if there exists a direct proof for $n>1$ using some variant of Cauchy's theorem in $\mathbb{C}^n$? Or some other way of integrating by substitution with complex Jacobians, without having to rely on analytic continuation?

Best Answer

Proposition. Let $S=S^T\in {\rm Mat}_{n\times n}(\mathbb{C})$ be a symmetric complex $n\times n$ matrix whose real part ${\rm Re}(S)>0$ is positive definite. Then (i) the matrix $S$ is invertible, and (ii) the Gaussian integral is $$ I~:=~\int_{\mathbb{R}^n} \! d^nx ~e^{-\frac{1}{2}x^T Sx} ~=~\sqrt{\frac{(2\pi)^n}{\det S }}. \tag{1}$$

Remark. Note that the condition$^1$ $${\rm Re}(S)~>~0\qquad\Leftrightarrow\qquad \forall x~\in~\mathbb{R}^n\backslash\{0\}:~~x^T{\rm Re}(S) x~>~0 \tag{2}$$ ensures that the integrand in (1) is Lebesgue integrable.

Remark. We will below give a proof of the proposition that does not rely on analytic continuation of the $S$ matrix, as requested by OP.
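Remark. Eq. (1) is also easy to test numerically before proving it. A minimal sketch for $n=2$ (the matrix below is an arbitrary example; for it the principal branch of the square root happens to be the correct one):

```python
import numpy as np
from scipy import integrate

# Arbitrary symmetric complex 2x2 matrix with positive definite real part.
S = np.array([[2.0 + 1.0j, 0.3 - 0.7j],
              [0.3 - 0.7j, 1.5 + 0.4j]])
assert np.all(np.linalg.eigvalsh(S.real) > 0)   # condition (2)

def integrand(y, x, part):
    v = np.array([x, y])
    return getattr(np.exp(-0.5 * v @ S @ v), part)

re, _ = integrate.dblquad(integrand, -8, 8, -8, 8, args=("real",))
im, _ = integrate.dblquad(integrand, -8, 8, -8, 8, args=("imag",))
print(re + 1j*im, np.sqrt((2*np.pi)**2 / np.linalg.det(S)))
```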

Induction proof of the proposition. The case $n=1$ is e.g. proven in my Phys.SE answer here. Next consider the induction step. Write $$ S~=~\begin{pmatrix} a & b^T \cr b & C \end{pmatrix}, \qquad a~\in~\mathbb{C}, \quad b~\in~\mathbb{C}^{n-1}, \quad C~\in~{\rm Mat}_{(n-1) \times (n-1)}(\mathbb{C}). \tag{3}$$ It follows that $${\rm Re}(a)~\stackrel{(2)+(3)}{>}~0, \tag{4}$$ which implies that $a\neq 0$. Define $$ \gamma~:=~\alpha + i \beta~:=~b/a~\in~\mathbb{C}^{n-1}, \qquad \alpha , \beta~\in~\mathbb{R}^{n-1}, \tag{5}$$ $$ S^{\prime}~:=~C-\gamma a \gamma^T ~\in~{\rm Mat}_{(n-1) \times (n-1)}(\mathbb{C}), \tag{6}$$ and $$ \widetilde{S}^{\prime}~:=~S^{\prime}-\beta a \beta^T ~\in~{\rm Mat}_{(n-1) \times (n-1)}(\mathbb{C}). \tag{7}$$

Then $$ S~\stackrel{(3)+(5)+(6)}{=}~\begin{pmatrix} 1 & 0 \cr \gamma & \mathbb{1} \end{pmatrix} \begin{pmatrix} a & 0 \cr 0 & S^{\prime} \end{pmatrix} \begin{pmatrix} 1 & \gamma^T \cr 0 & \mathbb{1} \end{pmatrix} ~\stackrel{(9)}{=}~\begin{pmatrix} 1 & 0 \cr \alpha & \mathbb{1} \end{pmatrix} \widetilde{S} \begin{pmatrix} 1 & \alpha^T \cr 0 & \mathbb{1} \end{pmatrix}, \tag{8}$$ where $$ \widetilde{S}~:=~\begin{pmatrix} 1 & 0 \cr i\beta & \mathbb{1} \end{pmatrix} \begin{pmatrix} a & 0 \cr 0 & S^{\prime} \end{pmatrix} \begin{pmatrix} 1 & i\beta^T \cr 0 & \mathbb{1} \end{pmatrix} ~\stackrel{(7)}{=}~\begin{pmatrix} a & ia\beta^T \cr ia\beta & \widetilde{S}^{\prime} \end{pmatrix} . \tag{9}$$

Eq. (8) implies that ${\rm Re}(\widetilde{S})>0$, since the outer factors in the second decomposition are real and invertible. Eq. (9) then implies that ${\rm Re}(\widetilde{S}^{\prime})>0$, because ${\rm Re}(\widetilde{S}^{\prime})$ is a principal submatrix of ${\rm Re}(\widetilde{S})$. Eq. (4) implies that ${\rm Re}(\beta a \beta^T)={\rm Re}(a)\beta\beta^T\geq 0$. Eq. (7) then implies that ${\rm Re}(S^{\prime})>0$. [Here we have used that positive (semi)definite matrices form a convex cone.] By induction we then know that $S^{\prime}$ is invertible. Eq. (8) then implies that $S$ is invertible.

The integral becomes $$ \begin{align}I~\stackrel{(1)+(8)}{=}&~\int_{\mathbb{R}^n} \! d^nx ~e^{-\frac{1}{2}x^T \widetilde{S}x}\cr ~\stackrel{(9)}{=}~&\int_{\mathbb{R}^{n-1}} \! d^{n-1}x^{\prime} ~e^{-\frac{1}{2}x^{\prime T} S^{\prime}x^{\prime}} \int_{\mathbb{R}+i\beta^Tx^{\prime}} \! dz~e^{-\frac{1}{2}az^2}\cr ~=~&\int_{\mathbb{R}^{n-1}} \! d^{n-1}x^{\prime} ~e^{-\frac{1}{2}x^{\prime T} S^{\prime}x^{\prime}} \int_{\mathbb{R}} \! dz~e^{-\frac{1}{2}az^2}\cr ~=~&\sqrt{\frac{(2\pi)^{n-1}}{\det S^{\prime}}} \sqrt{\frac{2\pi}{a}}\cr ~\stackrel{(8)}{=}~&\sqrt{\frac{(2\pi)^n}{\det S}} . \end{align}\tag{10}$$ Here the first equality uses the unit-Jacobian real substitution $x\to \begin{pmatrix} 1 & \alpha^T \cr 0 & \mathbb{1} \end{pmatrix}x$; the second follows from completing the square, $x^T\widetilde{S}x=az^2+x^{\prime T}S^{\prime}x^{\prime}$ with $x=(x_1,x^{\prime})$ and $z:=x_1+i\beta^Tx^{\prime}$; the fourth uses the induction hypothesis together with the $n=1$ case; and the last uses $\det S=a\det S^{\prime}$, which follows from eq. (8). To perform the one-dimensional $z$-integral, we have used that the horizontal integration contour $\mathbb{R}+i\beta^Tx^{\prime}$ can be shifted back to the real axis $\mathbb{R}$ without changing the value of the integral, cf. Cauchy's integral theorem. $\Box$
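The reduction (3)-(10) can moreover be run numerically. The following sketch (the helper `gauss_recursive` is mine, purely for illustration) peels off $a=S_{11}$ and the Schur complement $S^{\prime}=C-bb^T/a$ at each step and accumulates the factors $\sqrt{2\pi/a}$; comparing squares sidesteps the choice of branch of the square root:

```python
import numpy as np

def gauss_recursive(S):
    """Evaluate sqrt((2*pi)^n / det S) by the proof's recursion."""
    a = S[0, 0]
    assert a.real > 0                       # eq. (4), from Re(S) > 0
    if S.shape[0] == 1:
        return np.sqrt(2*np.pi / a)         # the n = 1 base case
    b = S[1:, 0]
    C = S[1:, 1:]
    Sp = C - np.outer(b, b) / a             # S' of eq. (6); Re(S') > 0 again
    return np.sqrt(2*np.pi / a) * gauss_recursive(Sp)

# Random symmetric S = A + iB with A positive definite, B symmetric.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); A = A @ A.T + 0.5*np.eye(4)
B = rng.standard_normal((4, 4)); B = B + B.T
S = A + 1j*B

# Compare squares: prod_k (2*pi/a_k) must equal (2*pi)^n / det S,
# since det S = a * det S' at each step, by eq. (8).
print(np.isclose(gauss_recursive(S)**2, (2*np.pi)**4 / np.linalg.det(S)))
```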

--

$^1$In this answer, the notation $M>0$ means a positive-definite matrix $M$; not a matrix with positive entries.
