[Math] Average size of determinants of integer matrices

linear-algebra, matrix-theory, nt.number-theory

I am interested in estimating how large determinants of matrices tend to be 'on average' given the following model: suppose we form $n \times n$ matrices $M$ whose entries are integers, with the entries of the $i$th row bounded in absolute value by some positive parameter $k_i$. Then by expressing the determinant as a polynomial in the entries (a sum over the $n!$ permutations), we see that
$$\displaystyle |\det(M)| \leq n! (k_1 \cdots k_n).$$
However, this bound should be far larger than the mean, as on average one would expect significant cancellation to make the determinant small. Thus, to pose my question formally:
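A quick Monte Carlo sanity check illustrates both points (a sketch in Python; the choices of $n$, the $k_i$, and the trial count are arbitrary): the bound $n!\,(k_1\cdots k_n)$ always holds, yet the observed mean of $|\det(M)|$ is a small fraction of it.

```python
import numpy as np
from math import factorial, prod

rng = np.random.default_rng(0)
n, ks = 4, [2, 3, 5, 7]           # arbitrary example dimensions and row bounds
bound = factorial(n) * prod(ks)   # n! * (k_1 ... k_n) from the permutation expansion

# Sample matrices with row i uniform on {-k_i, ..., k_i} and record |det|
dets = []
for _ in range(20_000):
    M = np.vstack([rng.integers(-k, k + 1, size=n) for k in ks])
    dets.append(abs(np.linalg.det(M)))
dets = np.array(dets)

print(dets.max() <= bound + 1e-6)  # the worst-case bound is never violated
print(dets.mean() / bound)         # the mean is a tiny fraction of the bound
```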

Consider the set of $n \times n$ matrices with integer entries such that the absolute value of each entry in the $i$th row is bounded by the parameter $k_i > 0$. Let $M(k_1, \cdots, k_n)$ denote this set of matrices. What is the average of the absolute value of the determinant over the elements of $M(k_1, \cdots, k_n)$? Let $\mu(k_1, \cdots, k_n)$ denote this average. Given $\epsilon > 0$, can one estimate how many matrices $M$ in $M(k_1, \cdots, k_n)$ satisfy
$$\displaystyle (1 - \epsilon)\,\mu(k_1, \cdots, k_n) \leq |\det(M)| \leq (1 + \epsilon)\,\mu(k_1, \cdots, k_n)?$$
Thanks for any insight on the matter.

Best Answer

As noted in Will's comment above, it's easy to compute the expected square of the determinant. More precisely, if the entry in row $i$ is uniform on $\{-k_i, \ldots, k_i\}$, it has mean $0$ and variance $\frac{k_i(k_i+1)}{3}$, and since the entries are independent the cross terms in the permutation expansion vanish, giving $$E\big((\det M)^2\big)=n! \prod_{i=1}^{n} \frac{k_i (k_i+1)}{3}.$$
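This identity is easy to verify empirically. The sketch below (with $n$, the $k_i$, and the trial count chosen arbitrarily) draws each row $i$ uniformly from $\{-k_i, \ldots, k_i\}$ and compares the sample mean of $(\det M)^2$ against $n! \prod_i \frac{k_i(k_i+1)}{3}$:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
n, ks = 3, [1, 2, 3]              # arbitrary example dimensions and row bounds
trials = 200_000

# Theoretical value: n! * prod k_i(k_i+1)/3
theoretical = factorial(n) * np.prod([k * (k + 1) / 3 for k in ks])

# Build all sample matrices at once: row i uniform on {-k_i, ..., k_i}
rows = [rng.integers(-k, k + 1, size=(trials, 1, n)) for k in ks]
M = np.concatenate(rows, axis=1).astype(float)   # shape (trials, n, n)
empirical = (np.linalg.det(M) ** 2).mean()

print(empirical, theoretical)  # the two should agree to within a few percent
```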

Let $M'$ be formed from $M$ by dividing its $i$th row by $\left(\frac{k_i(k_i+1)}{3}\right)^{1/2}$. Now each entry has mean $0$ and variance $1$, and furthermore the entries are bounded. The determinants of such matrices, as the size of the matrix tends to infinity, have been well studied by Girko, Tao and Vu, and Nguyen and Vu, among others.

For example, it follows from Theorem 1.1 in the Nguyen and Vu paper cited above that $\log |\det(M')|$ is asymptotically normal with mean $\frac{1}{2} \log((n-1)!)$ and variance $\log n$. Rescaling back to $M$, we have that with probability tending to $1$ as $n$ tends to infinity, $$(\det M)^2 = n^{-1+o(1)}\, E\big((\det M)^2\big).$$

In particular, the squared determinant is almost surely concentrated in a short interval which does not contain its expectation!
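This gap between the typical value and the expectation is easy to see numerically. The sketch below (with $n$ and the trial count chosen arbitrarily) uses random $\pm 1$ matrices, where each entry has variance $1$ so $E\big((\det M)^2\big) = n!$, and compares the median of $\log(\det M)^2$ against $\log n!$; by the normality result the gap should be roughly $-\log n$:

```python
import numpy as np
from math import lgamma, log

rng = np.random.default_rng(2)
n, trials = 40, 400               # arbitrary example size and trial count

# Random sign matrices: entries +-1, mean 0, variance 1
M = rng.choice([-1.0, 1.0], size=(trials, n, n))

# slogdet avoids overflow: det^2 here is astronomically large
_, logabsdet = np.linalg.slogdet(M)
log_det_sq = 2 * logabsdet
log_expected = lgamma(n + 1)      # log E(det^2) = log n!

# The typical log(det^2) sits about log n below log E(det^2)
print(np.median(log_det_sq) - log_expected, -log(n))
```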
