[Math] operation count for computing the determinant of an $n\times n$ matrix $A$ directly is $O(n!)$

determinant, linear-algebra, matrices

I am looking for some help with the following question. I have some ideas below, but I am not quite sure how to turn them into a complete argument.

Show that the operation count for computing the determinant of an $n\times n$ matrix $A$ directly is $O(n!)$ and thus for large $n$ significantly worse than using the LU
factorization.

Denote by $d_n$ the number of operations – additions and multiplications – required to calculate an $n \times n$ determinant by way of cofactor expansion. For example, for a $2 \times 2$ matrix $A = (a_{ij})$,

$$\det A = a_{11}a_{22}-a_{12}a_{21}$$

Hence, two multiplications and one addition are necessary, and so $d_2 = 3$.

This is just a single case, but how do I prove the statement in general?
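One route I have tried (just a sketch, counting only additions and multiplications): expanding along the first row costs $n$ multiplications of entries by their cofactors and $n-1$ additions, plus $n$ determinants of size $(n-1)\times(n-1)$, which would give the recurrence

$$d_n = n\,d_{n-1} + 2n - 1, \qquad d_2 = 3,$$

and hence $d_n \ge n\,d_{n-1} \ge n(n-1)\cdots 3\cdot d_2 = \tfrac{3}{2}\,n!$. Is that the right idea?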

Best Answer

Usually you define the determinant via the Leibniz formula

$$\det(A)=\sum_{\sigma\in S_n}\bigg(\operatorname{sgn}(\sigma)\prod_{i=1}^nA_{i,\sigma(i)}\bigg)$$

So the product inside the sum

$$\prod_{i=1}^n A_{i, \sigma(i)}$$

takes exactly $n-1$ multiplications, and multiplying the result by $\operatorname{sgn}(\sigma)$ costs only one more, which is $O(1)$, so all in all

$$O\bigg(\operatorname{sgn}(\sigma)\prod_{i=1}^n A_{i, \sigma(i)}\bigg)=O(n)$$

for a fixed $\sigma$. I also assume that calculating $\operatorname{sgn}(\sigma)$ is $O(1)$. Now $S_n$ has exactly $n!$ elements, so

$$O\bigg(\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod_{i=1}^n A_{i, \sigma(i)}\bigg)=O\bigg(\sum_{\sigma\in S_n}n\bigg)=O\bigg(n\cdot\sum_{\sigma\in S_n}1\bigg)=O(n\cdot n!)$$

which finally gives $O(n\cdot n!)$, not $O(n!)$.

Although I didn't prove that it is not $O(n!)$ (see the definition of the big-O notation), it actually is not: by the argument above, $\det(A)$ computed this way takes precisely $(n+1)\cdot n!-1$ arithmetic operations (I ignore the fact that you have to generate the set of all permutations $S_n$, because it can be done in $O(n!)$). Each of the $n!$ terms costs $n-1$ multiplications for the product plus one multiplication by $\operatorname{sgn}(\sigma)$, and summing all the terms costs another $n!-1$ additions. Since $\frac{n\cdot n!}{n!}=n\to\infty$, this is not $O(n!)$; in other words, $\det(A)$ computed this way is $\Theta(n\cdot n!)$ (big-Theta notation).
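As a quick sanity check of that count, a small Python sketch (the helper names `leibniz_det` and `perm_sign` are hypothetical, and operations are counted exactly as above):

```python
from itertools import permutations
from math import factorial

def perm_sign(sigma):
    """Sign of a permutation (tuple of 0-based indices), via inversion counting.
    This bookkeeping is not included in the operation count below."""
    inversions = sum(1 for i in range(len(sigma))
                     for j in range(i + 1, len(sigma))
                     if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """Determinant straight from the Leibniz formula.
    Returns (determinant, number of multiplications and additions used)."""
    n = len(A)
    ops = 0
    total = None
    for sigma in permutations(range(n)):
        term = A[0][sigma[0]]
        for i in range(1, n):
            term *= A[i][sigma[i]]        # n - 1 multiplications per term
        ops += n - 1
        term *= perm_sign(sigma)          # one more multiplication by sgn(sigma)
        ops += 1
        if total is None:
            total = term
        else:
            total += term                 # n! - 1 additions over all terms
            ops += 1
    return total, ops

# The count matches (n + 1) * n! - 1 from the argument above:
for n in range(2, 7):
    A = [[(3 * i + 7 * j + 1) % 5 for j in range(n)] for i in range(n)]
    _, ops = leibniz_det(A)
    assert ops == (n + 1) * factorial(n) - 1
```

For comparison, LU factorization (Gaussian elimination) takes roughly $\tfrac{2}{3}n^3$ multiplications and additions, which is why it beats the direct formula so dramatically for large $n$, as the exercise hints.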

Also note a problem with the big-O notation: a constant function is also $O(n!)$. Does that mean it is slow? No. Whoever gave you the exercise most likely meant some other growth-describing notation, e.g. big-Theta.
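For reference, the standard definitions:

$$f=O(g)\iff\exists\,C>0:\ f(n)\le C\,g(n)\ \text{for all large } n,\qquad f=\Theta(g)\iff\exists\,c,C>0:\ c\,g(n)\le f(n)\le C\,g(n)\ \text{for all large } n.$$

The first only bounds $f$ from above, while the second pins down its growth; here $(n+1)\,n!-1=\Theta(n\cdot n!)$.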
