[Math] List of counting proofs instead of linear algebra method in combinatorics

Tags: big-list, co.combinatorics, linear-algebra

I've just come across this proof of the Graham-Pollak Theorem by Sundar Vishwanathan (thanks to Konrad Swanepoel's sporadic comments about it on this site), which must be called beautiful after its author. The paper mentions (after Lemma 3) that certain other linear algebra proofs can also be replaced by similar pigeonhole applications, but I couldn't find any, so I thought we should collect some similar beautiful proofs here. I don't mind if a proof also uses parity or other tricks, but the linear algebra should be replaced! For example, can anyone prove the Oddtown theorem this way?

Best Answer

What essentially happens in this proof? We repeat the usual proof, but replace the fundamental linear-algebraic theorem

any $n$ vectors in $K^{n-1}$ are linearly dependent over $K$

applied with $K=\mathbb{Q}$, by its counting proof.

Let me give the counting proofs explicitly for finite $K$ and for $K=\mathbb{Q}$. Let $v_1,\dots,v_n$ be vectors in $K^{n-1}$; we need to find a linear dependence among them.

  1. $K$ is finite, $|K|=q$. There are $q^n$ linear combinations $c_1v_1+\dots+c_nv_n$ with $c_i\in K$, but they take at most $q^{n-1}$ values in $K^{n-1}$. By the pigeonhole principle two of them are equal, and their difference is a nontrivial linear dependence, which is what we need.

  2. The same trick works for $\mathbb{Q}$; this is essentially noted by Sundar Vishwanathan. First choose a positive integer $N$ such that $u_i:=Nv_i\in \mathbb{Z}^{n-1}$ for all $i$, and let $M$ be the maximum absolute value of a coordinate of $u_1,\dots,u_{n}$. Consider all $(T+1)^n$ linear combinations $c_1u_1+\dots+c_n u_n$ with $c_i\in \{0,1,\dots,T\}$. They belong to the set $\{-nMT,-nMT+1,\dots,nMT-1,nMT\}^{n-1}$, which has $(2nMT+1)^{n-1}$ elements. So if $(2nMT+1)^{n-1}<(T+1)^n$ (true for large $T$), by the pigeonhole principle two of these linear combinations are equal, and subtracting them shows that $u_1,\dots,u_n$, hence also $v_1,\dots,v_n$, are linearly dependent with integer coefficients, as desired. A small computational sketch of this search appears after the list.
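
To make the pigeonhole search in case 2 concrete, here is a minimal Python sketch (the helper `integer_dependence` and the toy vectors are my own illustration, not part of the answer): it enumerates the combinations $c_1u_1+\dots+c_nu_n$ with $c_i\in\{0,\dots,T\}$ until two of them collide and returns their difference as a nontrivial integer dependence. Case 1 is the same search with the coefficients ranging over the finite field.

```python
from itertools import product

def integer_dependence(u, T):
    """Pigeonhole search: u is a list of n integer vectors in Z^(n-1).

    Enumerate the combinations c_1*u_1 + ... + c_n*u_n with c_i in
    {0, ..., T}.  There are (T+1)**n of them but at most
    (2*n*M*T + 1)**(n-1) possible values, so for T large enough two
    must collide; their coefficient difference is a nonzero integer
    dependence.
    """
    n = len(u)
    seen = {}  # value of the combination -> coefficients producing it
    for c in product(range(T + 1), repeat=n):
        value = tuple(sum(ci * vi[j] for ci, vi in zip(c, u))
                      for j in range(n - 1))
        if value in seen:
            d = seen[value]
            return tuple(ci - di for ci, di in zip(c, d))  # not all zero
        seen[value] = c
    return None  # cannot happen once the counting inequality holds


# Toy example: any 3 integer vectors in Z^2 are dependent.
u = [(1, 1), (1, -1), (1, 0)]
n, M = len(u), max(abs(x) for v in u for x in v)
T = 1
while (2 * n * M * T + 1) ** (n - 1) >= (T + 1) ** n:
    T += 1  # smallest T for which the counting bound guarantees a collision
coeffs = integer_dependence(u, T)
print(T, coeffs)  # here: 34 (1, 1, -2)
# check that the combination really vanishes
assert all(sum(c * v[j] for c, v in zip(coeffs, u)) == 0 for j in range(n - 1))
```

In practice the collision appears long before all $(T+1)^n$ combinations are inspected, which is why the brute-force search above terminates quickly.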

Applications of linear algebra in combinatorics often use this fundamental theorem either over finite fields, as in the Oddtown theorem (where the same counting argument works even more easily), or over $\mathbb{Q}$. I think we really need $\mathbb{R}$ or $\mathbb{C}$ only when we come to eigenvalues.

For example, the counting proof of Oddtown goes as follows:

Let $\mathcal{F}$ be a collection of subsets of $\{1,\dots,n\}$ with $|{\mathcal F}|\geqslant n+1$. Consider the $2^{|\mathcal F|}\geqslant 2^{n+1}$ subcollections of ${\mathcal F}$. For each subcollection $A$ consider its symmetric difference, that is, the set of $x\in \{1,\dots,n\}$ which belong to an odd number of sets from $A$. These symmetric differences take at most $2^n$ values, so by the pigeonhole principle two of them coincide, say those of $A$ and $B$. Hence the symmetric difference of the collection $C:=A\Delta B$ (nonempty, since $A\ne B$) is the empty set, i.e. each $x$ belongs to an even number of sets from $C$. Now let $C=\{U_1,\dots,U_k\}$ and use double counting: $$ \sum_{i=2}^k |U_i\cap U_1|=\sum_{x\in U_1}\sum_{i=2}^k \chi_{U_i}(x)\equiv |U_1| \pmod 2, $$ where the congruence holds because each $x\in U_1$ belongs to an even number of sets from $C$, so the inner sum is odd. Hence if $|U_1|$ is odd, at least one intersection $U_i\cap U_1$ also has odd size.

It is the usual proof, with the counting argument for linear dependence incorporated; a small computational sketch is given below.
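
Purely as an illustration, here is a small Python sketch of this counting argument (the function names and the toy family are mine, not from the answer): the pigeonhole step finds a nonempty subcollection $C$ in which every element has even degree, and the double count then produces a pair of sets with odd-sized intersection.

```python
from itertools import chain, combinations

def odd_elements(subcollection):
    """The 'symmetric difference' of a subcollection: the elements that
    belong to an odd number of its sets."""
    odd = set()
    for s in subcollection:
        odd ^= s
    return frozenset(odd)

def even_degree_subcollection(family):
    """Pigeonhole step: the symmetric differences of subcollections of
    `family` take at most 2^n values, so once `family` has at least n+1
    sets two subcollections A, B share the same symmetric difference;
    C = A Δ B is then nonempty and every element lies in an even number
    of its sets."""
    members = list(family)
    seen = {}  # symmetric difference -> index tuple of a subcollection
    indices = range(len(members))
    for A in chain.from_iterable(combinations(indices, r)
                                 for r in range(len(members) + 1)):
        key = odd_elements(members[i] for i in A)
        if key in seen:
            B = seen[key]
            C_idx = sorted(set(A) ^ set(B))  # A Δ B as index sets, nonempty
            return [members[i] for i in C_idx]
        seen[key] = A
    return None  # impossible once |family| >= n + 1


# Toy example with n = 3: four odd-sized subsets of {1,2,3} violate Oddtown.
family = [{1}, {2}, {3}, {1, 2, 3}]
C = even_degree_subcollection(family)
print(C)
# every element lies in an even number of sets of C ...
assert not odd_elements(C)
# ... so, since |U_1| is odd, the double count forces some odd intersection
U1, rest = C[0], C[1:]
assert any(len(U1 & U) % 2 == 1 for U in rest)
```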