What is known about compressing graphs? Here, by "compressing" I mean something like "putting a graph into a zip program"; in more technical terms, what is known about the Kolmogorov complexity of a graph? Does it make sense to define something along these lines at all? I guess one needs a binary string first of all. Is there a way to get a binary string from a graph in some unequivocal way, without having to deal with labeling issues, permutations, etc.?
[Math] Compressing Graphs (Kolmogorov complexity of graphs)
co.combinatorics, computational-complexity, graph-theory, it.information-theory
Related Solutions
I don't know the answer, just some thoughts.
Suppose you have $n$ vertices of the first type and $l$ of the second. Of course, you know the answer if there is no restriction on the number of edges (you want it to be odd). Therefore I can equivalently count the cases where the number of edges is odd with coefficient $-1$ and the cases where it is even with coefficient $+1$ (I denote this signed count by $N$). Let $y_1,\dots,y_n,z_1,\dots,z_l$ be elements of $\mathbb{Z}\_2=\mathbb{Z}/(2\mathbb{Z})$, corresponding to the vertices of the graph. Edges are given by an $n\times l$ matrix $A$. Then $N$ is the coefficient of $h^m$ in $S(h,h)$, where $S$ is defined by
$$S(h,g)=\sum_{y\in\mathbb{Z}\_2^n,\;z\in\mathbb{Z}\_2^l} h^{|y|} g^{|z|} (-1)^{\sum_{i,j}y_i A_{ij} z_j}.\tag{1}$$
Here $|y|$ is the number of components of $y\in \mathbb{Z}\_2^n$ that are equal to $1$. One can get rid of the sum over $y\in\mathbb{Z}\_2^n$ in the following way:
$$S(h,g)=\sum_{z\in\mathbb{Z}\_2^l} g^{|z|}\prod_{i=1}^{n} (1+h(-1)^{\sum_j A_{ij} z_j}).\tag{2}$$
This, for example, solves the problem in two cases:
1. we fix the number of vertices only in one part of the bipartite graph, and the map $\mathbb{Z}_2^l\to\mathbb{Z}_2^n\colon z\mapsto x$ with $x_i=\sum_j A_{ij} z_j$ is surjective (in this case we can omit $g^{|z|}$ in $(2)$, make the described change of coordinates, and apply again the trick we used to get $(2)$ from $(1)$);
2. the number of vertices in one part of the bipartite graph is small enough (in this case the sum $(2)$ has a small enough number of terms, and each of them can be computed in polynomial time).
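As a sanity check of formula $(2)$, here is a small sketch (the function name `S_coeffs` is mine) that expands $S(h,h)$ for a given edge matrix $A$; the signed count $N$ is then the coefficient of $h^m$. Since $g=h$, each $z$ contributes $(1+h)^{n-k}(1-h)^k h^{|z|}$, where $k$ is the number of rows $i$ with $\sum_j A_{ij}z_j$ odd:

```python
from itertools import product

def S_coeffs(A, n, l):
    """Coefficients of S(h, h) from formula (2); index p holds the
    coefficient of h^p.  A is the n-by-l 0/1 edge matrix."""
    coeffs = [0] * (n + l + 1)
    for z in product([0, 1], repeat=l):
        # k = number of rows i with sum_j A[i][j] * z[j] odd
        k = sum(sum(A[i][j] * z[j] for j in range(l)) % 2 for i in range(n))
        # the product over i collapses to (1 + h)^(n - k) * (1 - h)^k
        poly = [1]
        for _ in range(n - k):
            poly = [a + b for a, b in zip(poly + [0], [0] + poly)]
        for _ in range(k):
            poly = [a - b for a, b in zip(poly + [0], [0] + poly)]
        # the factor g^{|z|} becomes h^{|z|}: shift every power by sum(z)
        for p, c in enumerate(poly):
            coeffs[p + sum(z)] += c
    return coeffs
```

The sum over $z$ has $2^l$ terms, so this is polynomial time exactly when $l$ is small, as in case 2 above.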
Hi Scott,
are you asking whether CSoph(x) may be much more than NSoph_c(x) for every constant c? This question looks strange, since NSoph_c(x) increases as c decreases. Thus the question reduces to the case c = 0 (or maybe you allow negative values?). Which question precisely did you have in mind? Note also that Kolmogorov complexity K(x) itself is defined only up to an additive constant (BTW, I assume that you mean the plain complexity and not the prefix one). Thus you should specify the quantifier over K(x).
One reading of your question is the following: is it true that for all c, all K, and all d there is an x with CSoph(x) > NSoph_c(x) + d? Another reading is the following: is it true that for all c there is a K such that for all d there is an x with CSoph(x) > NSoph_c(x) + d?
Kolia.
P.S. Maybe you will also be interested in the following result from our joint paper with Paul Vitányi from FOCS 2002: if we allow logarithmic changes of c, then soph and NSoph coincide (with logarithmic precision): soph_{c+O(\log K(x))}(x) < NSoph_c(x) + O(\log K(x)).
Best Answer
Li and Vitányi, in their standard textbook on Kolmogorov complexity (3rd edition, p. 456), address exactly this question.
This is made more precise in Section 6.4. In particular, they show that the number of distinct isomorphism classes of undirected graphs with $n$ vertices asymptotically approaches $2^{\binom{n}{2}}/n!$, with only a small error.
By Stirling's approximation, $\log_2\big(2^{\binom{n}{2}}/n!\big)=\binom{n}{2}-\log_2 n!$, and $\log_2 n! \approx n\log_2 n$ is of lower order, so the Kolmogorov complexity of almost all undirected graphs with $n$ vertices can be seen to be close to $n(n-1)/2 + c$ bits. For undirected graphs the adjacency matrix is symmetric and has 0's on the diagonal, so one only needs to store the $n(n-1)/2$ bits above the diagonal.
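As an illustration, here is a minimal sketch (function names are mine) of this canonical $n(n-1)/2$-bit encoding of a labeled undirected graph, reading off and restoring exactly the bits above the diagonal:

```python
def encode(adj):
    # adj: symmetric 0/1 adjacency matrix with a zero diagonal;
    # keep only the n(n-1)/2 entries strictly above the diagonal
    n = len(adj)
    return [adj[i][j] for i in range(n) for j in range(i + 1, n)]

def decode(bits, n):
    # invert encode(): rebuild the symmetric matrix from the
    # n(n-1)/2 bits above the diagonal
    adj = [[0] * n for _ in range(n)]
    it = iter(bits)
    for i in range(n):
        for j in range(i + 1, n):
            adj[i][j] = adj[j][i] = next(it)
    return adj
```

This is optimal for labeled graphs; the $n!$ factor in the counting argument above is what one could hope to save, at most, by passing to isomorphism classes, and that saving is only of order $n\log_2 n$ bits.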
This makes more precise the claim made in another answer, that the adjacency matrix representation is optimal. (Note that close in this argument may still be an unbounded function of $n$; see also Lemma 6.4.6 and the comment after it.)