[Math] Algorithms on graphs of bounded degeneracy/arboricity

algorithms · co.combinatorics · graph theory

I know that many graph problems can be solved very quickly on graphs of bounded degeneracy or arboricity. (It doesn't matter which one is bounded, since they're at most a factor of 2 apart.)

From Wikipedia's article on the clique problem I learnt that finding cliques of any constant size k takes linear time on graphs of bounded arboricity. That's pretty cool.

I wanted to know more examples of algorithms where the bounded arboricity condition helps. This might even be well-studied enough to have a survey article written on it. Unfortunately, I couldn't find much about my question. Can someone give me examples of such algorithms and references? Are there some commonly used algorithmic techniques that exploit this promise? How can I learn more about these results and the tools they use?

Best Answer

Bounded degeneracy or arboricity just means that the graph is sparse: the number of edges is at most linear in the number of vertices, in every subgraph.

Some ideas that have been used for fast algorithms on these graphs:

  • Order the vertices so that each vertex has at most d neighbors that come later in the ordering, where d is the degeneracy. Then if one can similarly order the structure one is looking for, there are not too many different choices to try. For instance (though this is not the method of the Chiba & Nishizeki paper that you indirectly refer to) one can find all cliques by trying all subsets of the later neighbors of each vertex, at most 2^d subsets per vertex. This idea also works to color these graphs with at most d+1 colors: just choose colors for vertices one at a time in the opposite of the above ordering. See e.g. Matula and Beck, JACM 1983.

  • Find a low degree vertex, do something to it to reduce the size of the graph while preserving its overall sparsity, and continue. This is how one finds an ordering as above (repeatedly remove the smallest degree vertex) and is also how many planar coloring algorithms work.

  • Find a big independent set (or a big independent set of bounded-degree vertices), do something on it, and repeat on the remaining smaller graph. This often leads to linear time algorithms because every graph of bounded degeneracy has an independent set of Ω(n) vertices, so the size of the graph goes down by a constant factor at each repetition and the total time can be bounded by a geometric series. This is a variation of the "low degree vertex" idea that works better in the parallel algorithms setting.

  • Observe that there can only be very few vertices with high degree (O(dk) vertices with degree greater than n/k) or else they would have too many edges. So if you are looking for a structure that needs high degree vertices you don't have many choices to try. See e.g. Alon and Gutner, Algorithmica 2009.
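The first two bullets can be combined into a short sketch. This is a minimal Python illustration (function names are my own, not from any of the cited papers): compute a degeneracy ordering by repeatedly removing a minimum-degree vertex, then color in the reverse of that ordering, which uses at most d+1 colors because each vertex has at most d already-colored neighbors when its turn comes.

```python
from collections import defaultdict

def degeneracy_ordering(adj):
    """Repeatedly remove a minimum-degree vertex (bucket queue, O(n + m)).

    Returns (order, d) where d is the degeneracy: every vertex has at
    most d neighbors appearing *later* in `order`.
    """
    deg = {v: len(adj[v]) for v in adj}
    buckets = defaultdict(set)
    for v, dv in deg.items():
        buckets[dv].add(v)
    removed, order, d = set(), [], 0
    for _ in range(len(adj)):
        k = min(b for b in buckets if buckets[b])  # current minimum degree
        d = max(d, k)                              # degeneracy = max of these minima
        v = buckets[k].pop()
        removed.add(v)
        order.append(v)
        for u in adj[v]:                           # update remaining neighbors
            if u not in removed:
                buckets[deg[u]].discard(u)
                deg[u] -= 1
                buckets[deg[u]].add(u)
    return order, d

def greedy_color(adj, order):
    """Greedy coloring in the *reverse* of a degeneracy ordering.

    When v is colored, only its <= d later neighbors are colored already,
    so at most d+1 colors (numbered 0..d) are ever needed.
    """
    color = {}
    for v in reversed(order):
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color
```

For example, on a 5-cycle (degeneracy 2) this produces a proper coloring with at most 3 colors.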
