[Math] Comparison of the simplex and shortest path methods

graph theory, linear programming

In mathematical optimization, Dantzig's simplex algorithm (or simplex method) is a popular algorithm for linear programming. The journal Computing in Science and Engineering listed it as one of the top 10 algorithms of the twentieth century.
The name of the algorithm is derived from the concept of a simplex and was suggested by T. S. Motzkin. Simplices are not actually used in the method, but one interpretation of it is that it operates on simplicial cones, and these become proper simplices with an additional constraint. The simplicial cones in question are the corners (i.e., the neighborhoods of the vertices) of a geometric object called a polytope. The shape of this polytope is defined by the constraints applied to the objective function.
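For reference, the kind of problem the simplex method solves can be written in a standard form (with problem data $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $c \in \mathbb{R}^n$):
$$\max_{x \in \mathbb{R}^n} \; c^{\mathsf{T}} x \quad \text{subject to} \quad A x \le b, \quad x \ge 0.$$
The feasible set $\{x : Ax \le b,\ x \ge 0\}$ is the polytope mentioned above, and the simplex method moves from vertex to vertex of it while improving the objective.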

In graph theory, the shortest path problem is the problem of finding a path between two vertices (or nodes) in a graph such that the sum of the weights of its constituent edges is minimized.
This is analogous to the problem of finding the shortest path between two intersections on a road map: the graph's vertices correspond to intersections and the edges correspond to road segments, each weighted by the length of its road segment.
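Formally, given a weighted graph $G = (V, E)$ with edge weights $w_e$ and two vertices $s, t \in V$, the problem asks for
$$\min_{P \text{ a path from } s \text{ to } t} \; \sum_{e \in P} w_e.$$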

My question is: can I use one as an alternative to the other and, if so, which one is better and when?

Best Answer

The mathematical objects that these two algorithms work on are very different: one is a set of linear constraints (linear inequalities together with a linear objective function), the other a directed graph with weighted edges.

But both are used to solve optimization problems, and there are all sorts of instances in mathematics where seemingly different things turn out to be equivalent.

It turns out that one can convert the graph (of the shortest path problem) into a linear optimization problem (I won't go into all the details here; a perfectly good and succinct explanation is on Wikipedia).
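In outline, the standard encoding sends one unit of flow from the source $s$ to the target $t$: one variable $x_{uv} \ge 0$ per edge, flow conservation at every vertex, and the total weight as the objective:
$$\begin{aligned}
\min_{x \ge 0}\quad & \sum_{(u,v)\in E} w_{uv}\,x_{uv}\\
\text{subject to}\quad & \sum_{v:(u,v)\in E} x_{uv} \;-\; \sum_{v:(v,u)\in E} x_{vu} \;=\;
\begin{cases} 1 & \text{if } u = s,\\ -1 & \text{if } u = t,\\ 0 & \text{otherwise,}\end{cases}
\qquad \text{for every } u \in V.
\end{aligned}$$
The constraint matrix is totally unimodular, so an optimal basic solution is integral, and the edges with $x_{uv} = 1$ trace out a shortest $s$–$t$ path.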

Now the issue is the algorithms. An algorithm is not a problem; it is a way of solving a problem. Are the two *algorithms* somehow similar? You could explore how the simplex algorithm acts on the transformed problem (graph to linear constraints) to see how that corresponds to Dijkstra's algorithm (you didn't specify a shortest path method, but that is one of them; there are others, like Bellman-Ford). You might get something out of that, but I don't see much there other than the transformation itself.

So you can solve shortest path problems via linear optimization, but not the other way around: not every linear optimization problem can be converted to a shortest path problem.
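To make that direction concrete, here is a minimal sketch (assuming NumPy and SciPy are available) that builds the flow LP above for a tiny made-up graph and hands it to scipy.optimize.linprog; the HiGHS backend is used as a stand-in for Dantzig's simplex method, not as a faithful reimplementation of it.

```python
# Sketch: shortest path on a tiny directed graph, solved as a linear program.
import numpy as np
from scipy.optimize import linprog

# Example graph: edge list (u, v, weight), vertices 0..3, source s = 0, target t = 3.
edges = [(0, 1, 1.0), (0, 2, 4.0), (1, 2, 1.0), (1, 3, 5.0), (2, 3, 1.0)]
n, s, t = 4, 0, 3

c = np.array([w for _, _, w in edges])    # objective: total weight of chosen edges
A_eq = np.zeros((n, len(edges)))          # flow-conservation constraints, one row per vertex
for j, (u, v, _) in enumerate(edges):
    A_eq[u, j] += 1.0                     # edge j leaves u
    A_eq[v, j] -= 1.0                     # edge j enters v
b_eq = np.zeros(n)
b_eq[s], b_eq[t] = 1.0, -1.0              # push one unit of flow from s to t

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
path_edges = [edges[j][:2] for j in range(len(edges)) if res.x[j] > 0.5]
print(res.fun, path_edges)                # 3.0, [(0, 1), (1, 2), (2, 3)]
```

On this graph the LP recovers the path $0 \to 1 \to 2 \to 3$ with total weight 3.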

Now to the second part of your question: which one is better, and when? "Which one is better" now really means: for a shortest path problem, do you run a dedicated algorithm directly on the graph, say (I will choose) Dijkstra's, or do you transform the graph and solve the result with Dantzig's simplex algorithm?

A good implementation of Dijkstra's runs in $O(E + V \log V)$ (where $E$ is the number of edges and $V$ is the number of vertices).
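For comparison, here is a minimal Dijkstra sketch using Python's heapq (a binary heap, which gives the slightly weaker $O((E+V)\log V)$ bound; the $O(E + V \log V)$ figure needs a Fibonacci heap).

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest path distances for non-negative edge weights.
    adj maps each vertex to a list of (neighbor, weight) pairs."""
    dist = {source: 0.0}
    pq = [(0.0, source)]                       # (distance so far, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):      # stale queue entry, skip it
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Same tiny graph as in the LP sketch above.
adj = {0: [(1, 1.0), (2, 4.0)], 1: [(2, 1.0), (3, 5.0)], 2: [(3, 1.0)]}
print(dijkstra(adj, 0)[3])                     # 3.0, matching the LP result
```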

Off the top of my head, Dantzig's on the converted graph is going to be at least $O(V^2)$, simply because of the manipulation of the matrix of constraints.

Also, if you had to implement either from scratch, Dijkstra's feels easier to implement than Dantzig's.

So there you go, final answer: use Dijkstra's (or another algorithm designed specifically for shortest paths in graphs) rather than a much more general linear optimization algorithm.
