[Math] Solving PDE via Cellular Automata

Tags: ap.analysis-of-pdes, cellular-automata, reference-request

Is there a theory for solving PDEs using cellular automata? Something along the lines of passing to the limit in scale, i.e., as the number of grid points increases, the solution of the cellular automaton converges to the solution of the PDE? If so, how successful is this approach, and what are its limitations? Also, given a PDE, how does one go about finding the rules for the corresponding cellular automaton, and vice versa?

Best Answer

There are a number of PDEs that have been fruitfully attacked using cellular automata. First among these are various incarnations of the Navier-Stokes equations, which can be (and, in geophysical and other complex-flow applications, frequently are) simulated with lattice gases and lattice Boltzmann methods. Other PDEs that CAs can handle include diffusion and reaction-diffusion equations and wave equations. A more widely known example is the random walk treated as a random CA, which can be used to tackle the heat equation.
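
To make the random-walk idea concrete, here is a minimal sketch (my own illustration, not part of the original answer) of a probabilistic CA whose coarse-grained density approaches a solution of the heat equation as the number of walkers grows and the grid is refined. The grid size, particle count, and the lazy-walk rule are all illustrative choices.

```python
import numpy as np

# Random-walk CA on a 1-D periodic lattice (illustrative sketch).  Each walker
# hops left or right with probability 1/4 each and stays put with probability
# 1/2 (a "lazy" walk, which avoids the even/odd parity artifact of a pure +-1
# walk).  The coarse-grained density then obeys u_t = D u_xx with
# D = dx**2 / (4*dt) in the limit of many particles and fine grids.

rng = np.random.default_rng(0)

n_cells = 200          # lattice sites (refining this is the "passage to the limit")
n_particles = 50_000   # walkers; more walkers -> less statistical noise
n_steps = 400          # CA updates

# Start all walkers at the centre cell: a discrete approximation of a delta function.
pos = np.full(n_particles, n_cells // 2)

for _ in range(n_steps):
    # The CA "rule": each walker independently moves -1, 0, or +1 cells.
    pos = (pos + rng.choice((-1, 0, 1), size=n_particles, p=(0.25, 0.5, 0.25))) % n_cells

# Empirical per-cell density after n_steps updates.
density = np.bincount(pos, minlength=n_cells) / n_particles

# Compare with the periodic heat kernel at the corresponding diffusion time.
dx, dt = 1.0, 1.0
D = dx**2 / (4 * dt)
t = n_steps * dt
x = np.arange(n_cells) * dx
x0 = (n_cells // 2) * dx
# Sum a few periodic images of the Gaussian heat kernel.
kernel = sum(np.exp(-(x - x0 + k * n_cells * dx) ** 2 / (4 * D * t))
             for k in (-1, 0, 1)) / np.sqrt(4 * np.pi * D * t)
kernel *= dx  # convert to per-cell mass so it is comparable with `density`

print("max |CA density - heat kernel| =", np.abs(density - kernel).max())
```

Refining the grid (more cells, more steps, with dx and dt rescaled so that dx²/dt stays fixed) and adding walkers is exactly the passage to the limit the question asks about.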

Will Jagy's guess about hexagons anticipates (if we think about triangles instead) the improvement that so-called FHP models offer over HPP models; in higher dimensions the lattice issues get trickier. I have some unpublished and cryptic notes about a possible new approach in 3D using the root lattice $A_4$ and permutohedral boundary conditions.
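
For readers who have not seen the HPP model, here is a bare-bones sketch of its collision-and-streaming update on a periodic square lattice (my own illustration; the grid size and initial density are arbitrary). FHP replaces the square lattice with a triangular one and uses richer collision rules, which is what restores macroscopic isotropy.

```python
import numpy as np

# HPP lattice gas (illustrative sketch): four boolean occupation channels per
# cell of a periodic L x L square grid, one channel per direction of motion.

L = 64
rng = np.random.default_rng(1)
# channels: index 0 = east, 1 = west, 2 = north, 3 = south movers
channels = rng.random((4, L, L)) < 0.2

def hpp_step(ch):
    e, w, n, s = ch
    # Collision rule: an exact head-on east-west pair (with north/south empty)
    # scatters into a north-south pair, and vice versa; every other local
    # configuration passes through unchanged.  This conserves particle number
    # and momentum.
    ew = e & w & ~n & ~s
    ns = n & s & ~e & ~w
    e2 = (e & ~ew) | ns
    w2 = (w & ~ew) | ns
    n2 = (n & ~ns) | ew
    s2 = (s & ~ns) | ew
    # Streaming: each particle hops to the neighbouring cell in its direction
    # (periodic boundaries via np.roll; axis 1 is x, axis 0 is y).
    return np.stack([
        np.roll(e2,  1, axis=1),
        np.roll(w2, -1, axis=1),
        np.roll(n2, -1, axis=0),
        np.roll(s2,  1, axis=0),
    ])

for _ in range(100):
    channels = hpp_step(channels)

# Particle number is exactly conserved by both collision and streaming.
print("particles:", int(channels.sum()))
```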

One of the main advantages of the CA approach is the ability to work with complicated boundary shapes, although boundary conditions are in general a rather delicate issue.
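
As a rough sketch of why irregular boundaries are easy to handle cell by cell (again my own illustration, not from the answer), here is an explicit diffusion update on a grid with an arbitrary boolean obstacle mask; fluxes toward solid cells are simply reflected back, a crude no-flux rule, so any pixelated shape can serve as the boundary without remeshing.

```python
import numpy as np

L = 100
u = np.zeros((L, L))
u[L // 2, L // 2] = 1.0            # initial heat pulse in the centre

# Arbitrary obstacle: a disc of solid cells; any mask shape works the same way.
yy, xx = np.mgrid[0:L, 0:L]
solid = (xx - 70) ** 2 + (yy - 50) ** 2 < 15 ** 2

def diffuse(u, solid, alpha=0.2):
    # Where a neighbour is solid, use the cell's own value instead, so no
    # heat flows into the obstacle (a simple reflecting condition).
    def nb(shift, axis):
        v = np.roll(u, shift, axis)
        m = np.roll(solid, shift, axis)
        return np.where(m, u, v)
    lap = nb(1, 0) + nb(-1, 0) + nb(1, 1) + nb(-1, 1) - 4 * u
    u_new = u + alpha * lap
    u_new[solid] = 0.0             # solid cells carry no field
    return u_new

for _ in range(500):
    u = diffuse(u, solid)

# Total heat stays ~1.0: the reflecting rule blocks any flux into the obstacle,
# and the outer boundary is periodic via np.roll.
print("total heat:", u.sum())
```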

A very nice (though somewhat dated) reference for this and related problems is Chopard and Droz, Cellular Automata Modeling of Physical Systems (a PDF of the opening parts is here), and IIRC you can find a paper by one of the authors online that covers some of the same topics in a similar spirit.
