[Math] Essential differences between Runge-Kutta and Adams-Bashforth

Tags: numerical-methods, numerical-calculus, ordinary-differential-equations, soft-question

I do not quite understand the essential difference between these
methods. I understand that there are differences (which I wrote at the
bottom of this text), but what does it all mean in the end? In other
words, if the orders are the same, why should students learn these two
methods?

Let us keep the order of accuracy at 2 for both of these methods. Just to be clear, I will write out these well-known formulas.

Though there are many, let us take a look at Heun's Runge-Kutta method of order 2 (RK2):
$$y_{k+1} = y_k + \frac{h}{2}[f(t_k, y_k) + f(t_k + h, y_k + hf(t_k,y_k))]$$

and compare it with the second-order Adams-Bashforth method (AB2):

$$y_{k+2} = y_{k+1}+\frac{3h}{2}f(t_{k+1}, y_{k+1})-\frac{h}{2}f(t_k,y_k)$$

Here's my understanding of their differences:

  1. They look different.
  2. RK2 is derived from a Taylor series approximation, while AB2 comes from polynomial interpolation of the integrand (i.e. a quadrature rule).
  3. AB2 is unique, while there are infinitely many formulas for RK2 (though only one of them has the lowest local error).
  4. RK2 is single step, AB2 is two-step.
  5. Both have their respective implicit versions of all orders, which can be used to perform PECE (a concrete predict-evaluate-correct-evaluate cycle is written out right after this list).
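For concreteness, by PECE I mean a cycle like the following, using AB2 as predictor and the trapezoidal (Adams-Moulton 2) rule as corrector, with $f_j = f(t_j, y_j)$:

$$\begin{aligned} \text{P:}\quad & y^{*}_{k+2} = y_{k+1} + \tfrac{h}{2}\bigl(3f_{k+1} - f_{k}\bigr),\\ \text{E:}\quad & f^{*}_{k+2} = f(t_{k+2},\, y^{*}_{k+2}),\\ \text{C:}\quad & y_{k+2} = y_{k+1} + \tfrac{h}{2}\bigl(f_{k+1} + f^{*}_{k+2}\bigr),\\ \text{E:}\quad & f_{k+2} = f(t_{k+2},\, y_{k+2}). \end{aligned}$$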

I see all these differences, yet I still do not understand when one should prefer one over the other. I feel like I have the details, but I am missing the big picture.

Best Answer

  1. Yes, one-step and multi-step methods look different.

  2. The Adams-Bashforth methods can also be derived via Taylor expansions. Comparing the Taylor expansions $$ y(x+h)=y(x)+y'(x)h+\tfrac12y''(x)h^2+O(h^3)\\ \text{ and }~y'(x-h)=y'(x)-y''(x)h+O(h^2), $$ the second derivative can be eliminated to give $$ y(x+h)=y(x)+y'(x)h+\tfrac12(y'(x)-y'(x-h))h+O(h^3)\\ =y(x)+\tfrac32y'(x)h-\tfrac12y'(x-h)h+O(h^3), $$ which is the AB2 formula (a small symbolic check of this expansion is sketched after this list).

  3. You get equally many degrees of freedom in multi-step methods if you include more previous $y$ values; in the explicit case $y_{k+1}+\sum_{j=0}^{s-1}\alpha_jy_{k-j}=h\sum_{j=0}^{s-1}\beta_jf_{k-j}$. The difference is that with $s$ steps back you can get a method of order $s$, which is not true for $s$-stage explicit Runge-Kutta methods when $s>4$.

  4. Yes, see point 1.

  5. The relation between explicit and implicit methods is not as close in the Runge-Kutta case as in the multistep case. The 2-stage Gauß implicit method has order 4, but there is no closely related explicit method, and certainly none of order 4.
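As a quick check of the expansion in point 2, one can compare the AB2 update with a local cubic Taylor model of the exact solution and read off the leading error term. The following sketch uses sympy; the symbols c0, …, c3 are my own stand-ins for $y(x)$, $y'(x)$, $y''(x)$, $y'''(x)$:

import sympy as sp

h, s = sp.symbols('h s')
c0, c1, c2, c3 = sp.symbols('c0 c1 c2 c3')    # stand-ins for y(x), y'(x), y''(x), y'''(x)

# cubic Taylor model of the exact solution around t = x, with s = t - x
y  = c0 + c1*s + c2*s**2/2 + c3*s**3/6
yp = sp.diff(y, s)                            # y'(x + s)

# AB2 prediction of y(x+h) from the exact values y(x), y'(x), y'(x-h)
ab2 = y.subs(s, 0) + sp.Rational(3, 2)*h*yp.subs(s, 0) - sp.Rational(1, 2)*h*yp.subs(s, -h)

# local truncation error: exact value at x+h minus the AB2 prediction
print(sp.expand(y.subs(s, h) - ab2))          # prints 5*c3*h**3/12

The leading error $\tfrac{5}{12}h^3y'''(x)$ confirms the $O(h^3)$ local error and hence order 2 of AB2.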

Numerical test

For a practical demonstration, take the standard example of the Lorenz system with its chaotic attractor and select an initial value close to the attractor:

import numpy as np

# Lorenz system with the classical parameter values
sigma = 10.
beta = 8./3.
rho = 28.

def Lorenz(t, u):
    x, y, z = u
    return np.array([sigma*(y-x), x*(rho-z)-y, x*y-beta*z])

u0 = [-4, -4, 21]  # alternatively [-7, -13, 11]

Use scipy.integrate.odeint to compute a reference solution with sufficiently tight tolerances, and apply the Heun and AB2 methods for different step sizes, pairing AB2 with Heun at twice the step size so that both have the same "density" of ODE function evaluations. The first plot below compares the $x$ components for relatively large step sizes, stopping shortly before all the graphs diverge completely.
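A minimal sketch of such an experiment could look as follows; the time span, grid size and tolerances are my own choices rather than the answer's, and Lorenz and u0 are the definitions from above:

import numpy as np
from scipy.integrate import odeint

T, N = 20.0, 40000                        # assumed time span and number of steps
ts = np.linspace(0.0, T, N + 1)
# reference solution with tight tolerances; tfirst=True matches Lorenz(t, u)
ref = odeint(Lorenz, u0, ts, tfirst=True, rtol=1e-12, atol=1e-12)

def heun(f, u0, ts):
    # explicit trapezoidal (Heun / RK2) method on the grid ts
    u = np.asarray(u0, dtype=float)
    out = [u]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0
        k1 = f(t0, u)
        k2 = f(t1, u + h*k1)
        u = u + 0.5*h*(k1 + k2)
        out.append(u)
    return np.array(out)

def ab2(f, u0, ts):
    # two-step Adams-Bashforth on the equidistant grid ts, started with one Heun step
    h = ts[1] - ts[0]
    u_old = np.asarray(u0, dtype=float)
    f_old = f(ts[0], u_old)
    u = u_old + 0.5*h*(f_old + f(ts[1], u_old + h*f_old))
    out = [u_old, u]
    for t in ts[1:-1]:
        f_new = f(t, u)
        u = u + h*(1.5*f_new - 0.5*f_old)
        f_old = f_new
        out.append(u)
    return np.array(out)

# pair AB2 with step h against Heun with step 2h: same number of f evaluations per time unit
ab2_sol  = ab2(Lorenz, u0, ts)
heun_sol = heun(Lorenz, u0, ts[::2])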

[Plots: solutions and error profiles]

The second and third plots show the error profiles, that is, the difference to the reference solution divided by the square of the step size. Overall, the solutions diverge from the reference solution at similar points for similar step sizes. However, the error profiles show that the error coefficient of Adams-Bashforth is almost 4 times larger than that of the Heun method, so that the errors are similar for solutions with the same number of ODE function evaluations.
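In terms of the arrays from the sketch above, the error profiles for the $x$ component would be computed and plotted roughly like this (again only a sketch):

import matplotlib.pyplot as plt

h = ts[1] - ts[0]
# difference to the reference solution, scaled by the square of the respective step size
err_ab2  = (ab2_sol[:, 0] - ref[:, 0]) / h**2
err_heun = (heun_sol[:, 0] - ref[::2, 0]) / (2*h)**2

plt.plot(ts, err_ab2, label='AB2, error / h^2')
plt.plot(ts[::2], err_heun, label='Heun, error / (2h)^2')
plt.legend()
plt.show()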