[Math] Newton’s method for solving implicit Runge-Kutta methods

numerical-methods, ordinary-differential-equations

I'm having trouble solving an initial value problem with an implicit $s$-stage Runge-Kutta method combined with Newton's method. More precisely, I don't know how to apply Newton's method in this setting. The initial value problem is standard:

\begin{equation*}
y'(t) = f(t,y(t)), \quad y(t_0) = 0 =: y_0 \in \mathbb R^n, \quad f: \mathbb R \times \mathbb R^n \to \mathbb R^n.
\end{equation*}
I'm also given $f'$ (the Jacobian of $f$), a discrete time grid $(t_0, \ldots, t_N)$, and the Butcher tableau of the Runge-Kutta method:
\begin{equation*}
\begin{array}{c|c}
c&A \\ \hline
&b
\end{array}, \quad c \in \mathbb R^s, A \in \mathbb R^{s,s}, b \in \mathbb R^{1,s}.
\end{equation*}
Now I have to calculate the $k_i$ so that I can form $y_1 = y_0 + h\sum_{i=1}^s b_ik_i$. Since $A$ isn't necessarily strictly lower triangular, or even triangular at all, I have to solve a system of (generally nonlinear) equations for the $k_i$. The first such system reads as follows:
\begin{align*}
k_1 &= f\left(t_0+c_1h,\;y_0+h\sum_{i=1}^s a_{1i}k_i\right)\\
k_2 &= f\left(t_0+c_2h,\;y_0+h\sum_{i=1}^s a_{2i}k_i\right)\\
&\,\,\vdots\\
k_s &= f\left(t_0+c_sh,\;y_0+h\sum_{i=1}^s a_{si}k_i\right),
\end{align*}
with $h := t_1 - t_0$. To solve this system I have to use Newton's method, and that's where I'm stuck. My current idea is this: let $K := (k_1, \ldots, k_s)^T$ and define
\begin{equation*}
F : \mathbb R^{s\cdot n} \to \mathbb R^{s \cdot n}, \quad K \mapsto \begin{pmatrix}
f_1(K)-k_1\\
f_2(K)-k_2\\
\vdots\\
f_s(K)-k_s
\end{pmatrix}
\end{equation*}
with
\begin{equation*}
f_j(K) = f\left(t_0+c_jh,y_0+h\sum_{i=1}^s a_{ji}k_i \right).
\end{equation*}
Then, as suggested in the assignment, I would start with $k_1 = k_2 = \ldots = k_s = 0$ and use Newton's method to find a $\tilde{K}$ with $F(\tilde{K}) = 0$, which solves the system and thus lets me calculate $y_1$. I could then repeat the procedure for $y_2, y_3, \ldots$. However, I need the Jacobian of $F$, and I don't see how to obtain it from the Jacobian of $f$ that I'm given.
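For a concrete sanity check, take the implicit midpoint rule (my own example, not necessarily the method from the assignment): $s = 1$, $a_{11} = \tfrac12$, $b_1 = 1$, $c_1 = \tfrac12$. Then
\begin{equation*}
F(K) = f\!\left(t_0 + \tfrac{h}{2},\; y_0 + \tfrac{h}{2}\,k_1\right) - k_1,
\end{equation*}
and Newton's iteration reads $K^{(m+1)} = K^{(m)} - J_F(K^{(m)})^{-1} F(K^{(m)})$ with the starting value $K^{(0)} = 0$.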

I suppose it'd look like this, where each entry is an $n \times n$ block and $I$ denotes the $n \times n$ identity matrix (since $k_i \in \mathbb R^n$, subtracting $k_j$ contributes $-I$ on the diagonal, not $-1$):
\begin{align*}
J_F(K) &= \begin{pmatrix}
\frac{\partial (f_1-k_1)}{\partial k_1}(K) &
\frac{\partial (f_1-k_1)}{\partial k_2}(K) & \cdots &
\frac{\partial (f_1-k_1)}{\partial k_s}(K)\\
\frac{\partial (f_2-k_2)}{\partial k_1}(K) &
\frac{\partial (f_2-k_2)}{\partial k_2}(K) & \cdots &
\frac{\partial (f_2-k_2)}{\partial k_s}(K)\\
\vdots & & \ddots & \vdots\\
\frac{\partial (f_s-k_s)}{\partial k_1}(K) & \cdots & &
\frac{\partial (f_s-k_s)}{\partial k_s}(K)
\end{pmatrix}
\\&= \begin{pmatrix}
\frac{\partial f_1}{\partial k_1}(K)-I &
\frac{\partial f_1}{\partial k_2}(K) & \cdots &
\frac{\partial f_1}{\partial k_s}(K)\\
\frac{\partial f_2}{\partial k_1}(K) &
\frac{\partial f_2}{\partial k_2}(K)-I & \cdots &
\frac{\partial f_2}{\partial k_s}(K)\\
\vdots & & \ddots & \vdots\\
\frac{\partial f_s}{\partial k_1}(K) & \cdots & &
\frac{\partial f_s}{\partial k_s}(K)-I
\end{pmatrix}.
\end{align*}

Note: I don't know which function $f$ I'll be given, nor which Runge-Kutta method I actually have to implement. I'm writing a MATLAB program, and this information is fed in as part of the assignment. The matter is also somewhat urgent.

Best Answer

You have to use the chain rule. Since $f_j$ depends on $k_i$ only through the second argument of $f$, $$ \frac{\partial f_j}{\partial k_i}(K) = h\,a_{ji}\,\frac{\partial f}{\partial y}\!\left(t_0+c_jh,\; y_0+h\sum_{l=1}^s a_{jl}k_l\right), $$ i.e. the $(j,i)$ block of $J_F(K)$ is $h\,a_{ji}$ times the Jacobian of $f$ evaluated at stage $j$, minus the identity when $i = j$.
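Putting the pieces together, here is a small self-contained sketch in Python (the asker works in MATLAB, but the structure carries over directly). The names `rk_step`, `f`, and `f_y` are illustrative, not from the assignment, and the ODE is scalar ($n = 1$) so each Jacobian block $h\,a_{ji}\,\partial f/\partial y - \delta_{ji}$ is a single number:

```python
def solve_linear(M, rhs):
    """Tiny Gaussian elimination with partial pivoting for an s-by-s system."""
    s = len(rhs)
    M = [row[:] for row in M]
    rhs = rhs[:]
    for col in range(s):
        piv = max(range(col, s), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, s):
            factor = M[r][col] / M[col][col]
            for c in range(col, s):
                M[r][c] -= factor * M[col][c]
            rhs[r] -= factor * rhs[col]
    x = [0.0] * s
    for r in range(s - 1, -1, -1):
        x[r] = (rhs[r] - sum(M[r][c] * x[c] for c in range(r + 1, s))) / M[r][r]
    return x

def rk_step(f, f_y, A, b, c, t0, y0, h, tol=1e-12, maxit=50):
    """One step of an implicit s-stage RK method for a scalar ODE,
    solving F(K) = 0 with Newton's method (initial guess K = 0)."""
    s = len(b)
    K = [0.0] * s
    for _ in range(maxit):
        # Arguments (t, y) of f at each stage j.
        args = [(t0 + c[j] * h,
                 y0 + h * sum(A[j][i] * K[i] for i in range(s)))
                for j in range(s)]
        F = [f(t, y) - K[j] for j, (t, y) in enumerate(args)]
        if max(abs(v) for v in F) < tol:
            break
        # Jacobian of F: J[j][i] = h * a_{ji} * f_y(stage j) - delta_{ji}.
        J = [[h * A[j][i] * f_y(*args[j]) - (1.0 if i == j else 0.0)
              for i in range(s)] for j in range(s)]
        delta = solve_linear(J, [-v for v in F])  # Newton update: J * delta = -F
        K = [K[j] + delta[j] for j in range(s)]
    return y0 + h * sum(b[i] * K[i] for i in range(s))

# Example: y' = -2y, y(0) = 1, one step of the implicit midpoint rule (s = 1).
f  = lambda t, y: -2.0 * y
fy = lambda t, y: -2.0
y1 = rk_step(f, fy, A=[[0.5]], b=[1.0], c=[0.5], t0=0.0, y0=1.0, h=0.1)
# For y' = -2y the implicit midpoint rule gives y1 = y0 * (1 - h) / (1 + h).
print(y1)
```

Because the test problem is linear, Newton converges in a single update here; for a genuinely nonlinear $f$ the loop simply runs until $\|F(K)\|$ drops below the tolerance.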
