Section 1.8 Convergence, Consistency and Stability
The three concepts in the section's title are fundamental properties that a numerical method should possess. First we give a somewhat abstract definition of what an ODE algorithm is.
Definition 1.8.1.
By a fixed-stepsize \((k+1)\)-step algorithm \(\Phi_h\) to solve a first-order ODE \(\dot x=f(t,x)\) on \(\bR^n\) we mean a map
\begin{equation*}
\Phi_h:\bR^{n(k+1)+1}\to\bR^n.
\end{equation*}
The subscript \(h\) above is the fixed time step. Given a set of \(k+1\) initial points
\begin{equation*}
x_{-k},\dots,x_{-1},x_0,
\end{equation*}
the numerical solution of the ODE is defined recursively by
\begin{equation*}
x_{n+1} = \Phi_h(t_n,x_n,\dots,x_{n-k}),\;t_{n+1}=t_n+h.
\end{equation*}
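To make the definition concrete, here is a minimal Python sketch (the names integrate and phi are ours, not part of the text) of how a \((k+1)\)-step map \(\Phi_h\) generates the numerical solution from the \(k+1\) starting points by the recursion above:
\begin{verbatim}
import numpy as np

def integrate(phi, t0, start_points, h, num_steps):
    """Iterate x_{n+1} = phi(t_n, x_n, ..., x_{n-k}).

    start_points = [x_{-k}, ..., x_{-1}, x_0] (increasing time, x_0 at t0);
    phi receives the current time followed by the last k+1 points,
    newest first, and returns the next point.
    """
    k_plus_1 = len(start_points)
    xs = [np.atleast_1d(np.asarray(x, dtype=float)) for x in start_points]
    for n in range(num_steps):
        t_n = t0 + n * h
        history = xs[-1:-k_plus_1 - 1:-1]      # [x_n, x_{n-1}, ..., x_{n-k}]
        xs.append(phi(t_n, *history))
    ts = t0 + h * np.arange(num_steps + 1)     # t_0, t_1, ..., t_N
    return ts, np.array(xs[k_plus_1 - 1:])     # x_0, x_1, ..., x_N
\end{verbatim}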
Example 1.8.2.
The map \(\Phi_h\) corresponding to the explicit Euler method is
\begin{equation*}
\Phi_h(t,x) = x + h f(t,x).
\end{equation*}
The one corresponding to the Heun method is
\begin{equation*}
\Phi_h(t,x) = x + \frac{h}{2}\left( f(t,x) + f(t+h, x+h f(t,x))\right).
\end{equation*}
The one corresponding to the Adams-Bashforth method of order 2, where the third argument \(y\) plays the role of the previous point \(x_{n-1}\text{,}\) is
\begin{equation*}
\Phi_h(t,x,y) = x + \frac{h}{2}\left( 3f(t,x) - f(t-h, y)\right).
\end{equation*}
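In Python the three maps of the example might be written as follows (a sketch assuming \(f(t,x)\) is the right-hand side of the ODE; the helper names are ours). Euler and Heun need the single starting point \(x_0\text{,}\) while Adams-Bashforth 2 needs the two points \(x_{-1},x_0\text{:}\)
\begin{verbatim}
def euler_map(f, h):
    """Explicit Euler: Phi_h(t, x) = x + h f(t, x)."""
    return lambda t, x: x + h * f(t, x)

def heun_map(f, h):
    """Heun: Phi_h(t, x) = x + h/2 (f(t, x) + f(t + h, x + h f(t, x)))."""
    def phi(t, x):
        k1 = f(t, x)
        k2 = f(t + h, x + h * k1)
        return x + 0.5 * h * (k1 + k2)
    return phi

def ab2_map(f, h):
    """Adams-Bashforth 2: Phi_h(t, x, y) = x + h/2 (3 f(t, x) - f(t - h, y))."""
    return lambda t, x, y: x + 0.5 * h * (3.0 * f(t, x) - f(t - h, y))
\end{verbatim}
Any of these maps can be fed to the generic driver sketched after Definition 1.8.1.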
Definition 1.8.3.
A fixed-stepsize ODE algorithm \(\Phi_h\) is convergent if, given any IVP
\begin{equation*}
\dot x=f(t,x),\;x(t_0)=x_0,
\end{equation*}
where \(f(t,x)\) satisfies the conditions for the existence and uniqueness of the solution, and any \(t_f>t_0\) for which the solution exists, the numerical solution \(\{x_1(h),\dots,x_N(h)\}\text{,}\) where \(Nh=t_f-t_0\text{,}\) converges to the exact solution \(x(t)\) as \(h\to0\text{,}\) namely
\begin{equation*}
\lim_{h\to0}\max_{1\leq k\leq N}\{\|x_k-x(t_0+kh)\|\}=0.
\end{equation*}
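The limit in the definition can be probed numerically: on an IVP with known solution, compute the maximum error over the time grid for a sequence of decreasing step sizes and check that it tends to zero. A minimal sketch for the explicit Euler map (the test problem \(\dot x=-x\text{,}\) \(x(0)=1\text{,}\) and the helper names are our own choices):
\begin{verbatim}
import numpy as np

def max_error_euler(h, t0=0.0, tf=2.0, x0=1.0):
    """Maximum error of explicit Euler on x' = -x, x(0) = 1, over [t0, tf]."""
    f = lambda t, x: -x
    N = int(round((tf - t0) / h))
    x, err = x0, 0.0
    for n in range(N):
        t = t0 + n * h
        x = x + h * f(t, x)                        # one Euler step
        err = max(err, abs(x - np.exp(-(t + h))))  # compare with exact e^{-t}
    return err

for h in [0.1, 0.05, 0.025, 0.0125]:
    print(f"h = {h:7.4f}   max error = {max_error_euler(h):.2e}")
# the maximum error roughly halves with h, i.e. it goes to 0 as h -> 0
\end{verbatim}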
Definition 1.8.4.
A fixed-stepsize ODE algorithm \(\Phi_h\) is consistent if, given any ODE \(\dot x=f(t,x)\) satisfying the conditions for the existence and uniqueness of the solution, the error made in a single step of size \(h\) goes to 0 as \(h\to0\text{.}\)
Theorem 1.8.5.
Suppose that a one-step algorithm \(\Phi_h\) has local error of order \(O(h^{p+1})\text{.}\) Then \(\Phi_h\) has global error of order \(O(h^{p})\text{.}\)
Corollary 1.8.6.
For one-step ODE algorithms whose local error is of order higher than 1 in \(h\text{,}\) consistency implies convergence.
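Theorem 1.8.5 can also be checked empirically by estimating the observed order of the global error from two step sizes via \(\log_2(E(h)/E(h/2))\text{.}\) A sketch on the same test problem \(\dot x=-x\) (names and values are our own), comparing Euler and Heun:
\begin{verbatim}
import numpy as np

def global_error(step, h, t0=0.0, tf=2.0, x0=1.0):
    """Error at tf of a one-step map step(t, x, h) applied to x' = -x."""
    N = int(round((tf - t0) / h))
    x = x0
    for n in range(N):
        x = step(t0 + n * h, x, h)
    return abs(x - np.exp(-tf))

f = lambda t, x: -x
euler = lambda t, x, h: x + h * f(t, x)
heun = lambda t, x, h: x + 0.5 * h * (f(t, x) + f(t + h, x + h * f(t, x)))

for name, step in [("Euler", euler), ("Heun", heun)]:
    e1, e2 = global_error(step, 0.01), global_error(step, 0.005)
    print(f"{name}: observed global order ~ {np.log2(e1 / e2):.2f}")
# Euler gives ~1 and Heun ~2, matching local errors O(h^2) and O(h^3)
\end{verbatim}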
Definition 1.8.7.
A fixed-stepsize ODE algorithm \(\Phi_h\) is stable if, given any ODE \(\dot x=f(t,x)\) satisfying the conditions for the existence and uniqueness of the solution and any two initial points \(x_0,x'_0\text{,}\) the corresponding numerical solutions \(\{x_1(h),\dots,x_N(h)\}\) and \(\{x'_1(h),\dots,x'_N(h)\}\text{,}\) where \(Nh=t_f-t_0\text{,}\) are such that
\begin{equation*}
\|x_N(h)-x'_N(h)\|\leq C\|x_0-x'_0\|
\end{equation*}
for each small enough \(h\) and some constant \(C\) independent of \(h\text{.}\)
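For the explicit Euler map the constant \(C\) can be exhibited directly: if \(f\) is Lipschitz in \(x\) with constant \(L\text{,}\) then \(\|x_{n+1}-x'_{n+1}\|\leq(1+hL)\|x_n-x'_n\|\text{,}\) hence \(\|x_N-x'_N\|\leq(1+hL)^N\|x_0-x'_0\|\leq e^{L(t_f-t_0)}\|x_0-x'_0\|\text{,}\) a bound independent of \(h\text{.}\) The following sketch (test problem and names chosen by us) checks this bound numerically:
\begin{verbatim}
import numpy as np

def euler_final(x0, h, t0=0.0, tf=2.0):
    """Final value of explicit Euler on x' = cos(t) - x started at x0."""
    f = lambda t, x: np.cos(t) - x
    N = int(round((tf - t0) / h))
    x = x0
    for n in range(N):
        x = x + h * f(t0 + n * h, x)
    return x

x0, x0p = 1.0, 1.001
for h in [0.1, 0.01, 0.001]:
    ratio = abs(euler_final(x0, h) - euler_final(x0p, h)) / abs(x0 - x0p)
    print(f"h = {h:6.3f}   ||x_N - x'_N|| / ||x_0 - x'_0|| = {ratio:.4f}")
# the ratio stays bounded (here by e^{L(tf - t0)} with L = 1) as h -> 0
\end{verbatim}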
Theorem 1.8.8.
A multi-step ODE algorithm is convergent if and only if it is stable and consistent.
Definition 1.8.9.
A fixed-stepsize ODE algorithm \(\Phi_h\) is linearly stable (or A-stable, or asymptotically stable) if, given any linear ODE \(\dot x=Kx\) whose exact solutions all decay to zero (i.e. every eigenvalue of \(K\) has negative real part), for any initial condition \(x_0\) and any fixed value of \(h\) the corresponding numerical solution \(\{x_n(h)\}\) converges to 0 as \(n\to\infty\text{.}\)
Remark 1.8.10.
This concept of stability is not necessarily about how close \(\{x_n(h)\}\) gets to the exact solution but rather about whether \(\Phi_h\text{,}\) viewed as a discrete dynamical system, is stable in a neighborhood of the point \(0\text{.}\) Indeed, the constant function \(x(t)=0\) is always a solution of a linear ODE, and so a linearly stable method \(\Phi_h\) has the property that the orbit of any point converges asymptotically to this solution. Consider, for instance, the scalar equation
\begin{equation*}
\dot x = K x,\;K\in\bC
\end{equation*}
with \(Re(K)<0\text{.}\) In this case all solutions go to zero asymptotically, and so any linearly stable ODE algorithm will correctly reproduce this behavior. We will see in the next section that the explicit Euler method produces decaying numerical solutions only for \(h\) small enough, while the implicit Euler method does so for every \(h\text{.}\)
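The following sketch (the values of \(K\text{,}\) \(h\) and the number of steps are our own choices) previews that comparison on the scalar problem: explicit Euler amounts to \(x_{n+1}=(1+hK)x_n\text{,}\) which decays only when \(|1+hK|<1\) (here \(h<0.04\)), while implicit Euler amounts to \(x_{n+1}=x_n/(1-hK)\text{,}\) which decays for every \(h>0\text{:}\)
\begin{verbatim}
K, h, x0, steps = -50.0, 0.1, 1.0, 50

x_exp = x_imp = x0
for n in range(steps):
    x_exp = (1.0 + h * K) * x_exp    # explicit Euler: amplification factor 1 + hK
    x_imp = x_imp / (1.0 - h * K)    # implicit Euler: amplification factor 1/(1 - hK)

print(f"explicit Euler after {steps} steps: {x_exp:.3e}")  # blows up: |1 + hK| = 4
print(f"implicit Euler after {steps} steps: {x_imp:.3e}")  # decays: 1/(1 - hK) = 1/6
\end{verbatim}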