6.3. Zero-Stability, Consistency and Convergence Theorem

Definition 6.6 (Zero-Stability)

A linear multistep method (6.1) is said to be zero-stable if it satisfies the root condition given in Definition 6.3.

This means that no root of the first characteristic polynomial \(\,\rho(z)\,\) has modulus greater than one, and that every root of modulus one is simple.

Definition 6.7 (Consistency)

In numerical analysis, consistency is the property that a numerical method approximates the underlying differentiation or integration faithfully. In other words, consistency measures how closely the discrete system resembles the original continuous system.

The condition for the discrete system produced by a linear multistep method (6.1)

\[ \sum_{i=0}^{k}\alpha_i y_{j+i}~=~ h\sum_{i=0}^{k}\beta_i f_{j+i}, \qquad \text{with}~ \alpha_k =1 \]

being consistent with the continuous differential equation

\[ \diff{y}{x} = f(x, y) \]

or its integral

\[ \int_{x_j}^{x_{j+k}} \diff{y}{x} \dx = \int_{x_j}^{x_{j+k}} f(x, y) \dx \]

is that the method has an order of accuracy \(p \geq 1\).

In terms of the error constants \(C_p\) introduced in Section 5.2, the condition of consistency is equivalent to \(C_0 = 0\) and \(C_1 = 0\), which ensures the method has order of accuracy \(p \geq 1\). Thus the condition of consistency can also be expressed by the relations

\[ \sum_{i=0}^{k} \alpha_i ~=~ 0\,, \qquad\text{and}\qquad \sum_{i=0}^{k} i\,\alpha_i ~=~ \sum_{i=0}^{k} \beta_i\,. \]

Equivalently, in terms of the first and second characteristic polynomials \(\rho(z)\) and \(\sigma(z)\), the linear multistep method (6.1) is consistent if and only if

\[ \rho(1) ~=~ 0\,, \qquad\text{and}\qquad \rho'(1) ~=~ \sigma(1)\,. \]

Thus, for a consistent method, the first characteristic polynomial always has a root at \(+1\), i.e. the principal root \(z_1\). The remaining roots \(\,z_i\,, ~i=2,3,\dots,k\,\), are the parasitic roots.

Remark 6.1 (Condition of Consistency)

In summary, there are three equivalent conditions for consistency:

  1. The method’s order of accuracy \(p \geq 1\).

  2. The error constants satisfy \(C_0=0\) and \(C_1=0\).

  3. \(\rho(1)=0\) and \(\rho'(1)=\sigma(1)\).

If a linear multistep method satisfies any of these three conditions, then it is a consistent method.
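
As a quick numerical sanity check, condition 3 can be evaluated directly from the coefficients \(\alpha_i\) and \(\beta_i\), since \(\rho(1) = \sum \alpha_i\), \(\rho'(1) = \sum i\,\alpha_i\), and \(\sigma(1) = \sum \beta_i\). The sketch below is a minimal illustration in Python with NumPy; the function name `is_consistent` is our own, not part of any library:

```python
import numpy as np

def is_consistent(alpha, beta):
    """Check rho(1) = 0 and rho'(1) = sigma(1) for a k-step method
    sum_i alpha_i y_{j+i} = h sum_i beta_i f_{j+i}.

    alpha, beta: coefficient sequences (alpha_0..alpha_k, beta_0..beta_k).
    """
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    i = np.arange(len(alpha))
    rho_1 = alpha.sum()          # rho(1)  = sum alpha_i
    drho_1 = (i * alpha).sum()   # rho'(1) = sum i * alpha_i
    sigma_1 = beta.sum()         # sigma(1) = sum beta_i
    return bool(np.isclose(rho_1, 0.0) and np.isclose(drho_1, sigma_1))

# Trapezoidal rule: y_{j+1} - y_j = (h/2)(f_{j+1} + f_j)
print(is_consistent([-1.0, 1.0], [0.5, 0.5]))  # True
```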

A linear multistep method which is zero-stable is frequently called simply ‘stable’. Essentially, a method is stable if errors introduced at one step are not amplified at subsequent steps.

In addition, it is also important to know whether the numerical solution \(y_\text{num}\) will approach the exact solution \(y_\text{ex}\) as the step size \(h\) is reduced towards zero.

Definition 6.8 (Convergence)

For an initial value problem

\[\begin{split} \left\{ \begin{aligned} y'&=f(t, y) \\ y(0)&=y_0 \end{aligned} \right. \qquad t \in [0, T], \end{split}\]

if the numerical solution \(y_\text{num}\) approaches the exact solution \(y_\text{ex}\) as the step size \(h\) is reduced towards zero, we say the numerical solution converges to the exact solution, and we call the numerical method a convergent method:

\[ \max _{t\in [0, T]} \| y_\text{num}(t) - y_\text{ex}(t) \| \rightarrow 0 \qquad \mbox{as} \qquad {h}\rightarrow 0. \]
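
This limit can be observed numerically. The sketch below is our own illustration, assuming the model problem \(y' = -y\), \(y(0) = 1\), whose exact solution is \(e^{-t}\); it applies the forward Euler method and shows the maximum error shrinking roughly in proportion to \(h\):

```python
import numpy as np

def euler_max_error(h, T=1.0):
    """Forward Euler for y' = -y, y(0) = 1 on [0, T]; return max |error|."""
    n = int(round(T / h))
    t = np.linspace(0.0, T, n + 1)
    y = np.empty(n + 1)
    y[0] = 1.0
    for j in range(n):
        y[j + 1] = y[j] + h * (-y[j])   # y_{j+1} = y_j + h f(t_j, y_j)
    return float(np.max(np.abs(y - np.exp(-t))))

# Halving h roughly halves the maximum error (a first-order method)
for h in [0.1, 0.05, 0.025]:
    print(h, euler_max_error(h))
```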

Zero-stability ensures that the parasitic solutions of the general solution to the difference equation are damped out in the limit \(h \to 0\). This requirement appears in the fundamental theorem on linear multistep methods due to Dahlquist:

Theorem 6.2 (Dahlquist Equivalence)

The necessary and sufficient conditions for a linear multistep method to be convergent are that it is consistent and zero-stable.

Example 6.4

Investigate the convergence of the third-order Adams-Bashforth formula:

\[ \begin{aligned} y_{j+1} ~=~ y_j + \frac{h}{12} \bigl[ 23f_j - 16f_{j-1} + 5f_{j-2} \bigr] \end{aligned} \]
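
Writing the formula as the 3-step method \(y_{j+3} - y_{j+2} = \frac{h}{12}\bigl[23f_{j+2} - 16f_{j+1} + 5f_j\bigr]\), the convergence check of Theorem 6.2 can be sketched in a few lines of Python with NumPy. This is an illustrative verification under that rewriting, not a worked derivation:

```python
import numpy as np

# AB3 as a 3-step method: alpha_0..alpha_3 and beta_0..beta_3
alpha = np.array([0.0, 0.0, -1.0, 1.0])
beta = np.array([5.0, -16.0, 23.0, 0.0]) / 12.0

# Consistency: rho(1) = sum alpha_i = 0 and rho'(1) = sum i*alpha_i = sigma(1)
i = np.arange(4)
consistent = bool(np.isclose(alpha.sum(), 0.0)
                  and np.isclose((i * alpha).sum(), beta.sum()))

# Zero-stability: roots of rho(z) = z^3 - z^2 = z^2 (z - 1) are 1, 0, 0.
# (np.roots takes coefficients highest degree first.)
roots = np.roots(alpha[::-1])
# All roots have modulus <= 1, and the only root of modulus one (z = 1)
# is simple, so the root condition holds.
zero_stable = bool(np.all(np.abs(roots) <= 1.0 + 1e-12))

print(consistent, zero_stable)  # both True: convergent by Theorem 6.2
```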

Example 6.5

Investigate the stability of Milne’s predictor method:

\[ \begin{aligned} y_{j+4} - y_j ~=~ \frac{4h}{3} \bigl[ 2f_{j+3} - f_{j+2} + 2f_{j+1} \bigr] \end{aligned} \]
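
For this method \(\rho(z) = z^4 - 1\), whose roots are the four fourth roots of unity. A short NumPy check (an illustrative sketch) confirms they all lie on the unit circle and are distinct, so the root condition holds and the method is zero-stable, though only weakly, since the parasitic roots also sit on the unit circle:

```python
import numpy as np

# Milne's predictor: y_{j+4} - y_j = (4h/3)(2 f_{j+3} - f_{j+2} + 2 f_{j+1})
# First characteristic polynomial: rho(z) = z^4 - 1
roots = np.roots([1.0, 0.0, 0.0, 0.0, -1.0])

print(roots)                               # the four fourth roots of unity
print(np.allclose(np.abs(roots), 1.0))     # True: every root has modulus one
print(len(set(np.round(roots, 8))) == 4)   # True: all four roots are simple
```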

Remark 6.2 (Adams methods)

The first characteristic polynomial \(\rho\) of every Adams method (Adams-Bashforth and Adams-Moulton) is of the form:

\[\begin{split} \begin{aligned} \rho(z) ~&=~ z^k - z^{k-1}\\ &=~ z^{k-1}(z - 1) \end{aligned} \end{split}\]

Hence \(z_1 = 1\), and the remaining roots \(\,z_2, z_3, \dots, z_k\,\) are all zero, so the root condition is satisfied and every Adams method is zero-stable. It can be shown that Adams methods are also consistent; therefore, by the Dahlquist Equivalence Theorem, all Adams methods are convergent.