4.1. Approximation of the First Derivative of a Function#

4.1.1. Taylor Series and nth-Order Approximation#

Let \(f\in C^{\infty} [a,b]\) and assume that \(f(x_{i})\) is known at some point \(x_{i} \in [a,b]\). The value of \(f\) at another point \(x\) near \(x_{i}\) can be expressed using the Taylor series as follows

(4.1)#\[\begin{align} f(x) &= f(x_{i})+ {\frac {f'(x_{i})}{1!}}(x-x_{i}) + {\frac {f''(x_{i})}{2!}}(x-x_{i})^{2} +{\frac {f'''(x_{i})}{3!}}(x-x_{i})^{3} + \ldots +{\frac {f^{(n)}(x_{i})}{n!}}(x-x_{i})^{n}+\ldots \notag \\ & = \sum_{k = 0}^{\infty} {\frac {f^{(k)}(x_{i})}{k!}}(x-x_{i})^{k}. \end{align}\]

Now, assume that we are interested in using \(x_{i+1} = x_{i} + h\) where \(h = \Delta x = x_{i+1} - x_{i}\). We have,

(4.2)#\[\begin{equation}\label{ts_eq.02} f(x_{i+1}) = \sum_{k = 0}^{\infty} {\frac {f^{(k)}(x_{i})}{k!}}(x_{i+1}-x_{i})^{k} = \sum_{k = 0}^{\infty} {\frac {f^{(k)}(x_{i})}{k!}}h^{k}. \end{equation}\]

In practice, we cannot evaluate infinitely many terms of this series. Therefore, the series is truncated at some value of \(n\),

(4.3)#\[\begin{equation}\label{nthApp_Taylor} f(x_{i+1}) = \sum_{k = 0}^{n} {\frac {f^{(k)}(x_{i})}{k!}}h^{k} + \mathcal{O}(h^{n+1}) \approx \sum_{k = 0}^{n} {\frac {f^{(k)}(x_{i})}{k!}}h^{k}. \end{equation}\]

The above is known as the nth-order Taylor series approximation of \(f(x_{i+1})\).
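
To see the effect of the truncation order \(n\) in practice, here is a minimal sketch in Python (it uses \(f(x)=\exp(x)\) as an assumed test function, chosen because every derivative of \(\exp\) is again \(\exp\), so the truncated sum is easy to form):

import numpy as np
from math import factorial

# assumed test function: f(x) = exp(x), whose k-th derivative is also exp(x)
xi, h = 0.5, 0.1
exact = np.exp(xi + h)

for n in range(1, 6):
    # nth-order Taylor approximation of f(xi + h) about xi
    approx = sum(np.exp(xi) / factorial(k) * h**k for k in range(n + 1))
    print('n = %d: approximation = %.10f, error = %.2e' % (n, approx, abs(exact - approx)))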

  • Forward difference: \(\Delta _{h}[f](x)=f(x+h)-f(x).\)

  • Backward difference: \(\nabla _{h}[f](x)=f(x)-f(x-h).\)

  • Central difference: \(\delta _{h}[f](x)=\frac{1}{2}\left(f\left(x+h\right)-f\left(x-h\right)\right).\)
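
These three operators translate directly into Python. The helper names below (forward_diff, backward_diff, central_diff) are illustrative only and are not part of the hd_tools module used later in this section:

# finite-difference operators acting on a function f at a point x with step h
forward_diff  = lambda f, x, h: f(x + h) - f(x)              # Delta_h[f](x)
backward_diff = lambda f, x, h: f(x) - f(x - h)              # nabla_h[f](x)
central_diff  = lambda f, x, h: 0.5*(f(x + h) - f(x - h))    # delta_h[f](x)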

Assume that \(f \in C^2[a,b]\). Then, using the 1st-order Taylor series approximation at \(x_{i+1}\), we have,

\[\begin{align*} f(x_{i+1}) & = f(x_{i})+ h f'(x_{i}) +\mathcal{O}(h^{2}) \\ & \approx f(x_{i})+ h\,f'(x_{i}). \end{align*}\]

It follows that,

\[\begin{align*} h\,f'(x_{i}) \approx f(x_{i+1}) - f(x_{i}), \end{align*}\]

and

(4.4)#\[\begin{align}\label{ts_eq.03} f'(x_{i}) \approx \frac{f(x_{i+1}) - f(x_{i})}{h}. \end{align}\]

This is known as the Forward Difference approximation of the 1st derivative. Since \(x_{i+1} = x_{i} + h\), we can also express this as follows,

(4.5)#\[\begin{align}\label{ts_eq.04} f'(x_{i}) \approx \frac{f(x_{i} + h) - f(x_{i})}{h}. \end{align}\]

The quantity \(f(x_{i} + h) - f(x_{i})\) represents the Forward Difference, which is often denoted by

(4.6)#\[\begin{align}\label{ts_eq.05} \Delta _{h}[f](x)=f(x+h)-f(x). \end{align}\]

Thus, the Forward Difference approximation of the 1st derivative can be expressed as follows,

(4.7)#\[\begin{align}\label{ts_eq.06} f'(x_{i}) \approx \frac{\Delta _{h}[f](x_{i})}{h}. \end{align}\]

Similarly, we can use the 1st-order Taylor series approximation at \(x_{i-1} = x_{i} - h\). We have,

(4.8)#\[\begin{align}\label{ts_eq.07} f(x_{i-1}) & = f(x_{i})- h\,f'(x_{i}) +\mathcal{O}(h^{2}) \\ & \approx f(x_{i})- h\,f'(x_{i}). \end{align}\]

Solving the above equation for \(f'(x_{i})\), we have

(4.9)#\[\begin{align}\label{ts_eq.08} f'(x_{i}) \approx \frac{f(x_{i})- f(x_{i-1})}{h} = \frac{f(x_{i})- f(x_{i}-h)}{h}. \end{align}\]

This is known as the Backward Difference approximation of the 1st derivative.

Since \(\nabla_{h}[f](x)=f(x)-f(x-h)\) denotes the Backward difference, we have,

(4.10)#\[\begin{align}\label{ts_eq.09} f'(x_{i}) \approx \frac{\nabla_{h}[f](x_{i})}{h}. \end{align}\]

Furthermore, consider the 2nd-Order Taylor series approximation for \(x_{i+1} = x_{i} + h\) and \(x_{i-1} = x_{i} - h\). We have,

\[\begin{align*} f(x_{i+1}) & = f(x_{i})+ f'(x_{i})\,h + \frac {f''(x_{i})}{2!}\,h^{2} + \mathcal{O}(h^{3}),\\ f(x_{i-1}) & = f(x_{i})- f'(x_{i})\,h + \frac {f''(x_{i})}{2!}\,h^{2} + \mathcal{O}(h^{3}). \end{align*}\]

Subtracting the second equation from the first, it follows that,

\[\begin{align*} f(x_{i+1}) - f(x_{i-1}) = 2\,hf'(x_{i}) + \mathcal{O}(h^{3}). \end{align*}\]

Thus,

\[\begin{align*} f'(x_{i}) \approx \frac{f(x_{i+1}) - f(x_{i-1}) }{2\,h}. \end{align*}\]

This is known as the Central Difference approximation of the 1st derivative. Since \(\delta_{h}[f](x)=\frac{1}{2}\left(f\left(x+h\right)-f\left(x-h\right)\right)\) denotes the Central difference, we have,

\[\begin{align*} f'(x_{i}) \approx \frac{\delta_{h}[f](x_{i})}{h}. \end{align*}\]
Summary
  • Forward difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{i}) \approx \frac{\Delta _{h}[f](x_{i})}{h}= \frac{f(x_{i}+h)-f(x_{i})}{h} = \frac{f(x_{i+1})-f(x_{i})}{h}. \end{align*}\]
  • Backward difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{i}) \approx \frac{\nabla_{h}[f](x_{i})}{h} = \frac{f(x_{i})-f(x_{i}-h)}{h} = \frac{f(x_{i})-f(x_{i-1})}{h}. \end{align*}\]
  • Central difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{i})\approx \frac{\delta_{h}[f](x_{i})}{h} = \dfrac{1}{2\,h}\left(f\left(x_{i}+h\right)-f\left(x_{i}-h\right)\right) = \dfrac{1}{2\,h}\left(f\left(x_{i+1}\right)-f\left(x_{i-1}\right)\right). \end{align*}\]
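
As a quick sanity check of these three formulas, the following sketch applies them to \(f(x)=\sin(x)\), an assumed test function whose exact derivative \(\cos(x)\) is known (the example below treats a different function in detail):

import numpy as np

f, df = np.sin, np.cos     # assumed test function and its exact derivative
x, h = 1.0, 0.1

fwd = (f(x + h) - f(x)) / h              # forward difference
bwd = (f(x) - f(x - h)) / h              # backward difference
ctr = (f(x + h) - f(x - h)) / (2*h)      # central difference

for name, val in [('forward', fwd), ('backward', bwd), ('central', ctr)]:
    print('%-8s %.8f  (absolute error %.2e)' % (name, val, abs(val - df(x))))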

Example:

Consider \(f(x) = 3x\,\exp(x) - \cos(x) + \sin(x)\). This function is defined and continuous on [0,1]. Approximate \(f'(x)\) at \(x_{0} = 0.5\) using \(h=0.1\) and

a. the forward difference approximation formula for the 1st derivative.

b. the backward difference approximation formula for the 1st derivative.

c. the central difference approximation formula for the 1st derivative.

Solution: We have,

a. Forward difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{0}) \approx \frac{\Delta _{h}[f](x_{0})}{h} = \frac{f(x_{0}+h)-f(x_{0})}{h} = \dfrac{f(0.50 + 0.10) - f(0.50)}{0.10} = 9.44195816. \end{align*}\]

b. Backward difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{0}) \approx \frac{\nabla_{h}[f](x_{0})}{h} = \frac{f(x_{0})-f(x_{0}-h)}{h} = \dfrac{f(0.50) - f(0.50 - 0.10)}{0.10} = 8.16377897. \end{align*}\]

c. Central difference approximation of the 1st derivative:

\[\begin{align*} f'(x_{0}) &\approx \frac{\delta_{h}[f](x_{0})}{h} = \dfrac{1}{2\,h}\left(f\left(x_{0}+h\right)-f\left(x_{0}-h\right)\right) \\& = \dfrac{f(0.50 + 0.10) - f(0.50 - 0.10)}{2(0.10)} = 8.80286857. \end{align*}\]

Moreover, \(f'(x) = 3\,\exp(x) + 3\,x\,\exp(x) + \sin(x) + \cos(x)\). Thus, the exact value of the derivative at \(x_{0} = 0.5\) can be calculated as follows,

\[\begin{align*} f'(x_{0}) = f'(0.5)=8.77625382. \end{align*}\]

In terms of accuracy, the central difference is the most accurate of the three, as it benefits from second-order accuracy (why?). The absolute errors are:

\[\begin{align*} \left|f'(x_{0})-\dfrac {\Delta _{h}[f](x_{0})}{h} \right| &= 6.6570e-01,\\ \left|f'(x_{0})-\dfrac {\nabla _{h}[f](x_{0})}{h} \right| &= 6.1247e-01,\\ \left|f'(x_{0})-\dfrac {\delta _{h}[f](x_{0})}{h} \right| &= 2.6615e-02. \end{align*}\]
# This part is used for producing tables and figures
import sys
sys.path.insert(0,'..')
import hd_tools as hd
from bokeh.plotting import show

import numpy as np
import pandas as pd
# f(x)
f = lambda x: 3*x*np.exp(x) - np.cos(x) + np.sin(x)

a = 0
b = 1
h = 0.1
t = 0.5

p = hd.derivative_method_plot(f, a, b, h, labels = ['f(x)', '(xn, f(xn))',])
p.scatter([t], [f(t)], color = 'Lime', fill_alpha=0.4, legend_label = '(t, f(t))', size = 12)
show(p)

\(x_{0} = 0.5\) is highlighted in green in the figure above.

from IPython.display import display, Latex

# raw strings (r'...') keep the LaTeX backslashes intact
# Forward difference
Forward = (f(t+h)-f(t))/h
print('Forward difference:')
display(Latex(r'\dfrac{|f(%.2f + %.2f) - f(%.2f)|}{%.2f} = %.8f' % (t,h,t,h, Forward)))
# Backward difference
Backward = (f(t)-f(t-h))/h
print('Backward difference:')
display(Latex(r'\dfrac{|f(%.2f) - f(%.2f - %.2f)|}{%.2f} = %.8f' % (t,t,h,h, Backward)))
# Central difference
Central = (f(t+h)-f(t-h))/(2*h)
print('Central difference:')
display(Latex(r'\dfrac{|f(%.2f + %.2f) - f(%.2f - %.2f)|}{2(%.2f)} = %.8f' % (t,h, t,h,h, Central)))
Forward difference:
\[\dfrac{|f(0.50 + 0.10) - f(0.50)|}{0.10} = 9.44195816\]
Backward difference:
\[\dfrac{|f(0.50) - f(0.50 - 0.10)|}{0.10} = 8.16377897\]
Central difference:
\[\dfrac{|f(0.50 + 0.10) - f(0.50 - 0.10)|}{2(0.10)} = 8.80286857\]

The exact value of the derivative at this point is

# f'(x)
f1=lambda x: 3*np.exp(x) + 3*x*np.exp(x) + np.sin(x) + np.cos(x)
Exact = f1(t)
display(Latex('''f'(t) = %.8f ''' % Exact))
\[f'(t) = 8.77625382\]
display(Latex(r'''\left|f'(x)-\dfrac {\Delta _{h}[f](x)}{h} \right| = %.4e''' % abs(Forward - Exact)))
display(Latex(r'''\left|f'(x)-\dfrac {\nabla _{h}[f](x)}{h} \right| = %.4e''' % abs(Backward - Exact)))
display(Latex(r'''\left|f'(x)-\dfrac {\delta _{h}[f](x)}{h} \right| = %.4e''' % abs(Central - Exact)))
\[\left|f'(x)-\dfrac {\Delta _{h}[f](x)}{h} \right| = 6.6570e-01\]
\[\left|f'(x)-\dfrac {\nabla _{h}[f](x)}{h} \right| = 6.1247e-01\]
\[\left|f'(x)-\dfrac {\delta _{h}[f](x)}{h} \right| = 2.6615e-02\]

Moreover, since \(f'(x) = 3\,\exp(x) + 3\,x\,\exp(x) + \sin(x) + \cos(x)\), the exact value is \(f'(0.5)=8.77625382\). In terms of accuracy, the central difference is again the most accurate of the three, as it benefits from second-order accuracy (why?).

Example:

Consider \(f(x) = 3x\exp(x) - \cos(x) + \sin(x)\). This function is defined and continuous on [0,1]. Discretize \([0,1]\) using \(h=0.1\) and approximate \(f'(x)\) using

  • a. Forward Difference approximation of the 1st derivative at \(\{0, 0.1, 0.2, \ldots,0.8,0.9 \}\).

  • b. Backward Difference approximation of the 1st derivative at \(\{0.1, 0.2, \ldots,0.8,0.9,1\}\).

  • c. Central Difference approximation of the 1st derivative at \(\{0.1, 0.2, \ldots,0.8,0.9\}\).

Solution:

(a): Forward Difference approximation of the 1st derivative at \(\{0, 0.1, 0.2, \ldots,0.8,0.9 \}\)

We can approximate the derivatives as follows,

\[\begin{align*} f'(0) & \approx \frac{f(0.1)-f(0)}{0.1} = 4.36381 ,\\ f'(0.1) & \approx \frac{f(0.2)-f(0.1)}{0.1} = 5.15064 ,\\ f'(0.2) & \approx \frac{f(0.3)-f(0.2)}{0.1} = 6.03612 ,\\ &\vdots \end{align*}\]

Similarly, the remaining points can be computed in Python:

xn = np.arange(a, b + h, h)
print('The first n-1 points: %s' % ', '.join([str(round(x,1)) for x in xn[0:-1]]))

p = hd.derivative_method_plot(f, a, b, h, labels = ['f(x)', '(xn, f(xn))',])
p.scatter(xn[0:-1], f(xn[0:-1]), color = 'Lime', fill_alpha=0.4,
          legend_label = 'The first n-1 points', size = 12)
show(p)

Forward = (f(xn[1:]) - f(xn[0:-1]))/h
Table = pd.DataFrame({'xi':xn[0:-1], 'fxi': f(xn[0:-1]), 'der fxi': f1(xn[0:-1]), 'forward der. fxi': Forward})
display(Table.round(6))
The first n-1 points: 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9
xi fxi der fxi forward der. fxi
0 0.0 -1.000000 4.000000 4.363805
1 0.1 -0.563619 4.741902 5.150639
2 0.2 -0.048556 5.575786 6.036122
3 0.3 0.555057 6.515306 7.034903
4 0.4 1.258547 7.576143 8.163779
5 0.5 2.074925 8.776254 9.441958
6 0.6 3.019121 10.136148 10.891355
7 0.7 4.108256 11.679199 12.536914
8 0.8 5.361948 13.431984 14.406977
9 0.9 6.802645 15.424675 16.533688
(b): Backward Difference approximation of the 1st derivative at \(\{0.1, 0.2, \ldots,0.8,0.9,1\}\)

xn = np.arange(a, b + h, h)
print('The last n-1 points: %s' % ', '.join([str(round(x,1)) for x in xn[1:]]))

p = hd.derivative_method_plot(f, a, b, h, labels = ['f(x)', '(xn, f(xn))',])
p.scatter(xn[1:], f(xn[1:]), color = 'Lime', fill_alpha=0.4,
          legend_label = 'The last n-1 points', size = 12)
show(p)

Backward = (f(xn[1:]) - f(xn[0:-1]))/h
Table = pd.DataFrame({'xi':xn[1:], 'fxi': f(xn[1:]), 'der fxi': f1(xn[1:]), 'backward der. fxi': Backward})
display(Table.round(6))
The last n-1 points: 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0
xi fxi der fxi backward der. fxi
0 0.1 -0.563619 4.741902 4.363805
1 0.2 -0.048556 5.575786 5.150639
2 0.3 0.555057 6.515306 6.036122
3 0.4 1.258547 7.576143 7.034903
4 0.5 2.074925 8.776254 8.163779
5 0.6 3.019121 10.136148 9.441958
6 0.7 4.108256 11.679199 10.891355
7 0.8 5.361948 13.431984 12.536914
8 0.9 6.802645 15.424675 14.406977
9 1.0 8.456014 17.691464 16.533688
(c): Central Difference approximation of the 1st derivative at \(\{0.1, 0.2, \ldots,0.8,0.9\}\)

xn = np.arange(a, b + h, h)
print('The middle points: %s' % ', '.join([str(round(x,1)) for x in xn[1:-1]]))

p = hd.derivative_method_plot(f, a, b, h, labels = ['f(x)', '(xn, f(xn))',])
p.scatter(xn[1:-1], f(xn[1:-1]), color = 'Lime', fill_alpha=0.4, legend_label = 'The middle points', size = 12)
show(p)

Central = (f(xn[2:]) - f(xn[:-2]))/(2*h)
Table = pd.DataFrame({'xi':xn[1:-1], 'fxi': f(xn[1:-1]), 'der fxi': f1(xn[1:-1]), 'central der. fxi': Central})
display(Table.round(6))
The middle points: 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9
xi fxi der fxi central der. fxi
0 0.1 -0.563619 4.741902 4.757222
1 0.2 -0.048556 5.575786 5.593381
2 0.3 0.555057 6.515306 6.535513
3 0.4 1.258547 7.576143 7.599341
4 0.5 2.074925 8.776254 8.802869
5 0.6 3.019121 10.136148 10.166657
6 0.7 4.108256 11.679199 11.714135
7 0.8 5.361948 13.431984 13.471946
8 0.9 6.802645 15.424675 15.470333

4.1.2. Order of Convergence#


The numerical solution \(u_h\) is considered to be nth-order accurate if the error

(4.11)#\[\begin{equation} E(h)=\|u-u_h\| \end{equation}\]

is proportional to the step-size \(h\) to the nth power [Ciarlet, 2002].

Using big \(\mathcal{O}\) notation, an nth-order accurate numerical method is written as

\[\begin{equation*} \|u-u_{h}\|=\mathcal{O}(h^{n}) \end{equation*}\]

Let \(f_{i+j}\) be \(f(x_i+ jh)\) for \(j \in \mathbb{Z}\). Using Taylor’s theorem, we have,

(4.12)#\[\begin{align} f_{i+1} & = f_{i} + hf_{i}' + \frac{1}{2} h^2 f_{i}'' + \frac{1}{6} h^3 f_{i}^{(3)} + \mathcal{O}(h^4),\\ f_{i-1} & = f_{i} - hf_{i}' + \frac{1}{2} h^2 f_{i}'' - \frac{1}{6} h^3 f_{i}^{(3)} + \mathcal{O}(h^4). \end{align}\]

Therefore,

  • Forward difference: We have,

(4.13)#\[\begin{align} \frac {1}{h}\Delta _{h}[f](x_{i}) &= \frac {1}{h}\left(f(x_{i}+h)-f(x_{i})\right) = \frac {1}{h}\left(f_{i+1}-f_{i}\right)=\frac {1}{h}\left(hf_{i}' + \frac{1}{2} h^2 f_{i}'' + \frac{1}{6} h^3 f_{i}^{(3)} + \mathcal{O}(h^4) \right) \notag \\ &=f_{i}' + \frac{1}{2} h f_{i}'' + \frac{1}{6} h^2 f_{i}^{(3)} + \mathcal{O}(h^3) \end{align}\]

and

(4.14)#\[\begin{equation} \left(\frac {\Delta _{h}[f](x_i)}{h} - f'(x_i) \right) = {\frac{1}{2} {h} f_{i}''} + \frac{1}{6} h^2 f_{i}^{(3)} + \mathcal{O}(h^3) \end{equation}\]

Thus,

(4.15)#\[\begin{equation} \left|\frac {\Delta _{h}[f](x_i)}{h} - f'(x_i) \right| = \mathcal{O}(h). \end{equation}\]

This shows that \(\dfrac {\Delta _{h}[f](x_i)}{h}\) is a first-order approximation of \(f'(x)\).

  • Backward difference: Similarly,

(4.16)#\[\begin{equation} \left(\frac {\nabla _{h}[f](x_i)}{h} - f'(x_i) \right) = -{\frac{1}{2} {h} f_{i}''} + \frac{1}{6} h^2 f_{i}^{(3)} + \mathcal{O}(h^3) \end{equation}\]

Therefore,

(4.17)#\[\begin{equation} \left|\frac {\nabla _{h}[f](x_i)}{h} - f'(x_i) \right| = \mathcal{O}(h). \end{equation}\]

This shows that \(\dfrac {\nabla _{h}[f](x_i)}{h}\) is a first-order approximation of \(f'(x)\).

  • Central difference: Also,

(4.18)#\[\begin{equation} \left(\frac {\delta _{h}[f](x_i)}{h} - f'(x_i) \right) = {\frac{1}{6} {h^2} f_{i}^{(3)}} + \mathcal{O}(h^3) \end{equation}\]

Thus,

(4.19)#\[\begin{equation} \left|\frac {\delta _{h}[f](x_i)}{h} - f'(x_i) \right| = \mathcal{O}(h^2). \end{equation}\]

This shows that \(\dfrac {\delta _{h}[f](x_i)}{h}\) is a second-order approximation of \(f'(x)\).
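
These leading error terms can also be checked symbolically. The following sketch uses SymPy (an assumption; SymPy is not used elsewhere in this section) with \(f(x)=\exp(x)\) as a concrete test function, and recovers the leading terms \(\tfrac{1}{2}h\,f''\), \(-\tfrac{1}{2}h\,f''\), and \(\tfrac{1}{6}h^{2}f^{(3)}\) derived above:

import sympy as sp

x, h = sp.symbols('x h')
f = sp.exp(x)   # assumed concrete test function; any smooth f would do

approx = {
    'forward':  (f.subs(x, x + h) - f) / h,
    'backward': (f - f.subs(x, x - h)) / h,
    'central':  (f.subs(x, x + h) - f.subs(x, x - h)) / (2*h),
}
for name, expr in approx.items():
    # expand the error (difference quotient minus exact derivative) about h = 0
    err = sp.series(expr - sp.diff(f, x), h, 0, 3)
    print(name, ':', sp.simplify(err.removeO()))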

4.1.2.1. A practical method for showing the order of convergence numerically#

Let \(e_{\text{new}}\) and \(e_{\text{old}}\) denote the absolute errors corresponding to the step sizes \(h_{\text{new}}\) and \(h_{\text{old}}\). To estimate the order of convergence of a numerical method, we pick step sizes \(h_{\text{new}}\) and \(h_{\text{old}}\) and compute the resulting errors \(e_{\text{new}}\) and \(e_{\text{old}}\). The order of convergence can then be estimated by the following formula [Senning, 2007]:

(4.20)#\[\begin{equation} \alpha \approx \frac {\log(e_{\text{old}}/e_{\text{new}})}{\log(h_{\text{old}}/h_{\text{new}})}. \end{equation}\]

Observe that if \(h_{\text{new}} = \dfrac{1}{2} h_{\text{old}}\), then

(4.21)#\[\begin{equation} \alpha \approx \frac {\log(e_{\text{old}} / e_{\text{new}})}{\log(2)} = \log_{2} (e_{\text{old}}/e_{\text{new}}). \end{equation}\]
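
For example, the following minimal sketch estimates \(\alpha\) for the forward difference applied to \(f(x)=\sin(x)\) (an assumed test function, not one of the errors computed above), using \(h_{\text{new}} = h_{\text{old}}/2\):

import numpy as np

f, df = np.sin, np.cos             # assumed test function and its exact derivative
x = 1.0
h_old, h_new = 0.1, 0.05           # h_new = h_old / 2

e_old = abs((f(x + h_old) - f(x)) / h_old - df(x))
e_new = abs((f(x + h_new) - f(x)) / h_new - df(x))

# estimated order of convergence; here log(h_old/h_new) = log(2)
alpha = np.log(e_old / e_new) / np.log(h_old / h_new)
print('alpha = %.3f' % alpha)      # close to 1 for the forward difference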

We now determine the numerical order of convergence of the forward, backward, and central difference approximations of the first derivative, using the maximum absolute errors over the grid points:

  • Forward difference: \(E_{h}^{F} = \max_{i}\left|\dfrac {\Delta _{h}[f](x_i)}{h} - f'(x_i) \right|,\)

  • Backward difference: \(E_{h}^{B} = \max_{i}\left|\dfrac {\nabla _{h}[f](x_i)}{h} - f'(x_i) \right|,\)

  • Central difference: \(E_{h}^{C} = \max_{i}\left|\dfrac {\delta _{h}[f](x_i)}{h} - f'(x_i) \right|.\)

It then follows that,

\[\begin{equation*} \begin{cases} \dfrac{E_{h}^{F}}{E_{h/2}^{F}} \approx 2^1,\\ \dfrac{E_{h}^{B}}{E_{h/2}^{B}} \approx 2^1,\\ \dfrac{E_{h}^{C}}{E_{h/2}^{C}} \approx 2^2 \end{cases} \quad \Rightarrow \quad \begin{cases} \alpha_{\text{F}} = \log_{2}\left(\dfrac{E_{h}^{F}}{E_{h/2}^{F}} \right),\\ \alpha_{\text{B}} = \log_{2}\left(\dfrac{E_{h}^{B}}{E_{h/2}^{B}} \right),\\ \alpha_{\text{C}} = \log_{2}\left(\dfrac{E_{h}^{C}}{E_{h/2}^{C}} \right). \end{cases} \end{equation*}\]

From the theoretical part, we expect to see \(\alpha_{\text{F}} \approx 1\), \(\alpha_{\text{B}} \approx 1\), and \(\alpha_{\text{C}} \approx 2\).

Example: For the above example, test the order of convergence of the forward difference, the backward difference, and the central difference numerically.

Solution: We compute the errors for a sequence of step sizes and collect them in a table of the form shown below. Because of the nature of this question, it is simpler to carry out the computations in Python, MATLAB, or another programming language.

import numpy as np
import pandas as pd
# f(x) and f'(x)
f = lambda x: 3*x*np.exp(x) - np.cos(x) + np.sin(x)
f1=lambda x: 3*np.exp(x) + 3*x*np.exp(x) + np.sin(x) + np.cos(x)
a = 0; b = 1
# step sizes h = 2^(-3), 2^(-4), ..., 2^(-13)
h = [2**(-i) for i in range(3, 14)]
Cols = ['h', 'EhF', 'EhB', 'EhC']
Table = pd.DataFrame(np.zeros([len(h), len(Cols)], dtype = float), columns=Cols)
Table[Cols[0]] = h
for i in range(len(h)):
    xn = np.arange(a, b, h[i])
    yn = f(xn)
    Mid = np.arange(1, len(xn)-1)
    # Exact
    Exact = f1(xn[Mid])
    # Forward difference
    Forward = (yn[Mid+1]-yn[Mid])/h[i]
    Table.loc[i, Cols[1]] = max(abs(Forward - Exact))
    # Backward difference
    Backward = (yn[Mid]-yn[Mid-1])/h[i]
    Table.loc[i, Cols[2]] = max(abs(Backward - Exact))
    # Central difference
    Central = (yn[Mid+1]-yn[Mid-1])/(2*h[i])
    Table.loc[i, Cols[3]] = max(abs(Central - Exact))
    del Forward, Backward, Central, Exact, xn, yn, Mid
Table.insert(1, 'N', ((b-a)/Table['h']).astype(int))

display(Table.style.set_properties(subset=['h', 'N'], **{'background-color': 'PaleGreen', 'color': 'Black',
       'border-color': 'DarkGreen'}).format(dict(zip(Table.columns.tolist()[-3:], 3*["{:.4e}"]))))
  h N EhF EhB EhC
0 0.125000 8 1.1556e+00 1.0387e+00 5.8418e-02
1 0.062500 16 6.6022e-01 6.2573e-01 1.7244e-02
2 0.031250 32 3.5300e-01 3.4364e-01 4.6824e-03
3 0.015625 64 1.8253e-01 1.8009e-01 1.2198e-03
4 0.007812 128 9.2813e-02 9.2191e-02 3.1130e-04
5 0.003906 256 4.6799e-02 4.6641e-02 7.8629e-05
6 0.001953 512 2.3498e-02 2.3458e-02 1.9759e-05
7 0.000977 1024 1.1774e-02 1.1764e-02 4.9523e-06
8 0.000488 2048 5.8931e-03 5.8906e-03 1.2397e-06
9 0.000244 4096 2.9481e-03 2.9475e-03 3.1012e-07
10 0.000122 8192 1.4744e-03 1.4743e-03 7.7555e-08
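
The observed orders can also be read off directly from consecutive rows of this table. The short sketch below reuses the Table just computed and the fact that each step size is half of the previous one:

for col in ['EhF', 'EhB', 'EhC']:
    E = Table[col].values
    # one order estimate per successive halving of h: alpha = log2(E(h) / E(h/2))
    print(col, np.round(np.log2(E[:-1] / E[1:]), 3))
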
hd.derivative_ConvergenceOrder(vecs = [Table['EhF'].values, Table['EhB'].values, Table['EhC'].values],
                               labels = ['Forward Difference', 'Backward Difference', 'Central Difference'],
                               xlabel = r"$$i$$",
                               ylabel = r"$$\log_{2} \left( E_{h_{i}} / E_{h_{i-1}}  \right)$$",
                               title = 'Order of accuracy: Forward, Backward and Central Differences',
                               legend_orientation = 'horizontal', ylim = [0, 2.1])