Section 7.3 How good can a polynomial approximation be?
Theorem 7.3.1.
Given points \(x_0,\dots,x_{n}\) in \([a,b]\) and a smooth function \(f\text{,}\) the polynomial \(p(x)\) of degree at most \(n\) interpolating \(f\) at the points \(x_i\) satisfies the inequality
\begin{equation*}
\max_{x\in[a,b]}|f(x)-p(x)|\leq\frac{\displaystyle\max_{x\in[a,b]}\left|\prod_{i=0}^{n}(x-x_i)\right|}{(n+1)!}
\max_{x\in[a,b]}|f^{(n+1)}(x)|.
\end{equation*}
The error is therefore controlled by two factors:
- how large the derivative \(f^{(n+1)}(x)\) (the rate of change of \(f^{(n)}(x)\)) is on \([a,b]\text{;}\)
- the position of the interpolation points.
Since \(|x-x_i|\leq b-a\) for every \(i\text{,}\) the product in the numerator is at most \((b-a)^{n+1}\text{,}\) which gives the cruder but simpler bound
\begin{equation*}
\max_{x\in[a,b]}|f(x)-p(x)|\leq\frac{(b-a)^{n+1}}{(n+1)!}\displaystyle\max_{x\in[a,b]}|f^{(n+1)}(x)|.
\end{equation*}
When the derivatives of \(f\) do not grow much (or at all!) with \(n\text{,}\) the simplified formula above is enough to give good upper bounds for the error. For instance, take \(f(x)=\sin(x)\) and any set of \(n+1\) nodes in an interval \([a,b]\text{.}\) Then, since every derivative of \(\sin(x)\) is bounded by \(1\) in absolute value,
\begin{equation*}
\max_{x\in[a,b]}|\sin(x)-p(x)|\leq\frac{(b-a)^{n+1}}{(n+1)!}.
\end{equation*}
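This bound is easy to verify numerically. The sketch below, a minimal check assuming NumPy is available, interpolates \(\sin(x)\) at \(n+1=7\) equidistant points on \([0,\pi]\) and compares the measured error with the bound:

```python
import numpy as np
from math import factorial

a, b, n = 0.0, np.pi, 6                 # n + 1 = 7 interpolation points
nodes = np.linspace(a, b, n + 1)

# With exactly n + 1 points, a degree-n polyfit is the Lagrange interpolant.
coeffs = np.polyfit(nodes, np.sin(nodes), n)

xs = np.linspace(a, b, 2000)
error = np.max(np.abs(np.sin(xs) - np.polyval(coeffs, xs)))
bound = (b - a) ** (n + 1) / factorial(n + 1)

print(f"measured error = {error:.2e} <= bound = {bound:.2e}")
```

The measured error is far below the bound, since the bound replaces the node product by the crude estimate \((b-a)^{n+1}\text{.}\)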
Subsection 7.3.1 Interpolation at equidistant points
Theorem 7.3.2.
Assume that
\begin{equation*}
x_0=a, x_1=a+h, \dots, x_{n}=a+nh=b,
\end{equation*}
where \(h=\displaystyle\frac{b-a}{n}\text{.}\) Then
\begin{equation*}
\max_{x\in[a,b]}\left|\prod_{i=0}^{n}(x-x_i)\right|\leq\frac{n!}{4}h^{n+1}.
\end{equation*}
Combined with Theorem 7.3.1, this gives
\begin{equation*}
\max_{x\in[a,b]}|f(x)-p(x)|\leq\frac{h^{n+1}}{4(n+1)}\max_{x\in[a,b]}|f^{(n+1)}(x)|.
\end{equation*}
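The estimate of Theorem 7.3.2 can likewise be checked on a grid; a minimal sketch, assuming NumPy, with \(n=8\) on \([0,1]\):

```python
import numpy as np
from math import factorial

a, b, n = 0.0, 1.0, 8
h = (b - a) / n
nodes = a + h * np.arange(n + 1)        # x_i = a + i h, i = 0, ..., n

xs = np.linspace(a, b, 20001)
# max over the grid of |(x - x_0)(x - x_1)...(x - x_n)|
prod_max = np.max(np.abs(np.prod(xs[:, None] - nodes[None, :], axis=1)))
bound = factorial(n) * h ** (n + 1) / 4

print(f"max of |node polynomial| = {prod_max:.3e} <= {bound:.3e}")
```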
Example: Lagrange interpolation of \(\sin x\) (equidistant points).

Subsection 7.3.2 The Runge phenomenon
From the error formula above, it might seem enough to increase \(n\) to approximate \(f(x)\) arbitrarily well by Lagrange polynomials based at equidistant points. Unfortunately, this is not so. Indeed, there are smooth functions \(f(x)\) such that, as \(n\to\infty\text{,}\) the derivative \(f^{(n+1)}(x)\) at some point diverges faster than exponentially. Hence, the error bound above diverges as \(n\to\infty\text{,}\) that is, we lose control of the accuracy of the interpolation. An example of such a function, discovered by Runge, is
\begin{equation*}
f(x)=\displaystyle\frac{1}{1+x^2}.
\end{equation*}
In this case, indeed, \(|f^{(2n)}(0)|=(2n)!\text{,}\) i.e. the even-order derivatives at \(x=0\) grow with \(n\) with factorial speed.
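The divergence can be observed directly. The sketch below, assuming NumPy, evaluates the Lagrange interpolant of Runge's function at equidistant nodes on \([-5,5]\) through the barycentric formula (a numerically stable rewriting of the Lagrange form) and shows the maximum error growing with \(n\):

```python
import numpy as np

def lagrange_eval(nodes, fvals, xs):
    """Evaluate the Lagrange interpolant at the points xs (barycentric formula)."""
    w = np.array([1.0 / np.prod(nodes[j] - np.delete(nodes, j))
                  for j in range(len(nodes))])
    diff = xs[:, None] - nodes[None, :]
    exact = np.isclose(diff, 0.0)       # grid points that hit a node
    diff[exact] = 1.0                   # placeholder, overwritten below
    terms = w / diff
    p = (terms * fvals).sum(axis=1) / terms.sum(axis=1)
    rows, cols = np.nonzero(exact)
    p[rows] = fvals[cols]               # at a node, the interpolant equals f
    return p

f = lambda x: 1.0 / (1.0 + x ** 2)      # Runge's function
xs = np.linspace(-5.0, 5.0, 2001)

errors = {}
for n in (4, 8, 16):
    nodes = np.linspace(-5.0, 5.0, n + 1)
    errors[n] = np.max(np.abs(f(xs) - lagrange_eval(nodes, f(nodes), xs)))
print(errors)                           # the error grows instead of shrinking
```

The largest errors occur near the endpoints of the interval, which is the typical signature of the Runge phenomenon.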
Example: Lagrange interpolation of \(\frac{1}{1+9x^2}\) (equidistant points).

Subsection 7.3.3 Optimal choice for the interpolating points
One way to reduce the interpolation error is to find a set of nodes \(\{x_i\}\) that minimizes
\begin{equation*}
\max_{x\in[a,b]}\left|\prod_{i=1}^{n+1}(x-x_i)\right|.
\end{equation*}
A set that accomplishes this is the set of Chebyshev points
\begin{equation*}
x^*_k = \frac{a+b}{2}+\frac{b-a}{2}\cos\left(\frac{2k-1}{2(n+1)}\pi\right),\quad k=1,\dots,n+1.
\end{equation*}
For this set of points,
\begin{equation*}
\max_{x\in[a,b]}\left|\prod_{i=1}^{n+1}(x-x^*_i)\right|\leq2\left(\frac{b-a}{4}\right)^{n+1},
\end{equation*}
so that
\begin{equation*}
\max_{x\in[a,b]}|f(x)-p(x)|\leq2\frac{((b-a)/4)^{n+1}}{(n+1)!}\max_{x\in[a,b]}|f^{(n+1)}(x)|.
\end{equation*}
The exceptionally nice behavior of this set of points with respect to others, such as equidistant points, can be seen by simply plotting the graphs of the corresponding node polynomials. For instance, the code below produces the graph of the polynomial
\begin{equation*}
(x-x_1)\cdot(x-x_2)\cdot\dots\cdot(x-x_{10})
\end{equation*}
in the case of equidistant points and in the case of Chebyshev points:
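A possible version of that code (a sketch, assuming NumPy and, optionally, matplotlib for the plot itself), with the \(10\) nodes taken in \([-1,1]\text{:}\)

```python
import numpy as np

a, b, n = -1.0, 1.0, 9                  # n + 1 = 10 nodes
xs = np.linspace(a, b, 2001)

equi = np.linspace(a, b, n + 1)         # equidistant points
k = np.arange(1, n + 2)                 # Chebyshev points
cheb = (a + b) / 2 + (b - a) / 2 * np.cos((2 * k - 1) / (2 * (n + 1)) * np.pi)

def node_poly(nodes, xs):
    """(x - x_1)(x - x_2)...(x - x_{10}) evaluated on the grid xs."""
    return np.prod(xs[:, None] - nodes[None, :], axis=1)

y_equi, y_cheb = node_poly(equi, xs), node_poly(cheb, xs)
equi_max = np.max(np.abs(y_equi))
cheb_max = np.max(np.abs(y_cheb))
print(f"equidistant max: {equi_max:.3e}, Chebyshev max: {cheb_max:.3e}")

try:                                    # draw the two graphs if matplotlib is present
    import matplotlib.pyplot as plt
    plt.plot(xs, y_equi, label="equidistant")
    plt.plot(xs, y_cheb, label="Chebyshev")
    plt.legend()
    plt.savefig("node_polynomials.png")
except ImportError:
    pass
```

The equidistant node polynomial oscillates with large amplitude near the endpoints, while the Chebyshev one equioscillates with the much smaller amplitude \(2((b-a)/4)^{n+1}\text{.}\)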
Example: Lagrange interpolation of \(\frac{1}{1+9x^2}\) (Chebyshev points).
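The contrast with Subsection 7.3.2 can be checked numerically as well; the sketch below, assuming NumPy and reusing the same barycentric evaluation as before, interpolates \(1/(1+9x^2)\) at Chebyshev points on \([-1,1]\) and shows the error now decreasing as \(n\) grows:

```python
import numpy as np

def lagrange_eval(nodes, fvals, xs):
    """Evaluate the Lagrange interpolant at the points xs (barycentric formula)."""
    w = np.array([1.0 / np.prod(nodes[j] - np.delete(nodes, j))
                  for j in range(len(nodes))])
    diff = xs[:, None] - nodes[None, :]
    exact = np.isclose(diff, 0.0)
    diff[exact] = 1.0
    terms = w / diff
    p = (terms * fvals).sum(axis=1) / terms.sum(axis=1)
    rows, cols = np.nonzero(exact)
    p[rows] = fvals[cols]
    return p

f = lambda x: 1.0 / (1.0 + 9.0 * x ** 2)
xs = np.linspace(-1.0, 1.0, 2001)

errs = {}
for n in (8, 16, 32):
    k = np.arange(1, n + 2)
    nodes = np.cos((2 * k - 1) / (2 * (n + 1)) * np.pi)  # Chebyshev points on [-1,1]
    errs[n] = np.max(np.abs(f(xs) - lagrange_eval(nodes, f(nodes), xs)))
print(errs)                              # the error now decreases with n
```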