# Richardson extrapolation


In numerical analysis, Richardson extrapolation is a sequence acceleration method used to improve the rate of convergence of a sequence. It is named after Lewis Fry Richardson, who introduced the technique in the early 20th century.

## Simple definition

Suppose that $A(h)$ is an estimate of order $h^{n}$ for $A=\lim _{h\to 0}A(h)$, i.e. $A-A(h)=a_{n}h^{n}+O(h^{m}),~a_{n}\neq 0,~m>n$. Then

$R(h)=A(h/2)+{\frac {A(h/2)-A(h)}{2^{n}-1}}={\frac {2^{n}\,A(h/2)-A(h)}{2^{n}-1}}$

is called the Richardson extrapolate of $A(h)$; it is an estimate of order $h^{m}$ for $A$, with $m>n$.

More generally, the factor 2 can be replaced by any other factor, as shown below.

Very often, it is much easier to obtain a given precision by using $R(h)$ than by using $A(h')$ with a much smaller step $h'$, which can cause problems due to limited precision (rounding errors) and/or the increasing number of calculations needed (see examples below).
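As a concrete illustration, the definition above can be checked numerically. The sketch below uses a forward-difference derivative estimate as $A(h)$ (an assumed example, with $f=\sin$ and $x=1$ chosen for illustration), which has order $n=1$:

```python
import math

# A sketch: forward-difference estimate of f'(x). It is first order:
# A - A(h) = a_1*h + O(h^2), so n = 1 in the formula above.
def A(h, f=math.sin, x=1.0):
    return (f(x + h) - f(x)) / h

def richardson(A, h, n=1):
    # R(h) = (2**n * A(h/2) - A(h)) / (2**n - 1): cancels the h**n error term.
    return (2**n * A(h / 2) - A(h)) / (2**n - 1)

exact = math.cos(1.0)          # true derivative of sin at x = 1
err_plain = abs(A(0.1) - exact)
err_rich = abs(richardson(A, 0.1) - exact)
# err_rich is roughly two orders of magnitude smaller than err_plain here
```

With the same step $h=0.1$, the extrapolate $R(h)$ is far closer to the true value than $A(h)$, without ever evaluating $A$ at a tiny step.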

## Rumus umum

Let A(h) be an approximation of A that depends on a positive step size h with an error formula of the form

$A-A(h)=a_{0}h^{k_{0}}+a_{1}h^{k_{1}}+a_{2}h^{k_{2}}+\cdots$

where the $a_{i}$ are unknown constants and the $k_{i}$ are known constants such that $h^{k_{i}}>h^{k_{i+1}}$.

The exact value sought can be given by

$A=A(h)+a_{0}h^{k_{0}}+a_{1}h^{k_{1}}+a_{2}h^{k_{2}}+\cdots$

which can be simplified with big O notation to

$A=A(h)+a_{0}h^{k_{0}}+O(h^{k_{1}}).$

Using the step sizes $h$ and $h/t$ for some $t$, the two formulas for $A$ are:

$A=A(h)+a_{0}h^{k_{0}}+O(h^{k_{1}})$

$A=A\left({\frac {h}{t}}\right)+a_{0}\left({\frac {h}{t}}\right)^{k_{0}}+O(h^{k_{1}}).$

Multiplying the second equation by $t^{k_{0}}$ and subtracting the first equation gives

$(t^{k_{0}}-1)A=t^{k_{0}}A\left({\frac {h}{t}}\right)-A(h)+O(h^{k_{1}})$

which can be solved for $A$ to give

$A={\frac {t^{k_{0}}A\left({\frac {h}{t}}\right)-A(h)}{t^{k_{0}}-1}}+O(h^{k_{1}}).$

By this process, we have obtained a better approximation of $A$ by subtracting the largest error term, which was $O(h^{k_{0}})$. This process can be repeated to remove more error terms and obtain even better approximations.
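The cancellation of the leading error term can be verified on a synthetic approximation whose error expansion is known exactly. In this sketch the constants `A_true`, `a0`, `a1` are arbitrary illustration values, not from the source:

```python
# Synthetic A(h) with a known error expansion: A - A(h) = a0*h + a1*h**2,
# i.e. k0 = 1, k1 = 2. The constants are arbitrary illustration values.
A_true, a0, a1 = 2.0, 5.0, -3.0

def approx(h):
    return A_true - a0 * h - a1 * h ** 2

def eliminate(approx, h, t, k0):
    # (t**k0 * A(h/t) - A(h)) / (t**k0 - 1): the a0*h**k0 term cancels exactly.
    return (t ** k0 * approx(h / t) - approx(h)) / (t ** k0 - 1)

h, t = 0.1, 3
improved = eliminate(approx, h, t, k0=1)
# The a0*h term is gone; what remains is the next term, scaled:
# A_true - improved = -a1 * h**2 / t  (= 0.01 for these constants)
residual = A_true - improved
```

Working out the algebra for this model gives $A - \text{improved} = -a_1 h^2/t$: the $h^{k_0}$ term has been removed and only the higher-order term survives, exactly as the general formula predicts.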

A general recurrence relation can be defined for the approximations by

$A_{i+1}(h)={\frac {t^{k_{i}}A_{i}\left({\frac {h}{t}}\right)-A_{i}(h)}{t^{k_{i}}-1}}$

such that

$A=A_{i+1}(h)+O(h^{k_{i+1}}).$

A well-known practical use of Richardson extrapolation is Romberg integration, which applies Richardson extrapolation to the trapezium rule.
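A minimal sketch of Romberg integration along these lines, applying the $t=2$ recurrence to the trapezium rule (the trapezium error expansion is in even powers of $h$, so the factors become $4^{k}$; the test integrand $\sin$ over $[0,\pi]$ is an assumed example):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezium rule with n subintervals.
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def romberg(f, a, b, levels):
    # Richardson extrapolation of trapezium estimates with halved steps.
    # The trapezium error expansion is in h**2, h**4, ..., hence 4**k - 1.
    R = [[trapezoid(f, a, b, 2 ** j)] for j in range(levels)]
    for j in range(1, levels):
        for k in range(1, j + 1):
            R[j].append(R[j][k - 1] + (R[j][k - 1] - R[j - 1][k - 1]) / (4 ** k - 1))
    return R[-1][-1]

# Integral of sin over [0, pi] is exactly 2; five levels use at most 16 subintervals.
estimate = romberg(math.sin, 0.0, math.pi, levels=5)
```

Even though the finest trapezium rule here uses only 16 subintervals, the extrapolated value agrees with the exact integral to better than $10^{-7}$.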

Note that Richardson extrapolation can be considered a linear sequence transformation.

## Example

Using Taylor's theorem,

$f(x+h)=f(x)+f'(x)h+{\frac {f''(x)}{2}}h^{2}+\cdots$

so the derivative of $f(x)$ is given by

$f'(x)={\frac {f(x+h)-f(x)}{h}}-{\frac {f''(x)}{2}}h+\cdots .$

If the initial approximations of the derivative are chosen to be

$A_{0}(h)={\frac {f(x+h)-f(x)}{h}}$

then $k_{i}=i+1$.
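Iterating the general recurrence on this $A_{0}$ can be sketched as follows (taking $f=\exp$ at $x=1$ as an assumed test function, so the exact derivative is $e$):

```python
import math

f, x = math.exp, 1.0
exact = math.e  # derivative of exp at x = 1

def A0(h):
    # Forward difference; its error expansion has k_i = i + 1.
    return (f(x + h) - f(x)) / h

def extrapolate(h, levels, t=2.0):
    # Column sweep of A_{i+1}(h) = (t**k_i * A_i(h/t) - A_i(h)) / (t**k_i - 1),
    # with k_i = i + 1, so the factors are t**1, t**2, ...
    col = [A0(h / t ** j) for j in range(levels)]
    for i in range(1, levels):
        col = [(t ** i * col[j + 1] - col[j]) / (t ** i - 1)
               for j in range(len(col) - 1)]
    return col[0]  # highest-order extrapolate

errs = [abs(extrapolate(0.5, L) - exact) for L in (1, 2, 3)]
# Each extra level raises the order of accuracy, so the errors shrink fast.
```

Each added level of extrapolation raises the order of the estimate by one, which the shrinking errors in `errs` reflect.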

For t = 2, the first formula extrapolated for A would be

$A=2A_{0}\left({\frac {h}{2}}\right)-A_{0}(h)+O(h^{2}).$

For the new approximation

$A_{1}(h)=2A_{0}\left({\frac {h}{2}}\right)-A_{0}(h)$

we can extrapolate again to obtain

$A={\frac {4A_{1}\left({\frac {h}{2}}\right)-A_{1}(h)}{3}}+O(h^{3}).$

## References

• Extrapolation Methods: Theory and Practice by C. Brezinski and M. Redivo Zaglia, North-Holland, 1991.