Numerical Integration
∫_a^b f(x)dx ≈ ∫_a^b f_n(x)dx

The trapezoidal rule approximates the integral as the area under the straight line connecting f(a) and f(b). An estimate for the local truncation error of a single application of the trapezoidal rule can be obtained using Taylor series as

E_t = -(1/12) f''(ξ)(b - a)^3

where ξ is a value between a and b.
Example: Use the trapezoidal rule to numerically integrate
f (x) = 0.2 + 25x
from a = 0 to b = 2.
Solution: f (a) = f (0) = 0.2, and f (b) = f (2) = 50.2.
I = (b - a) × [f(a) + f(b)]/2 = (2 - 0) × (0.2 + 50.2)/2 = 50.4

The true solution is

∫_0^2 f(x)dx = (0.2x + 12.5x^2)|_0^2 = (0.2 × 2 + 12.5 × 2^2) - 0 = 50.4
Because f(x) is a linear function, the trapezoidal rule gives the exact solution.
Example: Use the trapezoidal rule to numerically integrate

f(x) = 0.2 + 25x + 3x^2

from a = 0 to b = 2.
Solution: f(0) = 0.2, and f(2) = 62.2.

I = (b - a) × [f(a) + f(b)]/2 = (2 - 0) × (0.2 + 62.2)/2 = 62.4

The true solution is

∫_0^2 f(x)dx = (0.2x + 12.5x^2 + x^3)|_0^2 = (0.2 × 2 + 12.5 × 2^2 + 2^3) - 0 = 58.4

The relative error is

|ϵ_t| = |(58.4 - 62.4)/58.4| × 100% = 6.85%
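The single-application rule above is easy to check in code. Below is a minimal sketch (the helper name `trapezoid` is ours, not from the notes) that reproduces the estimate 62.4 and the 6.85% relative error:

```python
def trapezoid(f, a, b):
    """Single-application trapezoidal rule: area under the chord from f(a) to f(b)."""
    return (b - a) * (f(a) + f(b)) / 2.0

f = lambda x: 0.2 + 25 * x + 3 * x ** 2
I = trapezoid(f, 0.0, 2.0)                  # 62.4
exact = 58.4                                # from the antiderivative above
rel_err = abs((exact - I) / exact) * 100    # about 6.85%
```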
Multiple-application trapezoidal rule:
Using smaller integration intervals can reduce the approximation error. We can divide the integration interval from a to b into a number of segments and apply the trapezoidal rule to each segment. Divide (a, b) into n segments of equal width. Then

I = ∫_a^b f(x)dx = ∫_{x_0}^{x_1} f(x)dx + ∫_{x_1}^{x_2} f(x)dx + ... + ∫_{x_{n-1}}^{x_n} f(x)dx

where a = x_0 < x_1 < ... < x_n = b, and x_i - x_{i-1} = h = (b - a)/n, for i = 1, 2, ..., n.
Substituting the trapezoidal rule for each integral yields

I ≈ h [f(x_0) + f(x_1)]/2 + h [f(x_1) + f(x_2)]/2 + ... + h [f(x_{n-1}) + f(x_n)]/2

  = (h/2) [f(x_0) + 2 Σ_{i=1}^{n-1} f(x_i) + f(x_n)]

  = (b - a) [f(x_0) + 2 Σ_{i=1}^{n-1} f(x_i) + f(x_n)] / (2n)
The approximation error of the multiple-application trapezoidal rule is the sum of the individual errors, i.e.,

E_t = -Σ_{i=1}^n (h^3/12) f''(ξ_i) = -Σ_{i=1}^n ((b - a)^3/(12n^3)) f''(ξ_i)

Let f̄'' = (Σ_{i=1}^n f''(ξ_i))/n. Then the approximate error is

E_t = -((b - a)^3/(12n^2)) f̄''
For the previous example, applying the trapezoidal rule with n = 2 segments gives I = (2 - 0) × [f(0) + 2f(1) + f(2)]/(2 × 2) = 59.4, and the relative error is

|ϵ_t| = |(58.4 - 59.4)/58.4| × 100% = 1.71%
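A sketch of the multiple-application rule (the helper name `composite_trapezoid` is ours, not from the notes); with n = 2 it reproduces the two-segment estimate of 59.4 for the previous example:

```python
def composite_trapezoid(f, a, b, n):
    """Multiple-application trapezoidal rule over n equal-width segments."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):            # interior points are weighted by 2
        total += 2 * f(a + i * h)
    return (b - a) * total / (2 * n)

f = lambda x: 0.2 + 25 * x + 3 * x ** 2
I2 = composite_trapezoid(f, 0.0, 2.0, 2)    # 59.4
```

With n = 1 the function reduces to the single-application rule (62.4), so the refinement from 6.85% to 1.71% error comes entirely from halving the segment width.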
2 Simpson’s Rules
Aside from using the trapezoidal rule with finer segmentation, another way to
improve the estimation accuracy is to use higher order polynomials.
Figure 2: Illustration of (a) Simpson’s 1/3 rule, and (b) Simpson’s 3/8 rule
Given three points (x_0, f(x_0)), (x_1, f(x_1)), and (x_2, f(x_2)), we can estimate f(x) using Lagrange polynomial interpolation. Then

I = ∫_a^b f(x)dx ≈ ∫_{x_0}^{x_2} [ f(x_0) (x - x_1)(x - x_2)/((x_0 - x_1)(x_0 - x_2)) + f(x_1) (x - x_0)(x - x_2)/((x_1 - x_0)(x_1 - x_2)) + f(x_2) (x - x_0)(x - x_1)/((x_2 - x_0)(x_2 - x_1)) ] dx

When a = x_0, b = x_2, (a + b)/2 = x_1, and h = (b - a)/2,

I ≈ (h/3) [f(x_0) + 4f(x_1) + f(x_2)] = (b - a) [f(x_0) + 4f(x_1) + f(x_2)]/6
It can be proved that a single-segment application of Simpson's 1/3 rule has a truncation error of

E_t = -(1/90) h^5 f^(4)(ξ)

where ξ is between a and b.
Simpson's 1/3 rule yields exact results for third-order polynomials even though it is derived from a parabola.
Example: Use Simpson's 1/3 rule to integrate

f(x) = 0.2 + 25x + 3x^2 + 8x^3

from a = 0 to b = 2.
Solution: f(0) = 0.2, f(1) = 36.2, and f(2) = 126.2.

I = (b - a) × [f(0) + 4f(1) + f(2)]/6 = 2 × (0.2 + 4 × 36.2 + 126.2)/6 = 90.4

The exact integral is

∫_0^2 f(x)dx = (0.2x + 12.5x^2 + x^3 + 2x^4)|_0^2 = (0.2 × 2 + 12.5 × 2^2 + 2^3 + 2 × 2^4) - 0 = 90.4
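A minimal sketch of single-application Simpson's 1/3 rule (the helper name is ours); as the example shows, it is exact for this cubic:

```python
def simpson13(f, a, b):
    """Single-application Simpson's 1/3 rule: endpoints plus the midpoint."""
    h = (b - a) / 2.0
    return (b - a) * (f(a) + 4 * f(a + h) + f(b)) / 6.0

f = lambda x: 0.2 + 25 * x + 3 * x ** 2 + 8 * x ** 3
I = simpson13(f, 0.0, 2.0)    # 90.4, matching the exact integral
```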
Example: Use Simpson's 1/3 rule to integrate

f(x) = 0.2 + 25x + 3x^2 + 2x^4

from a = 0 to b = 2.
Solution: f(0) = 0.2, f(1) = 30.2, and f(2) = 94.2.

I = (b - a) × [f(0) + 4f(1) + f(2)]/6 = 2 × (0.2 + 4 × 30.2 + 94.2)/6 = 71.73

The exact integral is

∫_0^2 f(x)dx = (0.2x + 12.5x^2 + x^3 + 0.4x^5)|_0^2 = (0.2 × 2 + 12.5 × 2^2 + 2^3 + 0.4 × 2^5) - 0 = 71.2

The relative error is

|ϵ_t| = |(71.2 - 71.73)/71.2| × 100% = 0.7%
Multiple-application Simpson's 1/3 rule
Dividing the integration interval into n segments of equal width (n even), we have

I = ∫_{x_0}^{x_2} f(x)dx + ∫_{x_2}^{x_4} f(x)dx + ... + ∫_{x_{n-2}}^{x_n} f(x)dx
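The composite form can be sketched as follows (the odd/even weighting of interior points is the standard composite Simpson's 1/3 pattern; the helper name is ours). Applied with n = 4 to the quartic from the earlier example, it gives about 71.23 versus the exact 71.2:

```python
def composite_simpson13(f, a, b, n):
    """Multiple-application Simpson's 1/3 rule; n must be even."""
    if n % 2 != 0:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        # odd-indexed interior points weight 4, even-indexed weight 2
        total += (4 if i % 2 == 1 else 2) * f(a + i * h)
    return h * total / 3.0

f = lambda x: 0.2 + 25 * x + 3 * x ** 2 + 2 * x ** 4
I4 = composite_simpson13(f, 0.0, 2.0, 4)    # about 71.233; exact value is 71.2
```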
  x = x + h
  sum = sum + 2*f(x)
end
sum = sum + f(b)
TraEq = (b - a)*sum/(2*n)
Pseudocode 1 can be used when only a limited number of points are given or the function is available. Pseudocode 2 is for the case where the analytical function is available. The difference between the two is that in Pseudocode 2 neither the independent nor the dependent variable values are passed into the function through its arguments, as they are in Pseudocode 1. When the analytical function is available, the function values are computed through calls to the function being analyzed, f(x).
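For the Pseudocode 1 case, where only tabulated points are given, a sketch that works from an array of values instead of a callable (assuming equally spaced points; the name is ours):

```python
def trapezoid_from_points(y, h):
    """Composite trapezoidal rule from tabulated values y at uniform spacing h."""
    return h * (y[0] + 2 * sum(y[1:-1]) + y[-1]) / 2.0

# tabulated values of f(x) = 0.2 + 25x + 3x^2 at x = 0, 1, 2
y = [0.2, 28.2, 62.2]
I = trapezoid_from_points(y, 1.0)    # 59.4
```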
4 Romberg Integration
Romberg integration is a technique that improves the results of numerical integration by applying error correction.
Richardson’s extrapolation uses two estimates of an integral to compute a third,
more accurate approximation.
The trapezoidal-rule estimate and its error can be represented as

I = I(h) + E(h)

where I is the exact value of the integral, I(h) is the approximation from an n-segment application of the trapezoidal rule with step size h = (b - a)/n, and E(h) is the truncation error. If we make two separate estimates using step sizes of h_1 and h_2 and have exact values for the error, then

I(h_1) + E(h_1) = I(h_2) + E(h_2)    (1)
The error of the multiple-application trapezoidal rule can be represented approximately as

E(h) ≈ -((b - a)/12) h^2 f̄''

where h = (b - a)/n. Assuming that f̄'' is constant regardless of step size, we have

E(h_1)/E(h_2) ≈ h_1^2/h_2^2

Then we have

E(h_1) ≈ E(h_2) (h_1/h_2)^2

which can be substituted into (1):

I(h_1) + E(h_2) (h_1/h_2)^2 = I(h_2) + E(h_2)
Then E(h_2) can be solved as

E(h_2) = [I(h_2) - I(h_1)] / [(h_1/h_2)^2 - 1]

This estimate can then be substituted into

I = I(h_2) + E(h_2)

to yield an improved estimate of the integral:

I ≈ I(h_2) + [I(h_2) - I(h_1)] / [(h_1/h_2)^2 - 1]

It can be shown that the error of this estimate is O(h^4). Thus, we have combined two trapezoidal rule estimates of O(h^2) to yield a new estimate of O(h^4). For the special case where the interval is halved (h_2 = h_1/2), we have

I ≈ I(h_2) + (1/(2^2 - 1)) [I(h_2) - I(h_1)] = (4/3) I(h_2) - (1/3) I(h_1)
With two improved integrals of O(h^4) on the basis of three trapezoidal rule estimates, we can combine them to yield an even better value with O(h^6). The Romberg integration algorithm has the general form

I_{j,k} ≈ (4^{k-1} I_{j+1,k-1} - I_{j,k-1}) / (4^{k-1} - 1)

where I_{j+1,k-1} and I_{j,k-1} are the more and less accurate integrals, respectively, and I_{j,k} is the improved integral. The index k signifies the level of the integration, where k = 1 corresponds to the original trapezoidal rule estimates, k = 2 corresponds to O(h^4), k = 3 to O(h^6), and so forth.
Example: f(x) = 0.2 + 25x - 200x^2 + 675x^3 - 900x^4 + 400x^5; find ∫_a^b f(x)dx with a = 0, b = 0.8.
Solution:
True solution: I = ∫_0^0.8 f(x)dx = ∫_0^0.8 (0.2 + 25x - 200x^2 + 675x^3 - 900x^4 + 400x^5) dx = 1.640533
n_1 = 1, h_1 = 0.8
f(0) = 0.2, f(0.8) = 0.232

I_{1,1} = ∫_0^0.8 f_1(x)dx = 0.8 × [f(0) + f(0.8)]/2 = 0.1728, ϵ_t = 89.5%

n_2 = 2, h_2 = (b - a)/2 = 0.4 (h_2 = h_1/2)
f(0) = 0.2, f(0.4) = 2.456, f(0.8) = 0.232

I_{2,1} = ∫_0^0.4 f_1(x)dx + ∫_0.4^0.8 f_1(x)dx = 0.4 × [f(0) + f(0.4)]/2 + 0.4 × [f(0.4) + f(0.8)]/2 = 1.0688, ϵ_t = 34.9%

n_3 = 4, h_3 = (b - a)/4 = 0.2 (h_3 = h_2/2)

I_{3,1} = (0.2/2) [f(0) + 2f(0.2) + 2f(0.4) + 2f(0.6) + f(0.8)] = 1.4848, ϵ_t = 9.5%

I_{1,2} = (4 I_{2,1} - I_{1,1})/(4 - 1) = (4/3) × 1.0688 - (1/3) × 0.1728 = 1.367467, ϵ_t = 16.6%

I_{2,2} = (4 I_{3,1} - I_{2,1})/(4 - 1) = (4/3) × 1.4848 - (1/3) × 1.0688 = 1.623467, ϵ_t = 1.0%
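The whole Romberg table for this example can be built with a short sketch (the helper name is ours): row j holds the 2^j-segment trapezoidal estimate, and each extrapolation level applies the I_{j,k} formula:

```python
def romberg(f, a, b, levels):
    """Romberg integration: trapezoidal estimates refined by Richardson extrapolation."""
    I = [[0.0] * levels for _ in range(levels)]
    for j in range(levels):
        n = 2 ** j                          # 1, 2, 4, ... segments
        h = (b - a) / n
        s = f(a) + f(b) + 2 * sum(f(a + i * h) for i in range(1, n))
        I[j][0] = (b - a) * s / (2 * n)     # trapezoidal estimate (k = 1 in the notes)
    for k in range(1, levels):
        c = 4.0 ** k                        # 4^(k-1) in the notes' 1-based indexing
        for j in range(levels - k):
            I[j][k] = (c * I[j + 1][k - 1] - I[j][k - 1]) / (c - 1)
    return I

f = lambda x: 0.2 + 25*x - 200*x**2 + 675*x**3 - 900*x**4 + 400*x**5
T = romberg(f, 0.0, 0.8, 3)
# T[0][0] = 0.1728, T[1][0] = 1.0688, T[2][0] = 1.4848
# T[0][1] = 1.367467, T[1][1] = 1.623467, T[0][2] = 1.640533 (the true value)
```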
PART II: Numerical Differentiation
f'(x_i) ≈ [f(x_i) - f(x_{i-1})]/h

which is called the first backward finite divided difference.
• The first centered finite divided difference

f(x_{i+1}) - f(x_{i-1}) = 2f'(x_i)h + O(h^3)

and f'(x_i) can be found as

f'(x_i) = [f(x_{i+1}) - f(x_{i-1})]/(2h) - O(h^2)

and f'(x_i) can also be approximated as

f'(x_i) ≈ [f(x_{i+1}) - f(x_{i-1})]/(2h)

which is called the first centered finite divided difference.
Notice that the truncation error is of the order of h^2, in contrast to the forward and backward approximations, which are of the order of h. Therefore, the centered difference is a more accurate representation of the derivative.
Graphical depiction of (a) forward, (b) backward, and (c) centered finite-divided-difference approximations of the first derivative
Example: estimate the first derivative of f(x) = -0.1x^4 - 0.15x^3 - 0.5x^2 - 0.25x + 1.2 at x_i = 0.5. The true value is f'(0.5) = -0.9125.

When h = 0.5, x_{i-1} = x_i - h = 0 and f(x_{i-1}) = 1.2; x_i = 0.5 and f(x_i) = 0.925; x_{i+1} = x_i + h = 1 and f(x_{i+1}) = 0.2.

The forward divided difference:

f'(0.5) ≈ [f(x_{i+1}) - f(x_i)]/(x_{i+1} - x_i) = (0.2 - 0.925)/0.5 = -1.45

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-1.45))/(-0.9125)| × 100% = 58.9%

The backward divided difference:

f'(0.5) ≈ [f(x_i) - f(x_{i-1})]/(x_i - x_{i-1}) = (0.925 - 1.2)/0.5 = -0.55

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-0.55))/(-0.9125)| × 100% = 39.7%

The centered divided difference:

f'(0.5) ≈ [f(x_{i+1}) - f(x_{i-1})]/(x_{i+1} - x_{i-1}) = (0.2 - 1.2)/(2 × 0.5) = -1.0

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-1.0))/(-0.9125)| × 100% = 9.6%
When h = 0.25, x_{i-1} = x_i - h = 0.25 and f(x_{i-1}) = 1.1035; x_i = 0.5 and f(x_i) = 0.925; x_{i+1} = x_i + h = 0.75 and f(x_{i+1}) = 0.6363.

The forward divided difference:

f'(0.5) ≈ [f(x_{i+1}) - f(x_i)]/(x_{i+1} - x_i) = (0.6363 - 0.925)/0.25 = -1.155

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-1.155))/(-0.9125)| × 100% = 26.5%

The backward divided difference:

f'(0.5) ≈ [f(x_i) - f(x_{i-1})]/(x_i - x_{i-1}) = (0.925 - 1.1035)/0.25 = -0.714

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-0.714))/(-0.9125)| × 100% = 21.7%

The centered divided difference:

f'(0.5) ≈ [f(x_{i+1}) - f(x_{i-1})]/(x_{i+1} - x_{i-1}) = (0.6363 - 1.1035)/(2 × 0.25) = -0.934

The percentage relative error:

|ϵ_t| = |((-0.9125) - (-0.934))/(-0.9125)| × 100% = 2.4%
Using the centered finite divided difference and a smaller step size achieves lower approximation error.
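The three first-derivative formulas can be sketched and checked against the numbers above (the helper names are ours):

```python
def forward_diff(f, x, h):
    return (f(x + h) - f(x)) / h            # first forward difference, O(h)

def backward_diff(f, x, h):
    return (f(x) - f(x - h)) / h            # first backward difference, O(h)

def centered_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)  # first centered difference, O(h^2)

f = lambda x: -0.1*x**4 - 0.15*x**3 - 0.5*x**2 - 0.25*x + 1.2
true = -0.9125                              # f'(0.5)
# with h = 0.5: forward -1.45, backward -0.55, centered -1.0
# with h = 0.25 the centered estimate improves to about -0.9344
```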
• The second forward finite divided difference

From the Taylor series expansions

f(x_{i+2}) = f(x_i) + f'(x_i)(2h) + (f''(x_i)/2)(2h)^2 + O(h^3)    (2)

f(x_{i+1}) = f(x_i) + f'(x_i)h + (f''(x_i)/2)h^2 + O(h^3)    (3)

(2) - 2×(3):

f(x_{i+2}) - 2f(x_{i+1}) = -f(x_i) + f''(x_i)h^2 + O(h^3)

f''(x_i) can be found as

f''(x_i) = [f(x_{i+2}) - 2f(x_{i+1}) + f(x_i)]/h^2 + O(h)

and f''(x_i) can be approximated as

f''(x_i) ≈ [f(x_{i+2}) - 2f(x_{i+1}) + f(x_i)]/h^2

This is the second forward finite divided difference.
• The second backward finite divided difference

f''(x_i) = [f(x_i) - 2f(x_{i-1}) + f(x_{i-2})]/h^2 + O(h)

and f''(x_i) can be approximated as

f''(x_i) ≈ [f(x_i) - 2f(x_{i-1}) + f(x_{i-2})]/h^2

which is the second backward finite divided difference.
• The second centered finite divided difference

f(x_{i+1}) = f(x_i) + f'(x_i)h + (f''(x_i)/2!)h^2 + (f^(3)(x_i)/3!)h^3 + O(h^4)    (4)

f(x_{i-1}) = f(x_i) - f'(x_i)h + (f''(x_i)/2!)h^2 - (f^(3)(x_i)/3!)h^3 + O(h^4)    (5)

(4) + (5):

f(x_{i+1}) + f(x_{i-1}) = 2f(x_i) + f''(x_i)h^2 + O(h^4)    (6)

Then f''(x_i) can be solved from (6) as

f''(x_i) = [f(x_{i+1}) - 2f(x_i) + f(x_{i-1})]/h^2 + O(h^2)

and

f''(x_i) ≈ [f(x_{i+1}) - 2f(x_i) + f(x_{i-1})]/h^2

is the second centered finite divided difference.
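A quick sketch of the second centered difference (the helper name is ours), checked on the example function used throughout Part II:

```python
def second_centered(f, x, h):
    """Second centered finite divided difference for f''(x), error O(h^2)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

f = lambda x: -0.1*x**4 - 0.15*x**3 - 0.5*x**2 - 0.25*x + 1.2
d2 = second_centered(f, 0.5, 0.25)    # -1.7625; the true f''(0.5) is -1.75
```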
7 High-Accuracy Numerical Differentiation

• The high-accuracy forward finite divided difference

From the Taylor series,

f'(x_i) = [f(x_{i+1}) - f(x_i)]/h - (f''(x_i)/2)h + O(h^2)    (7)

The second forward finite divided difference gives

f''(x_i) = [f(x_{i+2}) - 2f(x_{i+1}) + f(x_i)]/h^2 + O(h)    (8)

Substituting (8) into (7),

f'(x_i) = [f(x_{i+1}) - f(x_i)]/h - [f(x_{i+2}) - 2f(x_{i+1}) + f(x_i)]/(2h) + O(h^2)    (9)

Then we have

f'(x_i) = [-f(x_{i+2}) + 4f(x_{i+1}) - 3f(x_i)]/(2h) + O(h^2)    (10)
• The high-accuracy backward finite divided difference

f'(x_i) = [3f(x_i) - 4f(x_{i-1}) + f(x_{i-2})]/(2h) + O(h^2)    (11)

• The high-accuracy centered finite divided difference

f'(x_i) = [-f(x_{i+2}) + 8f(x_{i+1}) - 8f(x_{i-1}) + f(x_{i-2})]/(12h) + O(h^4)    (12)
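Formulas (10)–(12) can be sketched as follows (the helper names are ours). For the example function, which is a quartic, the O(h^4) centered formula (12) recovers the exact f'(0.5) = -0.9125:

```python
def forward_o2(f, x, h):
    """High-accuracy forward difference, equation (10), O(h^2)."""
    return (-f(x + 2*h) + 4*f(x + h) - 3*f(x)) / (2*h)

def backward_o2(f, x, h):
    """High-accuracy backward difference, equation (11), O(h^2)."""
    return (3*f(x) - 4*f(x - h) + f(x - 2*h)) / (2*h)

def centered_o4(f, x, h):
    """High-accuracy centered difference, equation (12), O(h^4)."""
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12*h)

f = lambda x: -0.1*x**4 - 0.15*x**3 - 0.5*x**2 - 0.25*x + 1.2
d = centered_o4(f, 0.5, 0.25)    # -0.9125, exact for this quartic
```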
Example: f(x) = -0.1x^4 - 0.15x^3 - 0.5x^2 - 0.25x + 1.2, x_i = 0.5, h = 0.25.
8 Richardson Extrapolation

This uses two derivative estimates to compute a third, more accurate one.

D ≈ (4/3) D(h_2) - (1/3) D(h_1),  h_2 = h_1/2    (13)

Example: f(x) = -0.1x^4 - 0.15x^3 - 0.5x^2 - 0.25x + 1.2, x_i = 0.5, h_1 = 0.5, h_2 = 0.25.
Solution: With h_1, x_{i+1} = 1 and x_{i-1} = 0, so D(h_1) = [f(x_{i+1}) - f(x_{i-1})]/(2h_1) = (0.2 - 1.2)/1 = -1.0, ϵ_t = -9.6%.
With h_2, x_{i+1} = 0.75 and x_{i-1} = 0.25, so D(h_2) = (0.6363 - 1.1035)/(2h_2) = -0.934375, ϵ_t = -2.4%.
Then D ≈ (4/3) × (-0.934375) - (1/3) × (-1.0) = -0.9125, which is the exact derivative.
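The extrapolation step in code (a small sketch; the names are ours), combining the two centered estimates to recover the exact derivative:

```python
def richardson(D_h1, D_h2):
    """Richardson extrapolation, equation (13), for h2 = h1/2."""
    return 4.0/3.0 * D_h2 - 1.0/3.0 * D_h1

f = lambda x: -0.1*x**4 - 0.15*x**3 - 0.5*x**2 - 0.25*x + 1.2
D1 = (f(1.0) - f(0.0)) / (2 * 0.5)       # centered, h1 = 0.5  -> -1.0
D2 = (f(0.75) - f(0.25)) / (2 * 0.25)    # centered, h2 = 0.25 -> -0.934375
D = richardson(D1, D2)                   # -0.9125, the exact f'(0.5)
```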