
[Figure omitted: two panels of axis ticks survived extraction; only the caption and panel titles are recoverable.]
Figure 4.3 Solving the nonlinear equation f(x) = tan(pi - x) - x = 0: (a) bisection method; (b) false position method.

For this method, we take the larger of x - a and b - x as the measure of error. This procedure to search for the solution of f(x) = 0 is cast into the MATLAB routine falsp().

Note that although the false position method aims to improve the convergence speed over the bisection method, it cannot always achieve the goal, especially when the curve of f(x) on [a, b] is not well approximated by a straight line, as depicted in Fig. 4.3. Figure 4.3b shows how the false position method approaches the solution, started by typing the following MATLAB statements, while Fig. 4.3a shows the footprints of the bisection method.

>>[x,err,xx] = falsp(f42,1.7,3,1e-4,50) %with initial interval [1.7,3]
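For readers who want to experiment outside MATLAB, the false position update can be sketched in Python (a hypothetical stand-in for the book's falsp() routine, not the book's own code; f42 below reproduces tan(pi - x) - x from Example 4.2):

```python
import math

def falsp(f, a, b, tol=1e-4, max_iter=50):
    """False position on a bracket [a, b] with f(a)*f(b) < 0."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    x = a
    for _ in range(max_iter):
        x = (a * fb - b * fa) / (fb - fa)  # zero of the chord through the endpoints
        fx = f(x)
        if fa * fx < 0:                    # root lies in [a, x]
            b, fb = x, fx
        else:                              # root lies in [x, b]
            a, fa = x, fx
        if max(x - a, b - x) < tol:        # error measure used in the text
            break
    return x

f42 = lambda x: math.tan(math.pi - x) - x  # Example 4.2
root = falsp(f42, 1.7, 3, 1e-4, 50)        # approaches the root near 2.0288
```

On this particular function one endpoint of the bracket stays fixed, so the interval-based stopping measure never shrinks below tol and all max_iter iterations run; the estimate itself still converges, just slowly, which is exactly the behavior the text describes for Fig. 4.3b.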

4.4 NEWTON(-RAPHSON) METHOD

Consider the problem of finding a root x° of a nonlinear equation

f(x) = (x - x°)^m g(x) = 0

where f(x) has (x - x°)^m (m is an even number) as a factor and so its curve is tangential to the x-axis without crossing it at x = x°. In this case, the signs of f(x° - e) and f(x° + e) are the same for small e, so we cannot find an interval [a, b] containing only x° as a solution such that f(a)f(b) < 0. Consequently, bracketing methods such as the bisection or false position ones are not applicable to this problem. Neither can the MATLAB built-in routine fzero() be applied to solve as simple an equation as x^2 = 0, which you would not believe until you try it for yourself. Then, how do we solve it? The Newton(-Raphson) method can be used for this kind of problem as well as general nonlinear equation problems, provided that f'(x) exists and is continuous around the solution.
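As a quick illustration of this point (ours, not from the text): for f(x) = x^2, no bracket with f(a)f(b) < 0 exists, yet the Newton update x <- x - f(x)/f'(x) = x/2 still works, halving the estimate toward the double root x° = 0 at each step (only linearly, as is typical at a multiple root):

```python
def newton_step(x):
    # f(x) = x**2, f'(x) = 2*x, so x - f(x)/f'(x) simplifies to x/2
    return x - x**2 / (2 * x)

x = 1.0
for _ in range(30):
    x = newton_step(x)
# after 30 halvings x equals 2**-30, approaching the double root 0
```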

The strategy behind the Newton(-Raphson) method is to approximate the curve of f(x) by its tangential line at some estimate x_k,

f(x) ~= f(x_k) + f'(x_k)(x - x_k)    (4.4.1)

and set the zero (crossing the x-axis) of the tangent line to the next estimate x_{k+1}:

0 = f(x_k) + f'(x_k)(x_{k+1} - x_k)

x_{k+1} = x_k - f(x_k)/f'(x_k)    (4.4.2)

This Newton iterative formula is cast into the MATLAB routine newton(), which is designed to generate the numerical derivative (Chapter 5) in the case where the derivative function is not given as the second input argument.

Here, for the error analysis of the Newton method, we consider the second-degree Taylor polynomial (Appendix A) of f(x) about x = x_k:

f(x) ~= f(x_k) + f'(x_k)(x - x_k) + (f''(x_k)/2)(x - x_k)^2

function [x,fx,xx] = newton(f,df,x0,TolX,MaxIter)
%newton.m  to solve f(x) = 0 by using Newton method.
%input:  f  = ftn to be given as a string 'f' if defined in an M-file
%        df = df(x)/dx (if not given, numerical derivative is used.)
%        x0 = the initial guess of the solution
%        TolX    = the upper limit of |x(k) - x(k-1)|
%        MaxIter = the maximum # of iteration
%output: x  = the point which the algorithm has reached
%        fx = f(x(last)), xx = the history of x
h = 1e-4; h2 = 2*h; TolFun = eps;
if nargin == 4 & isnumeric(df), MaxIter = TolX; TolX = x0; x0 = df; end
xx(1) = x0; fx = feval(f,x0);
for k = 1:MaxIter
   if ~isnumeric(df), dfdx = feval(df,xx(k)); %derivative function
    else dfdx = (feval(f,xx(k) + h)-feval(f,xx(k) - h))/h2; %numerical drv
   end
   dx = -fx/dfdx;
   xx(k + 1) = xx(k)+dx; %Eq.(4.4.2)
   fx = feval(f,xx(k + 1));
   if abs(fx)<TolFun | abs(dx) < TolX, break; end
end
x = xx(k + 1);
if k == MaxIter, fprintf('The best in %d iterations\n',MaxIter), end
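For readers following along without MATLAB, here is a rough Python transcription of the routine above (same logic: an analytic derivative if supplied, otherwise a central difference with h = 1e-4; the Python names and argument order are ours):

```python
import math

def newton(f, x0, df=None, tol_x=1e-5, max_iter=50):
    """Solve f(x) = 0 by the Newton iteration of Eq. (4.4.2)."""
    h = 1e-4
    tol_fun = 2.2e-16                            # roughly MATLAB's eps
    xx = [x0]
    fx = f(x0)
    for _ in range(max_iter):
        x = xx[-1]
        if df is not None:
            dfdx = df(x)                         # derivative function
        else:
            dfdx = (f(x + h) - f(x - h)) / (2 * h)  # numerical derivative
        dx = -fx / dfdx
        xx.append(x + dx)                        # Eq. (4.4.2)
        fx = f(xx[-1])
        if abs(fx) < tol_fun or abs(dx) < tol_x:
            break
    return xx[-1], fx, xx

f42 = lambda x: math.tan(math.pi - x) - x
x, fx, history = newton(f42, 1.8)                # converges near 2.0288
```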


We substitute x = x° (the solution) into this and use f(x°) = 0 to write

0 = f(x°) ~= f(x_k) + f'(x_k)(x° - x_k) + (f''(x_k)/2)(x° - x_k)^2

and

f(x_k) ~= -f'(x_k)(x° - x_k) - (f''(x_k)/2)(x° - x_k)^2

Substituting this into Eq. (4.4.2) and denoting the error of the kth estimate x_k as e_k = x_k - x°, we can get

x_{k+1} = x_k + (x° - x_k) + (f''(x_k)/2f'(x_k))(x° - x_k)^2,

e_{k+1} = (f''(x_k)/2f'(x_k)) e_k^2 = A_k e_k^2,  |e_{k+1}| = |A_k||e_k||e_k|    (4.4.3)

This implies that once the magnitude of the initial estimation error e_0 is small enough to make |A e_0| < 1, the magnitudes of successive estimation errors get smaller very quickly so long as A_k does not become large. The Newton method is said to be 'quadratically convergent' on account of the fact that the magnitude of the estimation error is proportional to the square of the previous estimation error.
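Relation (4.4.3) is easy to verify numerically (our example, not from the text). For f(x) = x^2 - 2 with solution x° = sqrt(2), we have A = f''(x°)/2f'(x°) = 2/(2 * 2*sqrt(2)) = 1/(2*sqrt(2)) ~ 0.3536, and the ratios e_{k+1}/e_k^2 of successive Newton errors should settle near that value:

```python
import math

f = lambda x: x * x - 2                    # root x° = sqrt(2)
df = lambda x: 2 * x

x = 2.0
errors = []
for _ in range(4):
    errors.append(abs(x - math.sqrt(2)))   # e_k = x_k - x°
    x = x - f(x) / df(x)                   # Newton update, Eq. (4.4.2)
errors.append(abs(x - math.sqrt(2)))

# e_{k+1}/e_k**2 should approach A = 1/(2*sqrt(2)), about 0.3536
ratios = [errors[k + 1] / errors[k] ** 2 for k in range(4)]
```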

Now, it is time to practice using the MATLAB routine newton() for solving a nonlinear equation like that dealt with in Example 4.2. We have to type the following statements into the MATLAB command window.

>>x0 = 1.8; TolX = 1e-5; MaxIter = 50; %with initial guess 1.8,...
>>[x,err,xx] = newton(f42,x0,1e-5,50)
>>df42 = inline('-(sec(pi-x)).^2-1','x'); %1st-order derivative
>>[x,err,xx1] = newton(f42,df42,1.8,1e-5,50)

Remark 4.3. Newton(-Raphson) Method

1. While bracketing methods such as the bisection method and the false position method converge in all cases, the Newton method is guaranteed to converge only in case where the initial value x_0 is sufficiently close to the solution x° and A(x) = f''(x)/2f'(x) is sufficiently small for x near x°. Apparently, it is good for fast convergence if we have small |A(x)|, that is, if the relative magnitude of the second-order derivative f''(x) over f'(x) is small. In other words, the convergence of the Newton method is endangered if the slope f'(x) is nearly flat or changes abruptly around the solution.

2. Note two drawbacks of the Newton(-Raphson) method. One is the effort and time required to compute the derivative f'(x_k) at each iteration; the


[Figure omitted: four panels of axis ticks survived extraction; only the caption and fragments of the panel titles are recoverable. Panel (a) shows f42(x) = tan(pi - x) - x; panels (b) and (c) show a function built from (x^2 - 25)(x - 10) scaled by 1/125; panel (d) shows tan^-1(x - 2).]
Figure 4.4 Solving nonlinear equations f(x) = 0 by using the Newton method.

other is the possibility of going astray, especially when f(x) has an abruptly changing slope around the solution (e.g., Fig. 4.4c or 4.4d), whereas it converges to the solution quickly when f(x) has a steady slope, as illustrated in Figs. 4.4a and 4.4b.
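The going-astray behavior of Fig. 4.4d is easy to reproduce (a Python sketch; we assume the panel's function is f(x) = arctan(x - 2), whose slope flattens away from the solution x° = 2, so a far initial guess produces a nearly horizontal tangent whose zero crossing overshoots wildly):

```python
import math

f = lambda x: math.atan(x - 2)             # root at x° = 2
df = lambda x: 1.0 / (1.0 + (x - 2) ** 2)  # slope flattens away from x = 2

def run_newton(x0, n=6):
    x = x0
    for _ in range(n):
        x = x - f(x) / df(x)               # Newton update, Eq. (4.4.2)
    return x

near = run_newton(2.4)  # close guess: converges to the solution 2
far = run_newton(5.0)   # far guess: nearly flat tangent, iterates blow up
```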

4.5 SECANT METHOD

The secant method can be regarded as a modification of the Newton method in the sense that the derivative is replaced by a difference approximation based on the successive estimates

f'(x_k) ~= (f(x_k) - f(x_{k-1}))/(x_k - x_{k-1})    (4.5.1)

which is expected to take less time than computing the analytical or numerical derivative. By this approximation, the iterative formula (4.4.2) becomes

x_{k+1} = x_k - f(x_k)/dfdx_k  with  dfdx_k = (f(x_k) - f(x_{k-1}))/(x_k - x_{k-1})    (4.5.2)
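Formula (4.5.2) translates almost line for line into code (a Python sketch; the routine name secant and its argument list are ours, since the book's corresponding listing is not part of this excerpt). Note that the method needs two starting points x_0 and x_1:

```python
import math

def secant(f, x0, x1, tol_x=1e-5, max_iter=50):
    """Secant iteration: Newton with the derivative replaced by (4.5.1)."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        dfdx = (f1 - f0) / (x1 - x0)       # difference approximation (4.5.1)
        x0, f0 = x1, f1
        x1 = x1 - f1 / dfdx                # update (4.5.2)
        f1 = f(x1)
        if abs(x1 - x0) < tol_x:
            break
    return x1

f42 = lambda x: math.tan(math.pi - x) - x  # Example 4.2 again
root = secant(f42, 1.7, 1.8)               # converges near 2.0288
```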
