
THEOREM: [Fixed point existence and uniqueness theorem]: Given a function g(x) \in C[a,b] with g:

[a,b] \rightarrow [a,b], g(x) has a fixed point in [a,b]. If, in addition, a positive constant k < 1 exists
with |g'(x)| \le k < 1 \ \forall x \in (a,b), then the fixed point is unique.

PROOF: If g(a) = a or g(b) = b, then it is clear that the fixed point exists. If this is not true, then it must be
true that g(a) > a and g(b) < b. Define h(x) = g(x) - x. Then h is continuous on the interval [a,b], with h(a) = g(a) -
a > 0 and h(b) = g(b) - b < 0. The intermediate value theorem implies \exists p \in (a,b) such that h(p) = 0. So
g(p) - p = 0, which implies p is a fixed point of g.

Suppose also that |g'(x)| \le k < 1 \ \forall x \in (a,b) is true and that p and q are both fixed points in [a,b]
with p \ne q. By the mean value theorem, there is a number \xi between p and q such that \frac{g(p)-g(q)}
{p-q} = g'(\xi). Then |p-q| = |g(p) - g(q)| = |g'(\xi)||p-q| \le k|p-q| < |p-q|, which produces a contradiction.
The contradiction comes from the assumption that p \ne q; therefore p = q and the fixed point must be
unique.
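As an illustration (the example g(x) = cos(x) on [0, 1] is chosen here, not taken from the notes above), the theorem's hypotheses can be checked numerically:

```python
import math

g = math.cos
a, b = 0.0, 1.0

# g is decreasing on [0, 1], so its range is [cos(1), 1], a subset of [0, 1].
assert a <= g(b) and g(a) <= b

# |g'(x)| = |sin(x)| <= sin(1) < 1 on (0, 1), so k = sin(1) works.
k = math.sin(1.0)
assert k < 1

# As in the proof, h(x) = g(x) - x changes sign on [a, b].
h = lambda x: g(x) - x
assert h(a) > 0 and h(b) < 0

print("All hypotheses hold for g(x) = cos(x) on [0, 1]; the fixed point is unique.")
```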

Fixed point iteration: Given an initial approximation p_0, the iteration p_n = g(p_{n-1}) for n = 1, 2, 3, … is called a fixed point iteration.
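A minimal sketch of the iteration (the function name, tolerance, and stopping rule are choices made here, not part of the notes):

```python
import math

def fixed_point_iteration(g, p0, tol=1e-10, max_iter=100):
    """Iterate p_n = g(p_{n-1}) until successive iterates agree within tol."""
    p = p0
    for n in range(1, max_iter + 1):
        p_next = g(p)
        if abs(p_next - p) < tol:
            return p_next, n
        p = p_next
    raise RuntimeError("did not converge within max_iter iterations")

# Example: g(x) = cos(x) maps [0, 1] into itself and has a unique fixed point there.
p, n = fixed_point_iteration(math.cos, 0.5)
print(f"fixed point ~ {p:.10f} after {n} iterations")
```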

Convergence: The rate, or order, of convergence measures how quickly a sequence of iterates approaches the fixed
point. In contrast to the bisection method, which is not a fixed point method and has order of
convergence equal to one, fixed point methods can achieve a higher rate of convergence. If the
derivative of g at the fixed point is nonzero (with magnitude below one, so the iteration converges), the convergence is linear, which is the same
as convergence of order one. If the derivative at the fixed point is equal to zero, it is possible for the
fixed point method to converge faster than order one. When constructing an iteration function for a given equation, it is therefore ideal to choose any free constant so that the derivative of the function at the fixed point is
zero, to ensure higher order convergence.
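The difference between the two cases can be seen by comparing iteration functions for the same root. Both functions below are constructed for this example (solving x^2 = 2, so the fixed point is sqrt(2)); they are not taken from the notes:

```python
import math

root = math.sqrt(2)

def errors(g, p0, steps):
    """Record |p_n - root| for each iterate, to expose the convergence rate."""
    p, errs = p0, []
    for _ in range(steps):
        p = g(p)
        errs.append(abs(p - root))
    return errs

# g1'(root) = 1 - root/2 ~ 0.29, nonzero -> linear (order-one) convergence.
g1 = lambda x: x - (x * x - 2) / 4
# g2'(root) = 0 (this is Newton's iteration for x^2 - 2) -> faster than order one.
g2 = lambda x: (x + 2 / x) / 2

for name, g in [("g1 (linear)   ", g1), ("g2 (quadratic)", g2)]:
    print(name, ["%.1e" % e for e in errors(g, 1.0, 5)])
```

After five steps the quadratic iteration has driven the error to roughly machine precision, while the linear iteration still carries a visible error.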

There are four main types of convergence and divergence of the fixed point method. Let g'(p) denote the
derivative of the function g evaluated at the fixed point p:

Monotonic convergence: There is direct convergence to the fixed point, with the fixed point being
strongly attractive. Monotonic convergence occurs when 0 < g'(p) < 1.

Oscillating convergence: There is indirect convergence to the fixed point, with the fixed point being
weakly attractive. Oscillating convergence occurs when -1 < g'(p) < 0.

Monotonic divergence: There is direct divergence from the fixed point, with the fixed point being
strongly repellent. Monotonic divergence occurs when g'(p) > 1.

Oscillating divergence: There is indirect divergence from the fixed point, with the fixed point being
weakly repellent. Oscillating divergence occurs when g'(p) < -1.
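All four behaviors can be seen with the linear maps g(x) = m·x, whose fixed point is 0 and whose derivative at the fixed point is m (an example constructed here, not from the notes):

```python
def iterate(m, p0=1.0, steps=6):
    """Apply g(x) = m*x repeatedly and return the path of iterates."""
    p, path = p0, [p0]
    for _ in range(steps):
        p = m * p
        path.append(p)
    return path

cases = [
    ("monotonic convergence  (m =  0.5)", 0.5),
    ("oscillating convergence (m = -0.5)", -0.5),
    ("monotonic divergence   (m =  1.5)", 1.5),
    ("oscillating divergence  (m = -1.5)", -1.5),
]
for label, m in cases:
    print(label, [round(x, 3) for x in iterate(m)])
```

With m = 0.5 the iterates shrink toward 0 from one side; with m = -0.5 they shrink while alternating sign; with |m| > 1 the corresponding paths grow away from the fixed point instead.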
