
Chapter Three

Greedy Algorithm

Example – Counting Money

• Example: To make Birr 6.39, you can choose (see the code sketch below):
  • a 5-birr bill
  • a 1-birr bill, to make 6 birr
  • a 25-cent coin, to make 6.25
  • a 10-cent coin, to make 6.35
  • four 1-cent coins, to make 6.39

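Below is a minimal Python sketch of this greedy change-making strategy. The denomination list is an assumption chosen to match the example (5-birr and 1-birr bills plus 25-, 10- and 1-cent coins), and amounts are handled in cents to avoid floating-point error.

```python
def greedy_change(amount_cents, denominations_cents):
    """Greedy change-making: repeatedly take the largest denomination
    that still fits, until the amount is fully covered."""
    picked = []
    for d in sorted(denominations_cents, reverse=True):
        count, amount_cents = divmod(amount_cents, d)
        picked.extend([d] * count)
    return picked, amount_cents   # leftover is 0 when exact change was made

# Birr 6.39 = 639 cents, with the denominations used in the example above
denoms = [500, 100, 25, 10, 1]    # 5 birr, 1 birr, 25c, 10c, 1c
picked, leftover = greedy_change(639, denoms)
print(picked)     # [500, 100, 25, 10, 1, 1, 1, 1]
print(leftover)   # 0
```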
Spanning Trees

• A spanning tree is a tree (a connected graph with no cycles) that connects every node in the graph.
• A graph can have more than one spanning tree.

Properties of Spanning Trees

• Property 1: In a connected graph with V nodes, every spanning tree has exactly (V − 1) edges.
• Property 2: Maximally acyclic: adding any edge creates a cycle.
• Property 3: Minimally connected: removing any edge disconnects the tree.
• Property 4: The number of edges removed from the graph to obtain a spanning tree is (E − V + 1).
• Property 5: A complete graph with V nodes has V^(V−2) spanning trees (Cayley’s formula).
Minimum Spanning Trees

• Let G = (V, E) be a connected, undirected graph.
• For each edge (u, v) in E, we have a weight w(u, v) specifying the cost (length of the edge) of connecting u and v.

Minimum Spanning Trees

• We wish to find an acyclic subset T of E that connects all of the vertices in V and whose total weight is minimized.
• Since the total weight is minimized, the subset T must be acyclic (contain no circuit).
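Stated symbolically, the goal is to choose the subset T ⊆ E that connects every vertex and minimizes the total weight w(T) = Σ(u,v)∈T w(u, v).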

Minimum Spanning Trees

• Thus, T is a tree; we call it a spanning tree.
• The problem of determining the tree T is called the minimum-spanning-tree problem.

Minimum Spanning Trees

[Figure: an example weighted undirected graph on the nodes A, B, C, D and E.]

[Figures: several different spanning trees of the same graph, each with a different total weight. The minimum spanning tree is the one whose total weight is smallest.]
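Before moving to the algorithms, here is a small Python sketch tying the definitions together. It checks whether a given edge set is a spanning tree (Properties 1 to 3 above) and computes its total weight w(T); the function names and the tiny example graph are my own.

```python
def is_spanning_tree(num_nodes, edges):
    """True if `edges` forms a spanning tree of nodes 0 .. num_nodes-1:
    exactly V-1 edges (Property 1) and no cycle, which together imply
    that every node is connected."""
    if len(edges) != num_nodes - 1:              # Property 1: |T| = V - 1
        return False
    parent = list(range(num_nodes))              # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]        # path halving
            x = parent[x]
        return x

    for u, v, _w in edges:
        ru, rv = find(u), find(v)
        if ru == rv:                             # edge would close a cycle
            return False
        parent[ru] = rv
    return True

def total_weight(edges):
    """w(T): the sum of the weights of the edges in T."""
    return sum(w for _u, _v, w in edges)

T = [(0, 1, 2), (1, 2, 4), (2, 3, 5), (3, 4, 7)]   # edges as (u, v, weight)
print(is_spanning_tree(5, T))   # True: 4 edges, no cycle, 5 nodes connected
print(total_weight(T))          # 18
```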
Prim’s Algorithm

• Used to find a minimum spanning tree.
• It is a greedy algorithm.

Steps in Prim’s Algorithm

• To find a minimum spanning tree T (a code sketch follows these steps):
• Step 1: Select any node to be the first node of T.
• Step 2: Consider the edges that connect nodes in T to nodes outside T. Pick the one with minimum weight (if there is more than one, choose any). Add this edge and its new node to T.
• Step 3: Repeat Step 2 until T contains every node of the graph.
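A compact Python sketch of these steps, using a priority queue of edges leaving the tree T. The adjacency-list graph at the bottom is a hypothetical example (the exact edge set of the slides' graph is not fully reproduced here); its minimum spanning tree happens to have total weight 18, matching the worked example that follows.

```python
import heapq

def prim_mst(graph, start):
    """graph: dict mapping node -> list of (weight, neighbour) pairs (undirected).
    Grows the tree T from `start`, always adding the cheapest edge that
    leaves T (Steps 1-3 above). Returns the MST edges and their total weight."""
    in_tree = {start}                                    # Step 1: T = {start}
    frontier = [(w, start, v) for w, v in graph[start]]  # edges leaving T
    heapq.heapify(frontier)
    mst_edges, total = [], 0
    while frontier and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(frontier)                # Step 2: minimum-weight edge
        if v in in_tree:
            continue                                     # both ends already in T
        in_tree.add(v)
        mst_edges.append((u, v, w))
        total += w
        for w2, x in graph[v]:                           # new candidate edges
            if x not in in_tree:
                heapq.heappush(frontier, (w2, v, x))
    return mst_edges, total                              # Step 3: T spans the graph

# Hypothetical 5-node example
g = {
    'A': [(2, 'E'), (4, 'C'), (6, 'B')],
    'B': [(6, 'A'), (6, 'E'), (5, 'C')],
    'C': [(4, 'A'), (5, 'B'), (4, 'E'), (9, 'D')],
    'D': [(9, 'C'), (7, 'E')],
    'E': [(2, 'A'), (4, 'C'), (6, 'B'), (7, 'D')],
}
print(prim_mst(g, 'A'))   # MST of total weight 18 for this graph
```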
Example

Start from node A.

[Figures: Prim's algorithm run step by step on the example graph. At each step the minimum-weight edge joining the tree to a node outside it is added, until every node is included.]

Total weight of the resulting spanning tree: 2 + 7 + 4 + 5 = 18
Kruskal’s Algorithm

• Used to find a minimum spanning tree.
• It is also a greedy algorithm.

Steps in Kruskal’s Algorithm

• To find a minimum spanning tree T (a code sketch follows these steps):
• Step 1: Choose the edge of least weight.
• Step 2: From the remaining edges, choose the edge of least weight that does not form a cycle (if there is more than one, choose any).
• Step 3: Repeat Step 2 until (n − 1) edges have been chosen.
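A minimal Python sketch of these steps. Cycles are detected with a small union-find structure; the edge list at the bottom is a hypothetical example (not necessarily the slides' graph) whose MST again has total weight 18.

```python
def kruskal_mst(nodes, edges):
    """edges: list of (weight, u, v) for an undirected graph.
    Take edges in increasing order of weight, skipping any edge that would
    form a cycle, until n - 1 edges have been chosen (Steps 1-3 above)."""
    parent = {v: v for v in nodes}           # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):            # Steps 1-2: least weight first
        ru, rv = find(u), find(v)
        if ru != rv:                         # edge does not close a cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
            if len(mst) == len(nodes) - 1:   # Step 3: n - 1 edges chosen
                break
    return mst, total

# Hypothetical example edge list
edges = [(2, 'A', 'E'), (4, 'A', 'C'), (4, 'E', 'C'), (5, 'B', 'C'),
         (6, 'A', 'B'), (6, 'E', 'B'), (7, 'E', 'D'), (9, 'C', 'D')]
print(kruskal_mst('ABCDE', edges))           # MST of total weight 18
```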
Example

[Figures: Kruskal's algorithm run step by step on the example graph.]

Edge AE is minimum, so it is chosen first.
Edges EC and AC are then both minimum; choose either one, say EC.
Edge AC is now minimum but would make a cycle, so BC is chosen instead.
Edges AB and EB are minimum but would form a cycle, so ED is chosen instead.

Total weight of the resulting spanning tree: 2 + 7 + 4 + 5 = 18
Single Source Shortest Path

• The problem of finding the shortest paths from a source vertex s to all other vertices in the graph.
• Weighted graph G = (V, E)
• From the source vertex s ∈ V to every vertex v ∈ V
Dijkstra’s Algorithm

• A solution to the single-source shortest path problem in graph theory
• Works on both directed and undirected graphs
• All edges must have nonnegative weights
• The graph must be connected
Dijkstra’s Algorithm

• Input: a weighted graph G = (V, E) and a source vertex v ∈ V, such that all edge weights are nonnegative
• Output: the lengths of the shortest paths (or the shortest paths themselves) from the given source vertex v ∈ V to all other vertices
Dijkstra’s Algorithm

dist[s] ← 0
for all v ∈ V − {s}
    do dist[v] ← ∞
S ← ∅
Q ← V
while Q ≠ ∅
    do u ← minDistance(Q, dist)
       S ← S ∪ {u}
       for all v ∈ neighbors[u]
           do if dist[v] > dist[u] + w(u, v)
                  then dist[v] ← dist[u] + w(u, v)
return dist
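A runnable Python version of the pseudocode above, using a binary heap for the priority queue Q. The graph format and the small example graph are my own assumptions; the algorithm itself is the one just described.

```python
import heapq

def dijkstra(graph, s):
    """graph: dict mapping vertex -> list of (neighbour, weight) pairs,
    all weights nonnegative. Returns dist[v], the length of the shortest
    path from s to v (float('inf') if v is unreachable)."""
    dist = {v: float('inf') for v in graph}   # dist[v] <- infinity
    dist[s] = 0                               # dist[s] <- 0
    pq = [(0, s)]                             # Q, keyed by current distance
    done = set()                              # S: vertices with final distance
    while pq:
        d, u = heapq.heappop(pq)              # u <- minDistance(Q, dist)
        if u in done:
            continue                          # stale queue entry
        done.add(u)                           # S <- S ∪ {u}
        for v, w in graph[u]:                 # relax every edge leaving u
            if dist[v] > d + w:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Small hypothetical example
g = {
    's': [('a', 7), ('b', 2)],
    'a': [('c', 3)],
    'b': [('a', 4), ('c', 8)],
    'c': [],
}
print(dijkstra(g, 's'))   # {'s': 0, 'a': 6, 'b': 2, 'c': 9}
```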
Example

Find the minimum distance between s and t.

[Figures: Dijkstra's algorithm run step by step on a weighted graph with vertices s, 2, 3, 4, 5, 6, 7 and t. Each vertex carries a distance label, initially 0 at s and ∞ everywhere else. At each step the vertex with the smallest label is removed from Q (delmin) and added to S, and the labels of its neighbours are decreased (decrease key).]

The sets S and Q after each removal:

S = { }                           Q = { s, 2, 3, 4, 5, 6, 7, t }
S = { s }                         Q = { 2, 3, 4, 5, 6, 7, t }
S = { s, 2 }                      Q = { 3, 4, 5, 6, 7, t }
S = { s, 2, 6 }                   Q = { 3, 4, 5, 7, t }
S = { s, 2, 6, 7 }                Q = { 3, 4, 5, t }
S = { s, 2, 3, 6, 7 }             Q = { 4, 5, t }
S = { s, 2, 3, 5, 6, 7 }          Q = { 4, t }
S = { s, 2, 3, 4, 5, 6, 7 }       Q = { t }
S = { s, 2, 3, 4, 5, 6, 7, t }    Q = { }
Bellman Ford Algorithm

• Given a weighted graph G and a source vertex s, find the shortest (minimum-cost) path from s to every other vertex in G.
Bellman Ford Algorithm

• Negative edge weights: the Bellman-Ford algorithm still works; Dijkstra’s algorithm does not.
• Time complexity: Bellman-Ford runs in O(V·E) time, which is higher than that of Dijkstra’s algorithm.
Bellman Ford Algorithm

[Figures: Bellman-Ford run on a 5-vertex example graph with vertices s, t, x, y and z and some negative edge weights. Each vertex carries a label "distance, parent", initially 0 at s and ∞, nil everywhere else. After pass 4 the labels are t: 2,x   x: 4,y   y: 7,s   z: -2,t.]

The order of edges examined in each pass:
(t, x), (t, z), (x, t), (y, x), (y, t), (y, z), (z, x), (z, s), (s, t), (s, y)
Bellman Ford Algorithm

Bellman-Ford(G, w, s)
1. Initialize-Single-Source(G, s)
2. for i := 1 to |V| − 1 do
3.     for each edge (u, v) ∈ E do
4.         Relax(u, v, w)
5. for each edge (u, v) ∈ E do
6.     if d[v] > d[u] + w(u, v)
7.         then return False   // there is a negative cycle
8. return True
Bellman Ford Algorithm

Relax(u, v, w)
if d[v] > d[u] + w(u, v)
    then d[v] := d[u] + w(u, v)
         parent[v] := u
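A runnable Python sketch of Bellman-Ford and Relax as given above. The graph representation is my own; the edge list is my reconstruction of the 5-vertex example from the preceding slides and may not match it exactly.

```python
def bellman_ford(vertices, edges, s):
    """edges: list of (u, v, w) for a directed graph; negative weights allowed.
    Returns (dist, parent, ok) where ok is False if a negative-weight cycle
    reachable from s was detected."""
    INF = float('inf')
    dist = {v: INF for v in vertices}        # Initialize-Single-Source(G, s)
    parent = {v: None for v in vertices}
    dist[s] = 0

    def relax(u, v, w):                      # Relax(u, v, w)
        if dist[u] + w < dist[v]:
            dist[v] = dist[u] + w
            parent[v] = u

    for _ in range(len(vertices) - 1):       # |V| - 1 passes over all edges
        for u, v, w in edges:
            if dist[u] != INF:
                relax(u, v, w)

    for u, v, w in edges:                    # extra pass: negative-cycle check
        if dist[u] != INF and dist[u] + w < dist[v]:
            return dist, parent, False
    return dist, parent, True

# Reconstruction of the example graph from the slides (an assumption)
V = ['s', 't', 'x', 'y', 'z']
E = [('t', 'x', 5), ('t', 'z', -4), ('x', 't', -2), ('y', 'x', -3), ('y', 't', 8),
     ('y', 'z', 9), ('z', 'x', 7), ('z', 's', 2), ('s', 't', 6), ('s', 'y', 7)]
dist, parent, ok = bellman_ford(V, E, 's')
print(dist)   # {'s': 0, 't': 2, 'x': 4, 'y': 7, 'z': -2}
print(ok)     # True: no negative cycle reachable from s
```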
