
Data Structures and Algorithms

(ESO207A)

Lecture 39
• An O(m+n) time algorithm for biconnected components of a graph

Recap of the Previous Lecture

Articulation points and DFS

Theorem1 :
A node x is an articulation point if and only if
• It is the root node with more than one child in the DFS tree.
OR
• It is an internal node and has a child y such that
there is no back edge from subtree(y) to any ancestor of x.

DFS traversal of G
DFS(v)
{   Visited(v) ← true;  DFN[v] ← dfn++;
    For each neighbor w of v
    {   if (Visited(w) = false)
        {   DFS(w);
            ……..;
        }
        ……..;
    }
}

DFS-traversal(G)
{   dfn ← 0;
    For each vertex v ∈ V { Visited(v) ← false }
    For each vertex v ∈ V { If (Visited(v) = false) DFS(v) }
}
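The skeleton above can be sketched in Python. The adjacency-list dict and the function names below are illustrative assumptions, not the slides' own code.

```python
# Minimal sketch of DFS-traversal(G) computing DFN numbers.
# graph: dict mapping each vertex to the list of its neighbors.

def dfs_traversal(graph):
    dfn = {}        # DFN[v]: position at which v is first visited
    counter = 0

    def dfs(v):
        nonlocal counter
        dfn[v] = counter      # Visited(v) <- true; DFN[v] <- dfn++
        counter += 1
        for w in graph[v]:
            if w not in dfn:  # Visited(w) = false
                dfs(w)

    for v in graph:           # outer loop covers every component
        if v not in dfn:
            dfs(v)
    return dfn
```

For very deep graphs the recursive calls may hit Python's default recursion limit; an explicit stack avoids that.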
DFN number

DFN[x] : the number at which x gets visited during DFS traversal.

[Figure: a DFS tree on vertices v, y, w, f, b, h, d, g, r, u, z, s, c, each labeled with its DFN number from 0 to 12.]
Necessary and Sufficient condition
for internal node x to be articulation point

Theorem1:
An internal node x is articulation point iff
x has at least one child y s.t.
no back edge from subtree(y) to any ancestor of x.

⇔ No back edge from subtree(y) going to a vertex "higher" than x.

How to define the notion "higher" than x?
Use DFN numbering.

[Figure: a DFS tree in which x has DFN 5 and its child y has DFN 6; every ancestor of x has DFN < 5.]
Necessary and Sufficient condition
for internal node x to be articulation point

Theorem1:
An internal node x is articulation point iff
x has at least one child y s.t.
no back edge from subtree(y) to any ancestor of x,
i.e., High_pt(y) ≥ DFN(x).

Invent a new function High_pt(v):
the DFN of the highest ancestor of v
to which there is a back edge from subtree(v).

[Figure: the same DFS tree, with DFN(x) = 5 and DFN(y) = 6.]
Theorem:
An internal node x is articulation point iff
it has a child, say y, in DFS tree such that
High_pt(y) ≥ DFN(x).

Good!
But how to transform this Theorem
into an efficient algorithm for
articulation points?

In order to compute High_pt(v) of a vertex v,
we have to traverse the adjacency lists of all vertices of subtree T(v).
⇒ O(m+n) time in the worst case to compute High_pt(v) of a single vertex v.
⇒ An O(n(m+n)) time algorithm, which is too slow!
How to compute High_pt(v) efficiently?

Question:
Can we express High_pt(v) in terms of its children
and proper ancestors?

Exploit the recursive structure of the DFS tree.

[Figure: a DFS tree rooted at root, with a vertex v and its subtree highlighted.]
How to compute High_pt(v) efficiently?

Question: Can we express High_pt(v) in terms of
its children and proper ancestors?

High_pt(v) = min over all edges (v,w) ∈ E of:
• High_pt(w)   if w is a child of v
• DFN(w)       if w is a proper ancestor of v
The novel algorithm
Output : an array AP[] s.t.
AP[v]= true if and only if v is an articulation point.

Algorithm for articulation points in a graph G

DFS(v)
{   Visited(v) ← true;  DFN[v] ← dfn++;  High_pt[v] ← ∞;
    For each neighbor w of v
    {   if (Visited(w) = false)
        {   Parent(w) ← v;  DFS(w);
            ……..;
            ……..;
        }
        ……..;
    }
}

DFS-traversal(G)
{   dfn ← 0;
    For each vertex v ∈ V { Visited(v) ← false;  AP[v] ← false; }
    For each vertex v ∈ V { If (Visited(v) = false) DFS(v) }
}
Algorithm for articulation points in a graph G

DFS(v)
{   Visited(v) ← true;  DFN[v] ← dfn++;  High_pt[v] ← ∞;
    For each neighbor w of v
    {   if (Visited(w) = false)
        {   Parent(w) ← v;  DFS(w);
            High_pt(v) ← min( High_pt(v), High_pt(w) );
            If ( High_pt(w) ≥ DFN[v] )  AP[v] ← true;
        }
        Else if ( Parent(v) ≠ w )
            High_pt(v) ← min( DFN(w), High_pt(v) );
    }
}

DFS-traversal(G)
{   dfn ← 0;
    For each vertex v ∈ V { Visited(v) ← false;  AP[v] ← false; }
    For each vertex v ∈ V { If (Visited(v) = false) DFS(v) }
}

Note: the High_pt test applies to internal nodes; for the root of a DFS tree, use the child-count condition of Theorem1.
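A runnable Python sketch of the completed algorithm above, under the assumption that the graph is an undirected adjacency-list dict; the variable names (`high`, `ap`) are mine. The High_pt test fires only for internal nodes, and each DFS root is checked separately with the child-count rule of Theorem1.

```python
import math

def articulation_points(graph):
    """Return a dict ap[v] = True iff v is an articulation point."""
    dfn, high, parent = {}, {}, {}
    ap = {v: False for v in graph}
    counter = 0

    def dfs(v):
        nonlocal counter
        dfn[v] = counter; counter += 1
        high[v] = math.inf            # High_pt[v] <- infinity
        children = 0
        for w in graph[v]:
            if w not in dfn:          # tree edge (v, w)
                parent[w] = v
                children += 1
                dfs(w)
                high[v] = min(high[v], high[w])
                if parent[v] is not None and high[w] >= dfn[v]:
                    ap[v] = True      # internal-node condition
            elif w != parent[v]:      # back edge to an ancestor
                high[v] = min(high[v], dfn[w])
        return children

    for v in graph:
        if v not in dfn:
            parent[v] = None          # v is the root of its DFS tree
            if dfs(v) > 1:            # root rule of Theorem1
                ap[v] = True
    return ap
```

On the path a-b-c, only b is an articulation point; a triangle has none.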
Conclusion

Theorem2: For a given graph G=(V,E),
all articulation points can be computed by a single DFS traversal in O(m+n) time.
Order notation

Definition: Let f(n) and g(n) be any two increasing functions of n.
f(n) is said to be of the order of g(n)
if there exist constants c and n₀ such that
f(n) ≤ c·g(n) for all n > n₀.

Notation: f(n) = O(g(n))

[Figure: plot of f(n) and c·g(n) against n; beyond n₀, the curve c·g(n) stays above f(n).]
Order notation extended

Definition: Let f(n) and g(n) be any two increasing functions of n.
f(n) is said to be lower bounded by g(n)
if there exist constants c and n₀ such that
f(n) ≥ c·g(n) for all n > n₀.

Notation: f(n) = Ω(g(n))

[Figure: plot of f(n) and c·g(n) against n; beyond n₀, the curve f(n) stays above c·g(n).]
Order notation extended

Observations:
• f(n) = O(g(n)) if and only if g(n) = Ω(f(n))

One more notation:
If f(n) = O(g(n)) and f(n) = Ω(g(n)), then
f(n) = Θ(g(n))

Examples:
• Time complexity of Quick sort is ?(n log n)

• Time complexity of Merge sort is ?(n log n)
Radix Sort

An efficient sorting algorithm

Digits of an integer

A number written in base 10:
No. of digits = 6,  value of each digit ∈ {0, …, 9}.

A number written in base 16:
No. of digits = 4,  value of each digit ∈ {0, …, 15}.

It is up to us how we define a digit.
Radix Sort

Input: An array A storing n integers, where
(i) each integer has exactly d digits,
(ii) each digit has value < k,
(iii) k ≤ n.
Output: Sorted array A.
Running time:
O(dn) in word RAM model of computation.
Extra space:
O(n)
Important points:
• Makes use of Count sort.
• Heavily relies on the fact that Count sort is a stable sorting algorithm.
Demonstration of Radix Sort through example
(n = 12, d = 4, k = 10)

A (input)    after 1st pass (last digit as key)
2 0 1 2      5 8 1 0
1 3 8 5      4 9 6 1
4 9 6 1      5 5 0 1
5 8 1 0      2 0 1 2
2 3 7 3      2 3 7 3
6 2 3 9      9 6 2 4
9 6 2 4      1 3 8 5
8 2 9 9      3 4 6 5
3 4 6 5      7 0 9 8
7 0 9 8      9 2 5 8
5 5 0 1      6 2 3 9
9 2 5 8      8 2 9 9
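The first pass on this example can be reproduced with Python's built-in sort, which is stable just like Count sort: numbers with equal last digit keep their relative input order.

```python
# The 12 input numbers of the example.
a = [2012, 1385, 4961, 5810, 2373, 6239, 9624, 8299, 3465, 7098, 5501, 9258]

# sorted() is stable, standing in here for Count sort on the last digit.
after_first_pass = sorted(a, key=lambda x: x % 10)
```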
Demonstration of Radix Sort through example

A (input)    after 1st pass   after 2nd pass
2 0 1 2      5 8 1 0          5 5 0 1
1 3 8 5      4 9 6 1          5 8 1 0
4 9 6 1      5 5 0 1          2 0 1 2
5 8 1 0      2 0 1 2          9 6 2 4
2 3 7 3      2 3 7 3          6 2 3 9
6 2 3 9      9 6 2 4          9 2 5 8
9 6 2 4      1 3 8 5          4 9 6 1
8 2 9 9      3 4 6 5          3 4 6 5
3 4 6 5      7 0 9 8          2 3 7 3
7 0 9 8      9 2 5 8          1 3 8 5
5 5 0 1      6 2 3 9          7 0 9 8
9 2 5 8      8 2 9 9          8 2 9 9
Demonstration of Radix Sort through example

A (input)    after 1st pass   after 2nd pass   after 3rd pass
2 0 1 2      5 8 1 0          5 5 0 1          2 0 1 2
1 3 8 5      4 9 6 1          5 8 1 0          7 0 9 8
4 9 6 1      5 5 0 1          2 0 1 2          6 2 3 9
5 8 1 0      2 0 1 2          9 6 2 4          9 2 5 8
2 3 7 3      2 3 7 3          6 2 3 9          8 2 9 9
6 2 3 9      9 6 2 4          9 2 5 8          2 3 7 3
9 6 2 4      1 3 8 5          4 9 6 1          1 3 8 5
8 2 9 9      3 4 6 5          3 4 6 5          3 4 6 5
3 4 6 5      7 0 9 8          2 3 7 3          5 5 0 1
7 0 9 8      9 2 5 8          1 3 8 5          9 6 2 4
5 5 0 1      6 2 3 9          7 0 9 8          5 8 1 0
9 2 5 8      8 2 9 9          8 2 9 9          4 9 6 1
Demonstration of Radix Sort through example

A (input)    after 1st pass   after 2nd pass   after 3rd pass   after 4th pass (sorted)
2 0 1 2      5 8 1 0          5 5 0 1          2 0 1 2          1 3 8 5
1 3 8 5      4 9 6 1          5 8 1 0          7 0 9 8          2 0 1 2
4 9 6 1      5 5 0 1          2 0 1 2          6 2 3 9          2 3 7 3
5 8 1 0      2 0 1 2          9 6 2 4          9 2 5 8          3 4 6 5
2 3 7 3      2 3 7 3          6 2 3 9          8 2 9 9          4 9 6 1
6 2 3 9      9 6 2 4          9 2 5 8          2 3 7 3          5 5 0 1
9 6 2 4      1 3 8 5          4 9 6 1          1 3 8 5          5 8 1 0
8 2 9 9      3 4 6 5          3 4 6 5          3 4 6 5          6 2 3 9
3 4 6 5      7 0 9 8          2 3 7 3          5 5 0 1          7 0 9 8
7 0 9 8      9 2 5 8          1 3 8 5          9 6 2 4          8 2 9 9
5 5 0 1      6 2 3 9          7 0 9 8          5 8 1 0          9 2 5 8
9 2 5 8      8 2 9 9          8 2 9 9          4 9 6 1          9 6 2 4
Can you see where we are exploiting the fact that
Countsort is a stable sorting algorithm ?
Radix Sort

RadixSort(A[1…n], d)
{   For j = 1 to d do
        Execute CountSort(A, j) with jth digit as the key;
    return A;
}

Correctness:

Inductive assertion:
At the end of the jth iteration,
array A is sorted according to the last j digits.

During the induction step, you will have to use the fact
that Count sort is a stable sorting algorithm.
Radix Sort

RadixSort(A[1…n], d)
{   For j = 1 to d do
        Execute CountSort(A, j) with jth digit as the key;
    return A;
}

Time complexity:
• A single execution of CountSort(A, j) runs in O(n+k) time and O(n+k) space.
• For k ≤ n,
  → a single execution of CountSort(A, j) runs in O(n) time.
  → Time complexity of Radix sort = O(dn).
• Extra space used = O(n)
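A minimal Python sketch of the pseudocode above, with a stable Count sort on a single digit as the subroutine; the function names are illustrative, not the slides' own.

```python
def count_sort_by_digit(a, exp, base=10):
    """Stable Count sort of list a on the digit (x // exp) % base."""
    count = [0] * base
    for x in a:
        count[(x // exp) % base] += 1
    for d in range(1, base):          # prefix sums give final slots
        count[d] += count[d - 1]
    out = [0] * len(a)
    for x in reversed(a):             # reverse scan preserves stability
        d = (x // exp) % base
        count[d] -= 1
        out[count[d]] = x
    return out

def radix_sort(a, num_digits, base=10):
    exp = 1
    for _ in range(num_digits):       # least significant digit first
        a = count_sort_by_digit(a, exp, base)
        exp *= base
    return a
```

Running it on the 12 four-digit numbers of the demonstration reproduces the final sorted column.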
Question: How to use Radix sort to sort n integers in range [0 … nᵗ] in O(tn) time and O(n) space?
Answer: RadixSort(A[1…n], ?)

What digit to use?    No. of digits    Time complexity
1 bit                 t·log₂ n         O(tn log n)
log₂ n bits           t                O(tn)
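Choosing digits of log₂ n bits, i.e. viewing each number in base n, gives the O(tn) bound. The function below is my self-contained sketch of the case t = 2: n integers in [0, n²) sorted with two stable base-n Count sort passes.

```python
def sort_small_ints(a):
    """Sort n integers in [0, n^2) in O(n) time: two stable Count
    sort passes, viewing each number as two base-n digits."""
    n = len(a)
    if n <= 1:
        return list(a)
    for exp in (1, n):                # digit keys: x % n, then x // n
        count = [0] * n
        for x in a:
            count[(x // exp) % n] += 1
        for d in range(1, n):
            count[d] += count[d - 1]
        out = [0] * n
        for x in reversed(a):         # reverse scan keeps the sort stable
            d = (x // exp) % n
            count[d] -= 1
            out[count[d]] = x
        a = out
    return a
```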
Power of the word RAM model

• Very fast algorithms for sorting integers:
  Example: n integers in range [0 … nᵗ] can be sorted in O(tn) time and O(n) space.

• Lesson:
  Do not always go after Merge sort and Quick sort when the input consists of integers.

• Interesting programming exercise (for winter vacation):
  Compare Quick sort with Radix sort for sorting long integers.
Problems for winter break

• Local minima in a grid in O(n) time.

• Average case time complexity of Quick Select is O(n).
  – T(m, j) : average time complexity of QuickSelect(m, j);
    analyze F(m) = max over j ≤ m of T(m, j), for m ≤ n.

• Proof of correctness of BFS traversal.
