Tarun Biswas
Dept. of Computer Science & Engineering
IIIT Ranchi
➢ The topic “Analysis of Algorithms” is primarily concerned with
determining the running-time (time complexity) and memory (space
complexity) requirements of an algorithm.
➢ Operation counts:
▪ When n > 1, each iteration of the for loop makes one
comparison between two elements of a, and the total number
of element comparisons is n-1.
▪ Therefore, the number of element comparisons is max{n-1, 0}.
▪ The algorithm has the nice property that the operation count is
precisely determined by the problem size, as the sketch below illustrates.
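The algorithm itself is not reproduced above; the following is a minimal Python sketch, assuming it is a linear scan for the position of the largest element of a[0:n-1] (the function name and test data are illustrative), that counts element comparisons and reproduces the n − 1 figure:

def position_of_max(a):
    # Hedged sketch: assumes the analyzed algorithm is a linear max scan.
    comparisons = 0
    best = 0
    for i in range(1, len(a)):       # n - 1 iterations for n = len(a)
        comparisons += 1             # one comparison between a[i] and a[best]
        if a[i] > a[best]:
            best = i
    return best, comparisons

_, count = position_of_max([4, 8, 1, 9, 2])
print(count)   # prints 4, i.e. n - 1 for n = 5; 0 comparisons when n <= 1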
➢ Operation counts:
▪ Example [Sequential Search]: An algorithm that searches a[0:n-1] for the
first occurrence of x.
▪ For the average count, assume that all array elements are distinct
and that each is searched for with equal frequency. A successful
search for x = a[j] makes j + 1 comparisons, so the average count
for a successful search is
(1 + 2 + ⋯ + n)/n = (n + 1)/2.
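A minimal Python sketch of this sequential search, with the comparison counting added for illustration; averaging over all n successful positions reproduces (n + 1)/2:

def sequential_search(a, x):
    comparisons = 0
    for i, v in enumerate(a):
        comparisons += 1              # one comparison of x against a[i]
        if v == x:
            return i, comparisons     # found at position i
    return -1, comparisons            # unsuccessful: n comparisons

n = 9
a = list(range(n))                    # distinct elements, per the assumption
avg = sum(sequential_search(a, a[j])[1] for j in range(n)) / n
print(avg)                            # 5.0 = (n + 1)/2 for n = 9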
➢ Step counts:
▪ We attempt to account for the time spent in all parts of the
algorithm. As was the case for operation counts, the step count is a
function of the problem size.
▪ A step is any computation unit that is independent of the problem
size. Thus 10 additions can be one step; 100 multiplications can
also be one step; but n additions, where n is the problem size,
cannot be one step.
▪ To determine the step count of an algorithm, we first determine
1. the number of steps per execution (s/e) of each statement and
2. the total number of times (i.e., frequency) each statement is
executed.
▪ Combining these two quantities gives us the total contribution of
each statement to the step count. We then add the contributions of
all statements to obtain the step count for the entire algorithm;
a sketch of this bookkeeping follows.
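A minimal sketch of the s/e × frequency bookkeeping just described; the statement profile below is hypothetical, chosen only to illustrate the method:

def total_step_count(profile):
    # profile: one (steps_per_execution, frequency) pair per statement
    return sum(se * freq for se, freq in profile)

n = 100
profile = [
    (1, 1),       # hypothetical: an initialization, executed once
    (1, n + 1),   # hypothetical: a loop condition, tested n + 1 times
    (1, n),       # hypothetical: the loop body, executed n times
]
print(total_step_count(profile))      # 2n + 2 = 202 for this profile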
➢ Step counts:
Best-case Step Counts (when x = a[0]): the loop body executes only
once, so the step count is a small constant, independent of n.
➢ Step counts:
Step Counts when x = a[j]:
▪ To obtain this average, we first obtain the step count for the case
x = a[j], where j is in the range [0, n − 1]. The average step count
for a successful search is then
(1/n) × Σ_{j=0}^{n−1} (step count when x = a[j]).
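Since the exact per-statement table is not reproduced here, the following hedged Python sketch uses an illustrative charging scheme (one step per loop iteration plus one for the return; not the exact table from the slides) to show how such an average is computed:

def sequential_search_steps(a, x):
    steps = 0
    for i, v in enumerate(a):
        steps += 1                    # charge one step per iteration
        if v == x:
            return i, steps + 1       # plus one step for the return
    return -1, steps + 1

n = 10
a = list(range(n))
avg = sum(sequential_search_steps(a, a[j])[1] for j in range(n)) / n
print(avg)   # 6.5 for n = 10; grows as (n + 3)/2, i.e. linearly in n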
➢ Step counts:
• Now suppose that successful searches occur only 80 percent of the
time and that each a[i] still has the same probability of being
searched for.
• The average step count for sequentialSearch is then
0.8 × (average step count for a successful search)
+ 0.2 × (step count for an unsuccessful search).
➢ When the input size is large enough, only the order of growth
of the running time is relevant; this is the asymptotic efficiency
of algorithms. Asymptotic means approaching a value or curve
arbitrarily closely.
Big-Oh Notation:
O(g(n)) = {f(n) : there exist positive constants c and n₀ such that
for all n ≥ n₀, we have 0 ≤ f(n) ≤ c·g(n)}
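As a concrete instance of this definition, a small Python check assuming the witnesses c = 6 and n₀ = 3 for the claim 5n + 3 = O(n); the loop can only spot-check a finite range, it is not a proof:

f = lambda n: 5 * n + 3
g = lambda n: n
c, n0 = 6, 3                          # witnesses: 5n + 3 <= 6n for n >= 3
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 100000))
print("0 <= f(n) <= c*g(n) holds on the range checked")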
Big-Oh rule
➢ If f(n) is a polynomial of degree d, then f(n) is O(nᵈ), i.e.,
1. Drop lower-order terms
2. Drop constant factors
➢ Given functions f(n) and g(n), keep only the most significant
term and remove constant multipliers:
f(n) = 5n + 3 → g(n) = n
f(n) = 7n + 0.5n² + 2000 → g(n) = n²
f(n) = 300n + 12 + n log n → g(n) = n log n
Big-Oh and growth rate
➢ The Big-Oh notation gives an asymptotic upper bound on
the growth rate of a function.
➢ The statement “f(n) is O(g(n))” means that the growth rate of
f(n) is no more than the growth rate of g(n).
➢ We can use the Big-Oh notation to rank functions according
to their growth rate.
Big-Oh Analysis Example:
The following algorithm computes prefix averages.
Algorithm prefixAverages1(X, n)
Input: array X of n integers
Output: array A of prefix averages of X          #Steps
  A ← new array of n integers                     1
  for i ← 0 to n − 1 do                           n
    s ← X[0]                                      n
    for j ← 1 to i do                             1 + 2 + … + (n − 1)
      s ← s + X[j]                                1 + 2 + … + (n − 1)
    A[i] ← s / (i + 1)                            n
  return A                                        1
Total steps = n² + 2n + 2, which is O(n²)
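A direct Python transcription of prefixAverages1 (a sketch for illustration; the function name is ours): the nested loop mirrors the pseudocode, so the running time is O(n²) as derived.

def prefix_averages1(X):
    n = len(X)
    A = [0.0] * n                     # new array of n entries
    for i in range(n):
        s = X[0]
        for j in range(1, i + 1):     # re-sums the prefix on every pass
            s = s + X[j]
        A[i] = s / (i + 1)
    return A

print(prefix_averages1([10, 20, 30, 40]))   # [10.0, 15.0, 20.0, 25.0]

Carrying the running sum s across iterations of i instead of re-summing the prefix would bring the total down to O(n) steps.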
Big-Omega Notation:
Ω(g(n)) = {f(n) : there exist positive constants c and n₀ such that
for all n ≥ n₀, we have 0 ≤ c·g(n) ≤ f(n)}
➢ g(n) is an asymptotic lower bound for f(n).
Theta Notation:
Θ(g(n)) = {f(n) : there exist positive constants c₁, c₂, and n₀ such
that for all n ≥ n₀, we have 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n)}
➢ For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if
and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
Small-Oh Notation:
o(g(n)) = {f(n) : for every constant c > 0, there exists n₀ > 0 such
that for all n ≥ n₀, we have 0 ≤ f(n) < c·g(n)}

Comparing f and g is analogous to comparing two real numbers a and b:
➢ f(n) = O(g(n))  ≈  a ≤ b
➢ f(n) = Ω(g(n))  ≈  a ≥ b
➢ f(n) = Θ(g(n))  ≈  a = b
➢ f(n) = o(g(n))  ≈  a < b
➢ f(n) = ω(g(n))  ≈  a > b
Limits
➢ lim_{n→∞} f(n)/g(n) = 0 ⇒ f(n) ∈ o(g(n))
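A quick illustration of this rule, assuming the third-party sympy package is available:

import sympy

n = sympy.symbols("n", positive=True)
print(sympy.limit(n / n**2, n, sympy.oo))   # 0, hence n ∈ o(n²)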
Properties:
Transitivity
  f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
  f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
  f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
  f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
  f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
Reflexivity
  f(n) = Θ(f(n))
  f(n) = O(f(n))
  f(n) = Ω(f(n))
Symmetry
  f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
Complementarity
  f(n) = O(g(n)) iff g(n) = Ω(f(n))
  f(n) = o(g(n)) iff g(n) = ω(f(n))
➢ What do we mean by n = O(n²)? How do we interpret such
formulas?
▪ When the asymptotic notation stands alone (that is, not
within a larger formula) on the right-hand side of an equation
(or inequality), as in n = O(n²), the equal sign means set
membership: n ∈ O(n²).
➢ What about the formula 2n² + Θ(n) = Θ(n²)?
▪ No matter how the anonymous functions are chosen on
the left of the equal sign, there is a way to choose the
anonymous functions on the right of the equal sign to
make the equation valid.
▪ Thus, our example means that for any function
f(n) ∈ Θ(n), there is some function g(n) ∈ Θ(n²) such that
2n² + f(n) = g(n) for all n. For instance, for f(n) = 3n we can
take g(n) = 2n² + 3n. In other words, the right-hand side of
an equation provides a coarser level of detail than the
left-hand side.
Q1: Let f(n) = 3n² + 4n + 5. Which of the following (a-f)
is/are correct?
(a) f(n) = Θ(n²)
(b) f(n) = O(n²)
(c) f(n) = O(n³)
(d) f(n) = Ω(n²)
(e) f(n) = Θ(n³)
(f) f(n) = Ω(n)