Dr. Mohamed Behiry
LEARNING OUTCOMES
To be able to:
Carry out simple asymptotic analysis of algorithms
Explain the use of big O, omega, and theta notation to describe the efficiency of an algorithm
Use big O, omega, and theta notation to give asymptotic upper, lower, and tight bounds on time and
space complexity of algorithms
Determine the time and space complexity of simple algorithms
SCOPE
Efficiency goals
The concept of algorithm analysis
The concept of asymptotic complexity
Rules for using big-O
Comparing various growth functions
How to determine complexity of code structures
INTRODUCTION
What is an algorithm?
Ideas behind computer programs
Specified set of simple instructions to be followed to solve a problem.
Fixed: stays the same no matter
which type of hardware it is running on
which programming language is used to implement it
Must specify:
the set of inputs
the desired properties of the output
IMPORTANT PROPERTIES OF ALGORITHMS
An algorithm must be
Correct
Always produce the desired output for legal instances of the problem
Efficient
Measured in terms of time or space
Time tends to be more important
The running time analysis allows us to improve algorithms
EXPRESSING ALGORITHMS
An algorithm may be expressed in a number of ways:
Pseudo-code: a shorthand for specifying algorithms; leaves out the implementation details and focuses on the essence of the algorithm
High-level programming languages: more precise, but require expressing low-level details that are not necessary for a high-level understanding
PSEUDOCODE
Pseudocode cannot be compiled or run like a regular program.
Example: [pseudocode listing shown on the slide]
IMPORTANT PROPERTIES OF ALGORITHMS
The efficiency of an algorithm is usually expressed in terms of CPU time.
The analysis of algorithms involves categorizing an algorithm in terms of efficiency.
An everyday example: washing dishes. Suppose washing a dish takes 30 seconds and drying a dish takes an additional 30 seconds. Therefore, n dishes require n minutes to wash and dry.
ANALYSIS OF ALGORITHMS
Why analyze algorithms?
The same problem can be solved with different algorithms which differ in efficiency. We
analyze the algorithms to
Evaluate algorithm performance
Compare different algorithms
We focus on analyzing :
Running time
Memory usage
Worst-case and typical case
ANALYSIS OF ALGORITHMS- CONT.
If each line takes constant time the whole algorithm will take constant time,
right? Wrong
The number of steps performed of an algorithm varies based on the size of
instance, called problem or input size
The efficiency of an algorithm is always stated as a function of the problem
size
We generally use the variable N to represent the problem size
10
PROBLEM SIZE
For every algorithm we want to analyze, we need to define the size of the
problem
The dishwashing problem has a size n – number of dishes to be washed/dried
For a search algorithm, the size of the problem is the size of the search pool
For a sorting algorithm, the size of the problem is the number of elements to be sorted
PROBLEM/INPUT SIZE MATTERS!
Some example algorithms and their expected running times based on the input size: [table shown on the slide]
MACHINE INDEPENDENCE
The evaluation of efficiency should be machine independent
It is not useful to measure how fast the algorithm runs as this depends on which
computer, OS, programming language, compiler, and kind of inputs are used in
testing
Instead,
we count the number of basic operations the algorithm performs.
A basic operation is an operation which takes a constant amount of time to execute.
we calculate how this number depends on the size of the input.
The efficiency of an algorithm is the number of basic operations it performs. This number is a function
of the input size n.
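The counting idea above can be sketched in Java by comparing two algorithms for the same problem by operation count rather than by clock time. The class name, method names, and the exact counting scheme are illustrative assumptions, not from the slides:

```java
// Two algorithms for the same problem (summing 1..n), compared by
// counting basic operations rather than timing them.
public class CompareOps {
    // Algorithm 1: a loop; the operation count grows with n.
    static long loopOps(int n) {
        long ops = 0;
        int sum = 0;  ops++;          // 1 assignment
        for (int i = 1; i <= n; i++) {
            sum += i;
            ops += 3;                 // test + addition + increment per iteration
        }
        ops++;                        // final failing loop test
        return ops;                   // 3n + 2 operations in this scheme
    }

    // Algorithm 2: closed form n*(n+1)/2; a constant number of operations.
    static long formulaOps(int n) {
        long sum = (long) n * (n + 1) / 2;  // the answer itself (computed, not used)
        return 4;                           // *, +, /, = : always 4 operations
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 1000}) {
            System.out.println("n=" + n
                + "  loop ops=" + loopOps(n)
                + "  formula ops=" + formulaOps(n));
        }
    }
}
```

The comparison holds on any machine, OS, or compiler, which is exactly why operation counts are preferred over stopwatch measurements.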
BASIC OPERATIONS
Arithmetic operations: *, /, %, +, -
Assignment statements of simple data types
Reading of primitive types
Method call (Note: the execution time of the method itself may depend on the value of a parameter and may not be constant)
GROWTH FUNCTIONS
We must also decide what we are trying to optimize:
time complexity – CPU time
space complexity – memory space
ALGORITHM COMPLEXITY
Worst Case Complexity:
The function defined by the maximum number of steps taken on any instance of size n
ALGORITHM COMPLEXITY- CONT.
We are usually interested in determining the largest
number of operations that might be performed for a
given problem size.
◼ Best case depends on the input
◼ Average case is difficult to compute
◼ So, we usually focus on worst case analysis
Easier to compute
Usually close to the actual running time
17
ALGORITHM COMPLEXITY- CONT.
Strategy:
Try to find upper and lower bounds of the worst-case function
ALGORITHM COMPLEXITY- CONT.
Example: Linear Search Complexity
Best Case: item found at the beginning: one comparison
Worst Case: item found at the end: n comparisons
Average Case: item may be found at index 0, or 1, or 2, ..., or n - 1; the average number of comparisons is (1 + 2 + ... + n) / n = (n + 1) / 2
Worst-case and average-case complexities of common sorting algorithms:
Method          Worst Case   Average Case
Selection sort  n²           n²
Insertion sort  n²           n²
Merge sort      n log n      n log n
Quick sort      n²           n log n
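The linear-search cases above can be made concrete with an instrumented Java sketch; the class name and the `comparisons` counter are hypothetical additions for illustration:

```java
// Linear search instrumented to count comparisons, so the best and
// worst cases from the slide can be observed directly.
public class LinearSearch {
    static int comparisons;  // comparisons made by the most recent call

    // Returns the index of key in a, or -1 if key is absent.
    static int search(int[] a, int key) {
        comparisons = 0;
        for (int i = 0; i < a.length; i++) {
            comparisons++;
            if (a[i] == key) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 1, 5};
        search(a, 7);  // best case: item at the beginning
        System.out.println("best case comparisons: " + comparisons);   // 1
        search(a, 5);  // worst case: item at the end
        System.out.println("worst case comparisons: " + comparisons);  // n = 5
    }
}
```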
RUNNING TIME ANALYSIS
ASYMPTOTIC ANALYSIS
It is not typically necessary to know the exact growth function for an algorithm
Finding the exact complexity, f(n) = number of basic operations, of an algorithm is
difficult
We are mainly interested in the asymptotic complexity of an algorithm – the general
nature of the algorithm as n increases
Asymptotic analysis of an algorithm describes the relative efficiency of an algorithm as n gets very large.
When you're dealing with small input sizes, most algorithms will do
When the input size is very large, things change
E.g.: for very large n, algorithm 1 grows faster than algorithm 2.
ASYMPTOTIC NOTATIONS
Commonly used asymptotic notations to calculate the running time complexity of an algorithm.
Ο Notation. O(expression) gives an upper bound on the growth rate of a function. A function f∈O(g)
means, f grows at most as fast as g, asymptotically and up to a constant factor.
Ω Notation. Ω(expression) gives a lower bound on the growth rate of a function. A function f∈Ω(g)
means f grows, asymptotically, at least as fast as g.
θ Notation. θ(expression) consists of all the functions that lie in both O(expression) and Ω(expression).
A function f∈Θ(g) means that f grows, asymptotically, as fast as g.
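A quick numeric illustration (not a proof) of the Θ relationship: for an assumed example pair f(n) = 5n² + 2n and g(n) = n², the ratio f(n)/g(n) stays pinned between two constants once n ≥ 1, which is exactly what f∈Θ(g) asserts:

```java
// Numeric sanity check: for f(n) = 5n^2 + 2n and g(n) = n^2, the ratio
// f(n)/g(n) lies between c1 = 5 and c2 = 7 for all n >= 1, so f grows
// asymptotically as fast as g (f is Theta(g)).
public class ThetaCheck {
    static double ratio(long n) {
        double f = 5.0 * n * n + 2.0 * n;
        double g = (double) n * n;
        return f / g;
    }

    public static void main(String[] args) {
        for (long n : new long[]{1, 10, 1000, 1000000}) {
            System.out.printf("n=%d  f/g=%.4f%n", n, ratio(n));
            // the ratio starts at 7 (n = 1) and approaches 5 as n grows
        }
    }
}
```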
BIG-O NOTATION: DEFINITION
Big O:
T(n) = O(f(n)) if there are positive constants c and N such that T(n) ≤ c·f(n) when n ≥ N
This says that function T(n) grows at a rate no faster than f(n); thus f(n) is an upper bound on T(n).
Another way: f(n) is O(g(n)) if there exist numbers c, N > 0 such that for each n ≥ N, f(n) ≤ c·g(n)
The meaning:
f(n) is larger than c·g(n) only for a finite number of n's (those below N)
a constant c and a value N can be found so that for every value of n ≥ N: f(n) ≤ c·g(n)
f(n) does not grow more than a constant factor faster than g(n)
BIG-O NOTATION: ILLUSTRATION
[Graph: for n ≥ N, T(n) stays below c·f(n)]
Prove that 7n³ + 2n² = O(n³)
Proof: for n ≥ 1, 2n² ≤ 2n³, so 7n³ + 2n² ≤ 7n³ + 2n³ = 9n³. Taking c = 9 and N = 1 gives 7n³ + 2n² ≤ c·n³ for all n ≥ N.
BIG-Ω NOTATION: DEFINITION
Big Omega:
T(n) = Ω(f(n)) if there are positive constants c and N
such that T(n) ≥ c f(n) when n ≥ N
This says that function T(n) grows at a rate no slower than f(n); thus f(n) is a lower bound on T(n).
Another way: f(n) is Ω(g(n)) if there exist numbers c, N > 0 such that for each n ≥ N, f(n) ≥ c·g(n)
BIG-Ω NOTATION: ILLUSTRATION
[Graph: for n ≥ N, T(n) stays above c·f(n), i.e. T(n) = Ω(f(n))]
Prove that 2n + 5n² = Ω(n²)
Proof: for n ≥ 1, 2n + 5n² ≥ 5n² ≥ 1·n². Taking c = 1 and N = 1 gives 2n + 5n² ≥ c·n² for all n ≥ N.
TIGHTER LOWER BOUND
BIG-Θ NOTATION: DEFINITION
Big Theta:
T(n) = θ(f(n)) if and only if
T(n) = O(f(n)) and T(n) = Ω(f(n))
This says that function T(n) grows at the same rate as f(n).
Another way: f(n) is θ(g(n)) if there exist numbers c1, c2, N > 0 such that for each n ≥ N, c1·g(n) ≤ f(n) ≤ c2·g(n)
[Graph: for n ≥ N, f(n) lies between c1·g(n) and c2·g(n)]
BIG-Θ NOTATION: EXAMPLE
If two functions f and g are proportional, then f(n) = θ(g(n))
Since log_A n = log_B n / log_B A
Then: log_A n = θ(log_B n)
The base of the log is irrelevant
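The base-change identity can be checked numerically; the `logBase` helper below is an illustrative name, not a library method:

```java
// Numeric illustration that the base of the logarithm only changes a
// constant factor: log_A n = ln n / ln A, so log_2 n / log_10 n is the
// same constant (log_2 10) for every n.
public class LogBase {
    static double logBase(double base, double x) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        for (double n : new double[]{8, 1024, 1e6}) {
            double r = logBase(2, n) / logBase(10, n);
            System.out.printf("n=%.0f  log2(n)/log10(n)=%.6f%n", n, r);
            // the ratio is always log_2 10 ≈ 3.321928, independent of n
        }
    }
}
```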
LITTLE-o NOTATION: DEFINITION
Definition:
T(n) = o(f(n)) if and only if
T(n) = O(f(n)) and T(n) ≠ θ(f(n))
Growth rate      Class name
O(1)             constant
O(log log n)     bi-logarithmic
O(log n)         logarithmic
O(n)             linear
O(n²)            quadratic
O(n³)            cubic
O(n!)            factorial
O(nⁿ)            hyper-exponential
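A small sketch that tabulates a few of these growth functions for increasing n (the chosen values of n are arbitrary); it makes visible how quickly the classes separate:

```java
// Tabulate a few growth functions for increasing n.
public class Growth {
    static double log2(double n) { return Math.log(n) / Math.log(2); }

    public static void main(String[] args) {
        System.out.printf("%6s %8s %10s %8s %22s%n",
                "n", "log2 n", "n log2 n", "n^2", "2^n");
        for (int n : new int[]{8, 16, 32, 64}) {
            System.out.printf("%6d %8.0f %10.0f %8d %22.0f%n",
                    n, log2(n), n * log2(n), n * n, Math.pow(2, n));
        }
    }
}
```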
RULES FOR USING BIG-O
1. Ignoring constants and lower-order terms: for large values of the input n, the constants and terms with lower degree of n are ignored.
Example: 5n³ + 2n² + 7 = O(n³)
2. Addition Rule: ignoring smaller terms.
If O(f(n)) < O(h(n)), then O(f(n) + h(n)) = O(h(n)).
i.e. If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n) + T2(n) = O(max(f(n), g(n)))
Example 1: O(n² + n) = O(n²)
RULES FOR USING BIG-O- CONT.
3. Multiplication Rule:
O(f(n) * h(n)) = O(f(n)) * O(h(n))
Example:
O((n³ + 2n² + 3n log n + 7)(8n² + 5n + 2)) = O(n⁵)
COMPARING GROWTH FUNCTIONS- CONT.
A hierarchy of growth rates:
c < log₄n < log₃n < log₂n < log²n < logᵏn < n < n·log₂n < n² < n³ < 2ⁿ < 3ⁿ < n! < nⁿ
COMPARING GROWTH FUNCTIONS- CONT.
As n increases, the various growth functions diverge dramatically.
ANALYZING LOOP EXECUTION
Loops: for, while, and do-while:
First determine the order of the body of the loop, then multiply that by the number of times the loop will execute

for (int count = 0; count < n; count++)
    // some sequence of O(1) steps

N loop executions times O(1) operations results in O(n) efficiency
ANALYZING LOOP EXECUTION- CONT.
Loops: for, while, and do-while:
Again: complexity is determined by the number of iterations of the loop multiplied by the complexity of the body of the loop.
Examples:

for (int i = 0; i < n; i++)
    sum = sum - i;          // O(n)

i = 1;
while (i < n) {
    sum = sum + i;          // O(log n)
    i = i * 2;
}
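The O(log n) claim for the doubling loop can be confirmed by counting its iterations; this instrumented sketch mirrors the while-loop above:

```java
// Count iterations of the doubling loop to confirm the O(log n)
// behaviour: the count grows by one each time n doubles.
public class DoublingLoop {
    static int iterations(int n) {
        int count = 0;
        int i = 1;
        while (i < n) {     // same loop shape as the slide's example
            count++;
            i = i * 2;
        }
        return count;       // roughly log2(n) iterations
    }

    public static void main(String[] args) {
        for (int n : new int[]{16, 32, 1024})
            System.out.println("n=" + n + "  iterations=" + iterations(n));
    }
}
```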
ANALYZING LOOP EXECUTION- CONT.
We start by considering how to count operations in for-loops.
First, we should know the number of iterations of the loop; say it is x. Then:
The loop condition is executed x + 1 times.
Each of the statements in the loop body is executed x times.
The loop-index update statement is executed x times.

Example:

int sum (int n)
{
    int partial_sum = 0;
    int i;
    for (i = 1; i <= n; i++)
        partial_sum = partial_sum + (i * i);
    return partial_sum;
}

Time units to compute:
1 for the assignment (partial_sum = 0)
Loop statement: 1 assignment, n+1 tests, and n increments
Loop body: n iterations of 3 units each (an assignment, an addition, and a multiplication)
1 for the return statement
Total: 1 + (1 + n + 1 + n) + 3n + 1 = 5n + 4 = O(n)
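The hand count of 5n + 4 can be cross-checked with an instrumented version of the method; where each `ops` increment goes follows the slide's accounting and is an assumption about what counts as one time unit:

```java
// Instrumented version of the sum method: ops tallies time units using
// the same accounting as the hand count, so it lands on exactly 5n + 4.
public class SumCount {
    static long ops;  // time units consumed by the most recent call

    static int sum(int n) {
        ops = 0;
        int partialSum = 0;              ops++;  // 1 for the assignment
        int i = 1;                       ops++;  // 1 for the loop initialisation
        while (true) {
            ops++;                               // loop test: runs n + 1 times
            if (i > n) break;
            partialSum = partialSum + (i * i);
            ops += 3;                            // *, +, and = in the body
            i++;                         ops++;  // increment: runs n times
        }
        ops++;                                   // 1 for the return statement
        return partialSum;
    }

    public static void main(String[] args) {
        int result = sum(10);
        System.out.println("sum(10) = " + result + ", time units = " + ops);
        // 5 * 10 + 4 = 54 time units
    }
}
```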
ANALYZING LOOP EXECUTION- CONT.
Loop example:
Find the exact number of basic operations in the following program fragment:
ANALYZING LOOP EXECUTION- CONT.
Analyzing Nested Loops:
When loops are nested, we multiply the complexity of the outer loop by the complexity of the inner loop

for (int count = 0; count < n; count++)        // n
    for (int count2 = 0; count2 < n; count2++) // n
    {
        // some sequence of O(1) steps
    }
ANALYZING LOOP EXECUTION- CONT.
Nested Loops: Complexity of inner loop * complexity of outer loop
Examples:

sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum += i * j;                  // O(n²)

i = 1;
while (i <= n) {                       // n
    j = 1;
    while (j <= n) {                   // log n
        // statements of constant complexity
        j = j * 2;
    }
    i = i + 1;
}                                      // O(n log n)
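The n log n behaviour of the nested loop above can be confirmed by counting how often the inner-loop body executes:

```java
// Count executions of the inner-loop body in the nested loop: the
// total is n * (floor(log2 n) + 1), i.e. O(n log n).
public class NestedCount {
    static long bodyExecutions(int n) {
        long count = 0;
        int i = 1;
        while (i <= n) {
            int j = 1;
            while (j <= n) {
                count++;        // stands in for the constant-complexity statements
                j = j * 2;
            }
            i = i + 1;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(bodyExecutions(8));    // 8 * 4 = 32
        System.out.println(bodyExecutions(1024)); // 1024 * 11 = 11264
    }
}
```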
ANALYZING LOOP EXECUTION- CONT.
Nested Loops: Complexity of inner loop * complexity of outer loop
Example:

for (int i = 1; i <= n; i++)
    for (int j = 1; j <= m; j++)
        sum = sum + i + j;             // 3mn = O(mn)
ANALYZING SEQUENCE OF STATEMENTS
Consecutive statements: Use Addition rule
These just add, and the maximum is the one that counts
O(s1; s2; s3; …; sk) = O(s1) + O(s2) + O(s3) + … + O(sk) = O(max(s1, s2, s3, …, sk))
Example:

for (int j = 0; j < n * n; j++)
    sum = sum + j;                     // O(n²)
for (int k = 0; k < n; k++)
    sum = sum - k;                     // O(n)
System.out.print("sum is now " + sum); // O(1)

Overall complexity: O(n²)
ANALYZING SEQUENCE OF STATEMENTS- CONT.
Consecutive statements: Use Addition rule
Example:

for (i = 1; i <= n; i++)
    sum = sum + i;                     // O(n)
ANALYZING SEQUENCE OF STATEMENTS- CONT.
Consecutive statements: Use Addition rule
Example:

for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        sum = sum + i + j;             // O(n²)

n² + 1 + n + n = O(n² + 2n + 1) = O(n²)
ANALYZING IF EXECUTION
If Statement: Take the complexity of the most expensive case :
char key;
int[][] A = new int[n][n];
int[][] B = new int[n][n];
int[][] C = new int[n][n];
// ...
if (key == '+') {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];         // O(n²)
} // End of if block
else if (key == 'x')
    C = matrixMult(A, B);                        // O(n³)
else
    System.out.println("Error! Enter '+' or 'x'!"); // O(1)

Overall complexity: O(n³)
ANALYZING IF EXECUTION- CONT.
if (test) s1 else s2
The running time is never more than the running time of the test plus the larger of the running times of s1 and s2
Example:

if (test == 1)                         // O(1)
    for (i = 1; i <= n; i++)
        sum = sum + i;                 // O(n)
else
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)
            sum = sum + i + j;         // O(n²)
ANALYZING SWITCH EXECUTION
Switch: Take the complexity of the most expensive case
char key;
int[] X = new int[n];
int[][] Y = new int[n][n];
// ...
switch (key) {
    case 'a':
        for (int i = 0; i < X.length; i++)       // O(n)
            sum += X[i];
        break;
    case 'b':
        for (int i = 0; i < Y.length; i++)       // O(n²)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block

Overall complexity: O(n²)
ANALYZING METHOD CALLS
The body of a loop may contain a call to a method
To determine the order of the loop body, the order of the method must be taken into
account
The overhead of the method call itself is generally ignored
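A sketch of the rule: a loop over n iterations whose body calls an O(n) method is O(n²) overall. The helper method and counter below are hypothetical names for illustration:

```java
// A loop whose body calls an O(n) method is O(n^2) overall.
public class MethodInLoop {
    static long calls;  // total inner-loop iterations across all calls

    // O(n): touches each of the n values once.
    static long rangeSum(int n) {
        long s = 0;
        for (int k = 1; k <= n; k++) { s += k; calls++; }
        return s;
    }

    // O(n) iterations, each invoking the O(n) method: O(n^2) in total.
    static long sumOfRangeSums(int n) {
        calls = 0;
        long total = 0;
        for (int i = 0; i < n; i++)
            total += rangeSum(n);
        return total;
    }

    public static void main(String[] args) {
        sumOfRangeSums(100);
        System.out.println("inner iterations: " + calls);  // 100 * 100 = 10000
    }
}
```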
EXAMPLES OF ALGORITHMS AND THEIR BIG-O
COMPLEXITY
[Table: Big-O notations and examples of algorithms, shown on the slide]
SUMMARY
Software must make efficient use of resources such as CPU time and memory.
Algorithm analysis is a fundamental computer science topic.
A growth function shows time or space utilization relative to the problem size.
The order of an algorithm is found by eliminating constants and all but the dominant term in the algorithm's growth function.
The order of an algorithm provides an upper bound to the algorithm's growth function.
If the algorithm is inefficient, a faster processor will not help in the long run.
Analyzing algorithm complexity often requires analyzing the execution of loops.
The time complexity of a loop is found by multiplying the complexity of the body of the loop by how many times the loop will execute.
The analysis of nested loops must take into account both the inner and outer loops.
If and switch statements: take the complexity of the most expensive case.
If there are function calls, these must be analyzed first.
ANY QUESTIONS?
REFERENCES
Java Software Structures: Designing and Using Data Structures, 4th edition, Lewis and Chase
Data Structures and Problem Solving Using Java, 4th edition, Weiss
Data Structures & Algorithms in Java, 3rd edition, Drozdek
Data Structures and Algorithm Analysis in Java, 3rd edition, Weiss
Algorithms, 4th edition, Sedgewick and Wayne
A Practical Introduction to Data Structures and Algorithm Analysis, 3rd edition, Shaffer
Data Structures and Algorithms in Java, 6th edition, Goodrich, Tamassia and Goldwasser
Slides at: http://faculty.kfupm.edu.sa/ICS/jauhar/ics202/