
ALGORITHM & COMPLEXITIES

Algorithm Design Techniques

Name: Dr. Salama A Mostafa


Address: FSKTM, 4th floor, Room No. 10
Email: salama@uthm.edu.my
Course Code: BIE 20303
Course Coordinator: Prof Dr. Mustafa Mat Deris
Algorithm Design Techniques
• Well known algorithm design techniques
• Brute Force
• Greedy
• Divide and Conquer
• Dynamic Programming
Algorithm Design Techniques
• Brute Force Technique
• It is a straightforward and simple technique for solving problems
• It covers a wide range of problems
• It is costly
• Related Algorithms
• Selection sort
• Bubble sort
• Linear search
Algorithm Design Techniques
• Brute Force Technique
• Sorting algorithms
• All sorting algorithms start with the given input array of data [the unsorted array] and, after
each iteration, extend the sorted part of the array by one cell. A sorting algorithm
terminates when the sorted part of the array equals the size of the array.
• Selection Sort
• The basic idea is to find the smallest number in the array and swap it with the leftmost
cell of the unsorted array, thereby increasing the sorted part of the array by one more cell.
• Step 1: Input the array to be sorted
• Step 2: Find the smallest element in the array
• Step 3: Swap the smallest element with the first element in the array
• Step 4: Repeat steps 2 & 3 for the remaining unsorted elements
• Step 5: Stop
Algorithm Design Techniques
• Brute Force Technique
• Selection Sort
• To sort the given array a[1…..n] in ascending order
1. begin
2. For i=1 to n-1 do
2.1 set min =i
2.2 For j = i+1 to n do
2.2.1 if (a[j] < a[min]) then set min = j
2.3 if (i≠ min) then swap a[i] and a[min]
3. end
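The pseudocode above can be sketched in Python (a runnable illustration, not part of the original slides; the function name is our own):

```python
def selection_sort(a):
    """Sort list a in ascending order by repeatedly finding the
    smallest element of the unsorted part and swapping it to the
    front (steps 2.1-2.3 of the pseudocode)."""
    n = len(a)
    for i in range(n - 1):            # outer loop: for i = 1 to n-1
        min_idx = i                   # set min = i
        for j in range(i + 1, n):     # inner loop: for j = i+1 to n
            if a[j] < a[min_idx]:
                min_idx = j
        if i != min_idx:              # swap a[i] and a[min]
            a[i], a[min_idx] = a[min_idx], a[i]
    return a

selection_sort([29, 10, 14, 37, 13])  # -> [10, 13, 14, 29, 37]
```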
Algorithm Design Techniques
• Selection Sort
• Analysis
• Number of times the inner loop will get executed = n-(i+1)+1 = n-i
• Number of basic operations done inside the outer loop = number of basic operations done
inside it (inner ‘for loop’ + ‘if’ statement)
• In the worst case, the if statement will do 1 swap operation every time
• ∴ number of basic operations done per iteration of the outer loop = [n-(i+1)+1] + 1 = n-i+1
• Big-oh
• The total number of operations done by the inner loop over all iterations is
(n-1+1) + (n-2+1) + ……+ (n-(n-1)+1)
= n + (n-1) + (n-2) + …..+ 2
= n(n+1)/2 – 1 = O(n²)
Algorithm Design Techniques
• Brute Force Technique
• Bubble Sort
• It is one of the simplest sorting techniques
• Usually, the bubble sort is used when the problem size is small
• The basic idea is to compare adjacent elements of an array and exchanges them if they are not in
order. After each iteration (pass) the largest element bubbles up to the last position of the array. In
the next iteration the second largest element bubbles up to the second last position and so on.
• Step 1: Input the array to be sorted
• Step 2: Starting with the first element(index = 0)
• Step 3: Compare the current element with the next element of the array
• Step 4: If the current element is greater than the next element, swap them;
If the current element is less than the next element, move to the next element
• Step 5: Repeat step 3 & 4 for the remaining elements
• Step 6: Stop
Algorithm Design Techniques
• Brute Force Technique
• Bubble Sort
• To sort the given array a[1…..n] in ascending order
1. begin
2. For i = n-1 down to 1 do
2.1 For j = 1 to i do
2.1.1 if (a[j] > a[j+1]) then swap a[j] and a[j+1]
3. end
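A minimal Python sketch of bubble sort (an illustration added here, not part of the original slides): each outer pass bubbles the largest remaining element to the end of the unsorted part.

```python
def bubble_sort(a):
    """Bubble sort: compare adjacent elements and swap them if
    they are out of order; after each pass the largest remaining
    element has bubbled up to its final position."""
    n = len(a)
    for i in range(n - 1, 0, -1):     # for i = n-1 down to 1
        for j in range(i):            # compare adjacent pair a[j], a[j+1]
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

bubble_sort([5, 1, 4, 2, 8])  # -> [1, 2, 4, 5, 8]
```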
Algorithm Design Techniques
• Bubble Sort
• Analysis
• The basic operations done inside the inner loop are one comparison and one swapping
• Number of times the basic operations inside the inner loop will be executed = (i-1) + 1 = i
• Big-oh
• The total number of operations done by the outer loop is
= 1 + 2 + …..+ (n-1)
= n(n-1)/2 = O(n²)
Algorithm Design Techniques
• Brute Force Technique
• Search Algorithm
• It is any algorithm which solves the search problem, namely, to retrieve information
stored within some data structure, or calculated in the search space of a problem domain,
either with discrete or continuous values
• Searching algorithms are designed to check for an element or retrieve an element from
any data structure where it is stored
• There are different searching techniques like
• Linear search
Algorithm Design Techniques
• Brute Force Technique
• Linear search
• Also called a sequential search, as it compares the successive elements of the given set
with the search key
• To find which element (if any) of the given array a[1…..n] equals the target element:
1. begin
2. For i=1 to n do
2.1 if (target = a[i]) then end with output as i
3. End with output as none
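The same sequential scan in Python (a sketch added for illustration; the function name is our own):

```python
def linear_search(a, target):
    """Compare successive elements of a with the search key;
    return the index of the first match, or None if absent."""
    for i, value in enumerate(a):
        if value == target:
            return i                  # found: end with output i
    return None                       # end with output "none"

linear_search([4, 2, 7, 1], 7)  # -> 2
```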
Algorithm Design Techniques
• Linear Search
• Analysis
• The basic operation here is comparison
• Worst case complexity
• Step 2.1 performs 1 comparison and this step executes n times. So, the worst case is O(n)
• Average case complexity
• Assume that every element in the given array is equally likely to be the target (p1 = p2 = … = pn = 1/n). If the
target is at position 1, then the number of comparisons to be done is 1; if it is at position 2, then the number of
comparisons is 2, and so on.
• Thus, calculating the average (mean or expected) number of comparisons:
∑ i·pi = 1·(1/n) + 2·(1/n) + ….. + n·(1/n)
= (1/n) + (2/n) + ….. + (n/n) = (1 + 2 + ….. + n)/n
= [n(n+1)/2] · (1/n)
= (n+1)/2 = O(n)
Algorithm Design Techniques
• Greedy Technique
• Greedy algorithm
• It is primarily used in optimization problems
• In optimization problems we try to find the best of all possible solutions [either maximum or
minimum values]
• The greedy algorithm helps in constructing a solution for a problem through a sequence
of steps, where each step produces a partial solution. This partial solution is
extended progressively to get the complete solution.
• It does this by making the locally best choice available at each step
• The choice of each step in a greedy algorithm is done based on the following:
• It must be feasible
• It must be locally optimal
• It must be irrevocable
Algorithm Design Techniques: Tutorial
• Greedy algorithm
• Example 1: An Activity Selection Problem
• It is a slight variant of the problem of scheduling a resource among several competing
activities.
• Suppose that we have a set S ={1,2,…..,n} of n events that wish to use an auditorium
which can be used by only one event at a time. Each event i has a start time si and a finish
time fi where si≤fi. An event i if selected can be executed anytime on or after si and must
necessarily end before fi. Two events i and j are said to be compatible if they do not
overlap (meaning si≥fj or sj ≥ fi). The activity selection problem is to select a maximum
subset of compatible activities.
Algorithm Design Techniques: Tutorial
• Greedy algorithm
• Solution: An Activity Selection Problem
• To find the maximum subset of mutually compatible activities from a given set of n activities. The
given set of activities are represented as two arrays s and f denoting the set of all start time and set
of all finish time respectively
1. begin
2. Set A = {1}, j = 1
3. For i=2 to n do
3.1 if (si ≥ fj) then A = A ∪ {i}, j = i
4. End with output as A
• The activity selection problem works under the assumption that the input array f is sorted in
increasing order of finishing time. The greedy approach is highlighted in step 3.1, where we select
the next activity such that its start time is greater than or equal to the finish time
of the preceding activity in the set A. We do not need to compare the start time of the next activity
with all the elements of A, because the finishing times in f are in increasing order
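The selector above can be sketched in Python (an illustration added here; activity indices stand in for the names p…z used in the next example):

```python
def select_activities(s, f):
    """Greedy activity selection. Assumes the finish times f are
    sorted in increasing order; s holds the matching start times.
    Returns the indices of a maximum set of compatible activities."""
    A = [0]                        # always select the first activity
    j = 0                          # index of the last selected activity
    for i in range(1, len(s)):
        if s[i] >= f[j]:           # compatible with the last selected one
            A.append(i)
            j = i
    return A

# The 11 activities of Example 2: (1,4), (3,5), (2,6), ... , (12,14)
s = [1, 3, 2, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
select_activities(s, f)  # -> [0, 3, 7, 10], i.e. activities p, s, w, z
```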
Algorithm Design Techniques: Tutorial
• Greedy algorithm
• Analysis
• To find the maximum subset of mutually compatible activities from a given set of n
activities. The given set of activities are represented as two arrays s and f denoting the set of
all start time and set of all finish time respectively.
1. begin
2. Set A = {1}, j = 1
3. For i=2 to n do
3.1 if (si ≥ fj) then A = A ∪ {i}, j = i
4. End with output as A
• The analysis of the activity selection problem is given below:
• Worst case complexity : The basic operation here is comparison. Step 3.1 performs 1 comparison
and this step is repeated (n-1) times. So the worst case complexity of this algorithm is 1 * (n-1) =
O(n)
Algorithm Design Techniques
• Greedy algorithm
• Example 2: An Activity Selection Problem
• Let 11 activities be given, S = {p, q, r, s, t, u, v, w, x, y, z}; the start and finish times of the proposed activities are (1, 4), (3,
5), (2, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11), (8, 12), (2, 13) and (12, 14).
greedy-activity-selector (s, f)
1. Begin
2. Set A = {1}, j = 1
3. For i = 2 to n do
3.1 if (si ≥ fj) then A = A ∪ {i}, j = i
4. End with output as A
• Tracing the algorithm on the given activities:
A = {p}            initialization at step 2
A = {p, s}         1st addition inside the for-loop
A = {p, s, w}      2nd addition inside the for-loop
A = {p, s, w, z}   3rd addition inside the for-loop
out of the for-loop; return A = {p, s, w, z}
Algorithm Design Techniques
• Greedy algorithm
• Example 3: Making Change
• Make change for n units using the least possible number of coins
• Make change for 2.85 ringgit (285 cents); here n = 285 and the solution contains two one-ringgit coins (100 cents each),
one fifty-cent coin, one twenty-cent coin, one ten-cent coin and one five-cent coin. The algorithm is greedy because at
every stage it chooses the largest coin without worrying about the consequences. Moreover, it never changes its mind:
once a coin has been included in the solution set, it remains there.
make-change (n)
        C ← {100, 50, 20, 10, 5}    // coin denominations (constant)
        sol ← {}                    // set that will hold the solution set
        sum ← 0                     // sum of the items in the solution set
        while sum ≠ n do
            x ← largest item in C such that sum + x ≤ n
            if no such item then
                return "No Solution"
            sol ← sol ∪ {a coin of value x}
            sum ← sum + x
        return sol
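A Python sketch of the greedy change-maker (added for illustration; it returns the list of coins rather than their sum, and `None` stands in for "No Solution"):

```python
def make_change(n, coins=(100, 50, 20, 10, 5)):
    """Greedy coin change: repeatedly take the largest coin that
    does not overshoot the remaining amount n (in cents)."""
    sol = []                      # coins chosen so far
    total = 0                     # their running sum
    while total != n:
        candidates = [c for c in coins if total + c <= n]
        if not candidates:
            return None           # "No Solution"
        x = max(candidates)       # greedy choice: largest usable coin
        sol.append(x)
        total += x
    return sol

make_change(285)  # -> [100, 100, 50, 20, 10, 5]
```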
Algorithm Design Techniques
• Divide and Conquer Technique
• It is an algorithm design paradigm based on multi-branched recursion
• It breaks down a problem into two or more sub-problems of the same or related
type, until these become simple enough to be solved directly
• It then combines the sub-solutions to give a solution to the original problem
• It produces efficient algorithms
• Related Algorithms
• Quicksort
• Merge sort
• Binary search
Algorithm Design Techniques
• Divide and Conquer Technique
• Quicksort
• It picks an element as pivot and partitions the given array around the picked pivot
• There are many different versions of QuickSort that pick pivot in different ways
• Always pick first element as pivot
• Always pick last element as pivot (presented in the following)
• Pick a random element as pivot
• Pick median as pivot
Algorithm Design Techniques
• Quicksort
• The key process in quickSort is partition(). Given an array and an element x of the array as
pivot, partition() puts x at its correct position in the sorted array, with all smaller
elements (smaller than x) before x and all greater elements (greater than x) after x.
All this is done in linear time.
• To sort the given array a[1…..n] in ascending order
quickSort(arr[], low, high){
    if (low < high){
        /* pi is partitioning index, arr[pi] is now at right place */
        pi = partition(arr, low, high);
        quickSort(arr, low, pi - 1);   // Before pivot
        quickSort(arr, pi + 1, high);  // After pivot
    }
}
partition(arr[], low, high){
    pivot = arr[high];  // pivot (element to be placed)
    i = low - 1;        // index of smaller element
    for (j = low; j <= high - 1; j++){
        if (arr[j] < pivot){  // if current element is smaller
            i++;              // increment index of smaller element
            swap arr[i] and arr[j]
        }
    }
    swap arr[i + 1] and arr[high]
    return (i + 1)
}
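The same last-element-pivot scheme in runnable Python (an illustration added to the slides):

```python
def partition(arr, low, high):
    """Place arr[high] (the pivot) at its final sorted position;
    smaller elements end up to its left, greater to its right."""
    pivot = arr[high]
    i = low - 1                          # index of smaller element
    for j in range(low, high):
        if arr[j] < pivot:               # current element is smaller
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low, high):
    """Sort arr[low..high] in place, recursing on both sides of the pivot."""
    if low < high:
        pi = partition(arr, low, high)   # pivot now at index pi
        quick_sort(arr, low, pi - 1)     # before pivot
        quick_sort(arr, pi + 1, high)    # after pivot
    return arr

quick_sort([10, 80, 30, 90, 40, 50, 70], 0, 6)
# -> [10, 30, 40, 50, 70, 80, 90]
```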
Algorithm Design Techniques
• Quicksort
• Analysis
• Worst Case: The worst case occurs when the partition process always picks the greatest or
smallest element as pivot. If we consider the above partition strategy, where the last element
is always picked as pivot, the worst case occurs when the array is already sorted in
increasing or decreasing order. The total cost in the worst case is
f(n) = (n-1) + (n-2) + ….. + 1
= n(n-1)/2 = O(n²)
• Best Case: The best case occurs when the partition process always picks the middle
element as pivot, which results in O(n log n).
• Average Case: We can get an idea of the average case by considering the case when partition
puts O(n/10) elements in one set and O(9n/10) elements in the other set. The recurrence for
this case also solves to O(n log n).
Algorithm Design Techniques
• Divide and Conquer Technique
• Merge Sort
• It divides input array in two halves, calls itself for the two halves and then merges the two sorted halves
• The merge(arr, l, m, r) function is the key process; it assumes that arr[l..m] and arr[m+1..r] are sorted
and merges the two sorted sub-arrays into one
• To sort the given array a[1…..n] in ascending order
MergeSort(arr[], l, r)
If r > l
1. Find the middle point to divide the array into two halves:
middle m = (l+r)/2
2. Call mergeSort for first half:
Call mergeSort(arr, l, m)
3. Call mergeSort for second half:
Call mergeSort(arr, m+1, r)
4. Merge the two halves sorted in steps 2 and 3:
Call merge(arr, l, m, r)
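The steps above can be sketched in Python (an illustration added to the slides; this version returns a new sorted list rather than sorting in place):

```python
def merge_sort(arr):
    """Divide the array into two halves, sort each half recursively,
    then merge the two sorted halves into one sorted list."""
    if len(arr) <= 1:                 # base case: already sorted
        return arr
    m = len(arr) // 2                 # 1. find the middle point
    left = merge_sort(arr[:m])        # 2. sort the first half
    right = merge_sort(arr[m:])       # 3. sort the second half
    merged, i, j = [], 0, 0           # 4. merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

merge_sort([38, 27, 43, 3, 9, 82, 10])  # -> [3, 9, 10, 27, 38, 43, 82]
```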
Algorithm Design Techniques
• Merge Sort
• Analysis
• Worst Case: O(n log n)
• Best Case: O(n Log n)
• Average Case: O(n log n)
Algorithm Design Techniques
• Divide and Conquer Technique
• Binary search
• Search a sorted array by repeatedly dividing the search interval in half.
• If the value of the search key is less than the item in the middle of the interval, narrow the
interval to the lower half. Otherwise narrow it to the upper half.
• Repeatedly check until the value is found or the interval is empty.
• To find which element (if any) of the given array a[1…..n] equals the target element:
• Compare x with the middle element.
• If x matches with middle element, we return the mid index.
• Else If x is greater than the mid element, then x can only lie
in right half subarray after the mid element. So we recur for right half
• Else (x is smaller) return for the left half
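An iterative Python sketch of the halving steps above (added for illustration; the recursive form described in the slides behaves the same way):

```python
def binary_search(a, x):
    """Binary search on a sorted list a: repeatedly halve the
    search interval until x is found or the interval is empty.
    Returns the index of x, or None if x is absent."""
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2
        if a[mid] == x:           # x matches the middle element
            return mid
        elif a[mid] < x:          # x can only lie in the right half
            low = mid + 1
        else:                     # x can only lie in the left half
            high = mid - 1
    return None

binary_search([2, 5, 8, 12, 16, 23, 38, 56, 72, 91], 23)  # -> 5
```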
Algorithm Design Techniques
• Binary search
• Analysis
• The basic operation here is comparison
• The time complexity of Binary Search can be written as T(n) = T(n/2) + c
• Best case complexity: O(1)
• Average case complexity: O(log n)
• Worst case complexity: O(log n)
Algorithm Design Techniques
• Sort Algorithms
• Complexity summary (from the analyses above):
• Selection sort: O(n²)
• Bubble sort: O(n²)
• Quicksort: O(n log n) best/average case, O(n²) worst case
• Merge sort: O(n log n) in all cases

Thank You
