
Data Structures and Algorithms

Lecture 5: Recursion – Simple Sorting Techniques



Recursion Outline
• Recursion
• What is Recursion?
• Types of Recursion
• Linear Recursion
• Ex.1: Triangular numbers
• Ex.2: Factorials
• Ex.3: Summing the Elements of an Array Recursively
• Ex.4: Reversing a Sequence with Recursion
• Ex.5: Binary search

• Binary Recursion
• Ex.1: Summing the elements of a sequence using binary recursion

• Mutual Recursion



What is a recursive method?

• A method that calls itself.
• With each recursive call the problem becomes simpler.
• It must have a condition that leads to a base case, in which the method no longer makes another method call on itself.



Types of Recursion
• Linear Recursion: if a recursive call starts at most one other call, we call this linear recursion.
• Binary Recursion: if a recursive call may start two other calls, we call this binary recursion.
• Multiple Recursion: if a recursive call may start three or more other calls, this is multiple recursion.
• Mutual Recursion: occurs when two routines call each other.



Linear Recursion
1- Triangular Numbers

• In the triangular number series {1, 3, 6, 10, 15, 21, …}, the nth term is obtained by adding n to the previous term.



Finding the nth Term Using a Loop:

int triangle(int n)
{
    int total = 0;
    while(n > 0)            // loop until n reaches 0
    {
        total = total + n;  // add n (column height) to total
        --n;                // decrement column height
    }
    return total;
}



Finding the nth Term Using Recursion:

int triangle(int n)
{
    if(n==1)
        return 1;
    else
        return( n + triangle(n-1) );
}

• The condition that leads to a recursive method returning without making another recursive call is referred to as the base case. It's critical that every recursive method have a base case to prevent infinite recursion.



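A sketch of how the calls unwind (the recursion trace) for triangle(4):

triangle(4) = 4 + triangle(3)
            = 4 + (3 + triangle(2))
            = 4 + (3 + (2 + triangle(1)))
            = 4 + (3 + (2 + 1))
            = 10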



Is Recursion Efficient?
• Calling a method involves certain overhead. Control must be transferred from the location of the call to the beginning of the method. In addition, the arguments to the method and the address to which the method should return must be pushed onto an internal stack so that the method can access the argument values and know where to return.
• Another inefficiency is that memory is used to store all the intermediate arguments and return values on the system's internal stack. This may cause problems if there is a large amount of data, leading to stack overflow.
• Recursion is usually used because it simplifies a problem conceptually, not because it's inherently more efficient.



Linear Recursion
2- Factorials

int factorial(int n)
{
    if(n==0)
        return 1;
    else
        return (n * factorial(n-1) );
}



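Similarly, a sketch of the recursion trace for factorial(4):

factorial(4) = 4 * factorial(3)
             = 4 * (3 * factorial(2))
             = 4 * (3 * (2 * factorial(1)))
             = 4 * (3 * (2 * (1 * factorial(0))))
             = 4 * (3 * (2 * (1 * 1)))
             = 24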



Linear Recursion
3- Summing the Elements of an Array Recursively

int linearSum(int[] data, int n)
{
    if (n == 0)
        return 0;
    else
        return linearSum(data, n-1) + data[n-1];
}



Linear Recursion
4- Reversing a Sequence with Recursion

void reversearray( int[] data, int low, int high )
{
    if(low < high)
    {
        int temp = data[low];               // swap the two end elements
        data[low] = data[high];
        data[high] = temp;
        reversearray(data, low+1, high-1);  // recur on the shrinking inner range
    }
}
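A minimal usage sketch (the array contents here are made up for illustration): to reverse an entire array, pass the first and last indices.

int[] data = {1, 2, 3, 4, 5};             // hypothetical sample data
reversearray(data, 0, data.length - 1);   // data becomes {5, 4, 3, 2, 1}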
Linear Recursion
5- Binary Search
• Binary search is used to efficiently locate a target value within a sorted sequence of n elements stored in an array. It is among the most important computer algorithms, and it is the reason we so often store data in sorted order.

• Linear search: O(n); binary search: O(log n).
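The slides do not include the code itself; a minimal recursive binary search sketch consistent with the description above (method name and parameters are illustrative, not from the source):

boolean binarySearch(int[] data, int target, int low, int high)
{
    if (low > high)                        // empty range: target is not present
        return false;
    int mid = (low + high) / 2;
    if (target == data[mid])               // found the target
        return true;
    else if (target < data[mid])           // recur on the left half
        return binarySearch(data, target, low, mid - 1);
    else                                   // recur on the right half
        return binarySearch(data, target, mid + 1, high);
}

Each call discards half of the remaining range, which is what gives the O(log n) running time.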
Binary Recursion
Summing the elements of a sequence using binary recursion
• To sum the n integers of an array, recursively compute the sum of the first half and the sum of the second half, then add those sums together.
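A sketch of binarySum along these lines (the slides describe the idea but do not show the code; the parameter names low and high match the trace discussed below):

int binarySum(int[] data, int low, int high)
{
    if (low > high)                  // zero elements in the range
        return 0;
    else if (low == high)            // one element in the range
        return data[low];
    else
    {
        int mid = (low + high) / 2;  // split the range in half
        return binarySum(data, low, mid) + binarySum(data, mid + 1, high);
    }
}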



• To analyze algorithm binarySum, we consider, for simplicity, the case where n is a power of two. Figure 5.13 shows the recursion trace of an execution of binarySum(data, 0, 7). We label each box with the values of parameters low and high for that call. The size of the range is divided in half at each recursive call, so the depth of the recursion is 1 + log2 n.



Mutual Recursion
• Mutual recursion is a variation of recursion in which two functions call each other: the first function makes a recursive call to the second function, and the second function, in turn, calls the first one.
• In software development this concept appears in circular dependencies, where two or more modules directly or indirectly depend on each other to function properly. Such modules are also known as mutually recursive.
• A classic example of mutual recursion is the Hofstadter Female and Male sequences. https://www.geeksforgeeks.org/mutual-recursion-example-hofstadter-female-male-sequences/
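As an illustration, a small sketch of the Hofstadter Female and Male sequences, defined by F(0) = 1, M(0) = 0, F(n) = n - M(F(n-1)), and M(n) = n - F(M(n-1)); the method names are illustrative:

int hofstadterFemale(int n)
{
    if (n == 0)
        return 1;                                  // base case for F
    return n - hofstadterMale(hofstadterFemale(n - 1));
}

int hofstadterMale(int n)
{
    if (n == 0)
        return 0;                                  // base case for M
    return n - hofstadterFemale(hofstadterMale(n - 1));
}

Each method calls the other, so the two definitions can only be understood together: this is mutual recursion.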

Simple Sorting Techniques - Outline
• Simple sorting techniques:
1. Bubble sort
2. Selection sort
3. Insertion sort
• Stability
• Comparing the simple sorting techniques



Bubble Sort
• Bubble sort repeatedly steps through the array, compares each pair of adjacent items, and swaps them if they are out of order. After each pass the largest remaining item "bubbles up" to its final position at the end of the array, so the next pass can stop one position earlier.
void bubbleSort()
{
    int out, in;
    for(out=nElems-1; out>0; out--)   // outer loop (backward)
        for(in=0; in<out; in++)       // inner loop (forward)
            if( a[in] > a[in+1] )     // out of order?
                swap(in, in+1);       // swap them
}  // end bubbleSort()
//--------------------------------------------------------------
void swap(int one, int two)
{
    long temp = a[one];
    a[one] = a[two];
    a[two] = temp;
}
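The sort methods on these slides refer to an array a and an item count nElems that are not declared anywhere in this lecture; they are presumably instance fields of a surrounding array class. A minimal wrapper along those lines (class and method names here are hypothetical):

class SortArray
{
    private long[] a;                  // the array being sorted
    private int nElems;                // number of items currently stored

    SortArray(int max)                 // create an empty array of a given capacity
    {
        a = new long[max];
        nElems = 0;
    }

    void insert(long value)            // append an item at the end
    {
        a[nElems++] = value;
    }

    void display()                     // print the current contents
    {
        for (int i = 0; i < nElems; i++)
            System.out.print(a[i] + " ");
        System.out.println();
    }

    // bubbleSort(), selectionSort(), insertionSort(), and swap() from the
    // slides would be added here as instance methods of this class.
}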



Efficiency of the Bubble Sort

• If N is the number of items in the array, there are N-1 comparisons on the first pass, N-2 on the second, and so on. The formula for the sum of such a series is:
(N-1) + (N-2) + (N-3) + ... + 1 = N*(N-1)/2
• Thus, the algorithm makes about N²/2 comparisons.
• There are fewer swaps than there are comparisons, because two items are swapped only if they need to be. If the data is random, a swap is necessary about half the time, so there will be about N²/4 swaps.
• Both swaps and comparisons are proportional to N². Because constants don't count in Big O notation, we can ignore the 2 and 4 and say that the bubble sort runs in O(N²) time.
Selection Sort
• On each pass, selection sort scans the unsorted part of the array to find the smallest remaining item and then swaps it into the next position of the sorted part at the front of the array.
void selectionSort()
{
    int out, in, min;
    for(out=0; out<nElems-1; out++)    // outer loop
    {
        min = out;                     // minimum
        for(in=out+1; in<nElems; in++) // inner loop
            if(a[in] < a[min] )        // if min greater,
                min = in;              // we have a new min
        swap(out, min);                // swap them
    }  // end for(out)
}  // end selectionSort()
//--------------------------------------------------------------
void swap(int one, int two)
{
    long temp = a[one];
    a[one] = a[two];
    a[two] = temp;
}
Efficiency of the Selection Sort

• The selection sort improves on the bubble sort by reducing the number of swaps necessary from O(N²) to O(N).
• Unfortunately, the number of comparisons remains O(N²).
• For large values of N, the comparison times will dominate, so we would have to say that the selection sort runs in O(N²) time, just as the bubble sort did.
• However, the selection sort can still offer a significant improvement for large records that must be physically moved around in memory, making the swap time much more important than the comparison time.


Insertion Sort
• In this technique we pick an element and then insert it at the appropriate position in ascending or descending order.
• In pass 1, element a[1] is inserted before or after a[0], so that a[0] and a[1] are sorted.
• In pass 2, a[2] is inserted before a[0], between a[0] and a[1], or after a[1], so that a[0], a[1], and a[2] are sorted.
• In a similar way the process is carried out N-1 times.
• Note that this kind of partial sorting does not occur in the bubble sort and selection sort. In those algorithms the sorted group of items is completely sorted and in its final positions at any given time; in the insertion sort the group to the left of the marker is only partially sorted, because its items may still shift when later elements are inserted.


void insertionSort()
{
    int in, out;
    for(out=1; out<nElems; out++)      // out is dividing line
    {
        long temp = a[out];            // remove marked item
        in = out;                      // start shifts at out
        while(in>0 && a[in-1] > temp)  // until one is smaller,
        {
            a[in] = a[in-1];           // shift item right,
            --in;                      // go left one position
        }
        a[in] = temp;                  // insert marked item
    }  // end for
}  // end insertionSort()


Efficiency of the Insertion Sort
• On the first pass, it compares a maximum of one item. On the second pass, it's a maximum of two items, and so on, up to a maximum of N-1 comparisons on the last pass. This is
1 + 2 + 3 + ... + N-1 = N*(N-1)/2
• The number of copies is approximately the same as the number of comparisons. However, a copy isn't as time-consuming as a swap.
• So for random data this algorithm still runs in O(N²) time, but it's about twice as fast as the bubble sort and somewhat faster than the selection sort. For data that is already almost sorted, the insertion sort does much better, approaching O(N) time.


Stability
• A sorting algorithm is said to be stable if two objects with equal keys appear in the same order in the sorted output as they appear in the input array. Some sorting algorithms are stable by nature, such as insertion sort, merge sort, and bubble sort, while others, such as heap sort and quick sort, are not.
• For example, if employee records already ordered alphabetically by name are re-sorted by zip code with a stable sort, records that share a zip code remain in alphabetical order.
• Of the three simple sorts, bubble sort and insertion sort are stable. Selection sort, as implemented here with a swap of the minimum into place, is not stable, although it can be made stable with extra work.


Comparing the Simple Sorts
• Simplicity:
• The bubble sort is the simplest of the three. Even so, it's practical only if the amount of data is small.
• The selection sort minimizes the number of swaps, but the number of comparisons is still high. This sort might be useful when the amount of data is small and swapping data items is very time-consuming compared with comparing them.
• The insertion sort is the most versatile of the three and is the best bet in most situations, assuming the amount of data is small or the data is almost sorted.
• Memory requirements:
• All three simple sort algorithms (bubble, selection, and insertion) sort in place, meaning that, besides the initial array, very little extra memory is required.
• All three sorts need only an extra variable to hold an item temporarily while it is being moved.
• Stability:
• Bubble sort and insertion sort are stable; selection sort, as implemented here, is not.


Technique              Bubble           Selection        Insertion
# comparisons          ~N²/2            ~N²/2            ~N²/4
# swaps / copies       ~N²/4 swaps      N swaps          ~N²/4 copies
Complexity             O(N²)            O(N²)            O(N²)
Stability              Stable           Not stable       Stable
Memory requirements    One extra        One extra        One extra
                       temporary        temporary        temporary
                       variable         variable         variable
