
DEF: An algorithm is a step-by-step procedure that defines a set of instructions to be executed in a certain order to get the desired output.

Characteristics of an Algorithm

Not all procedures can be called an algorithm. An algorithm should have the following
characteristics −
• Unambiguous − The algorithm should be clear and unambiguous. Each of its steps (or phases), and their inputs/outputs, should be clear and must lead to only one meaning.
• Input − An algorithm should have 0 or more well-defined inputs.
• Output − An algorithm should have 1 or more well-defined outputs, which should match the desired output.
• Finiteness − The algorithm must terminate after a finite number of steps.
• Feasibility − The algorithm should be feasible with the available resources.
• Independent − An algorithm should have step-by-step directions that are independent of any programming code.
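For example, the following small Python sketch (added for illustration, not part of the original notes) satisfies these characteristics: it has a well-defined input, a well-defined output, unambiguous steps, and it terminates after a finite number of steps.

```python
def find_max(values):
    """Return the largest element of a non-empty list (input: a list, output: one value)."""
    largest = values[0]          # start with the first element
    for v in values[1:]:         # examine every remaining element exactly once (finiteness)
        if v > largest:          # unambiguous comparison step
            largest = v
    return largest               # well-defined output

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```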
Analysis of algorithms
In theoretical analysis of algorithms, it is common to estimate their complexity in the
asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. The
term "analysis of algorithms" was coined by Donald Knuth.
• Algorithm analysis is an important part of computational complexity theory, which provides theoretical estimates of the resources an algorithm requires to solve a specific computational problem. Most algorithms are designed to work with inputs of arbitrary length. Analysis of algorithms is the determination of the amount of time and space resources required to execute them.
• Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps, known as time complexity, or to the volume of memory, known as space complexity.
Analysis of an algorithm is the process of analyzing its problem-solving capability in terms of the time and space required (the size of memory needed for storage during execution). However, the main concern of analysis of algorithms is the required time or performance. Generally, we perform the following types of analysis −
• Worst-case − The maximum number of steps taken on any instance of size n.
• Best-case − The minimum number of steps taken on any instance of size n.
• Average-case − The average number of steps taken over all instances of size n.
• Amortized − The cost of a sequence of operations applied to an input of size n, averaged over the sequence.
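As a hedged illustration (the function below is an added example, not taken from these notes), linear search makes the cases concrete: the best case is 1 comparison (the key is the first element), the worst case is n comparisons (the key is last or absent), and the average case is about n/2 comparisons for a key in a random position.

```python
def linear_search(items, key):
    """Return the index of key in items, or -1; the number of comparisons drives the analysis."""
    for i, item in enumerate(items):
        if item == key:          # best case: hit on the very first comparison
            return i
    return -1                    # worst case: all n elements were compared

data = [5, 3, 8, 1, 9]
print(linear_search(data, 5))    # best case: 1 comparison
print(linear_search(data, 9))    # worst case: n comparisons
```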
Asymptotic analysis of complexity bounds - best, average and worst-case behavior:

Asymptotic analysis of an algorithm refers to defining mathematical bounds on its run-time performance. Using asymptotic analysis, we can conclude the best-case, average-case, and worst-case scenarios of an algorithm.
Asymptotic analysis is input-bound, i.e., if there is no input to the algorithm, it is concluded to work in constant time. Other than the input, all other factors are considered constant.
Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation. For example, the running time of one operation may be computed as f(n) while for another operation it is computed as g(n²). This means the running time of the first operation will increase linearly with the increase in n, and the running time of the second operation will increase quadratically as n increases. Similarly, the running times of both operations will be nearly the same if n is significantly small.
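A quick numeric sketch (added for illustration only) shows why the distinction matters mainly for large n.

```python
def f(n):          # linear cost model
    return n

def g(n):          # quadratic cost model
    return n * n

for n in (2, 10, 1000):
    print(n, f(n), g(n))   # at n = 2 the values are close; at n = 1000 they differ by a factor of 1000
```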
Usually, the time required by an algorithm falls under three types −
• Best Case − Minimum time required for program execution.
• Average Case − Average time required for program execution.
• Worst Case − Maximum time required for program execution.
Performance measurement of algorithms:

By measuring the performance of an algorithm, we can determine which of two algorithms is better. The performance of an algorithm is usually expressed using Big O notation.

As a running example, suppose we are searching for a particular file in a cabinet containing n files.
O(n) — Linear Time: Approach: we have to look at each and every file in the cabinet until we find our file.
O(1) — Constant Time: Approach: open the cabinet and pick the file directly from its known location. End of story.
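Translating the cabinet analogy into code (a hedged sketch with hypothetical file names): scanning every file is O(n), while picking a file from a known position is O(1).

```python
files = ["tax_2020", "invoice_17", "contract_a", "resume"]

def find_file_linear(files, name):
    """O(n): look at each file until the wanted one is found."""
    for f in files:
        if f == name:
            return f
    return None

def get_file_at(files, index):
    """O(1): go straight to a known location, regardless of how many files exist."""
    return files[index]

print(find_file_linear(files, "contract_a"))  # scans up to n files
print(get_file_at(files, 2))                  # a single step
```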
O(n²) — Quadratic Time: Approach: pick the first file and check it against all the other files; then take file number 2 and repeat the same action until we reach the last file.
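A minimal sketch of a quadratic approach (an added example: checking every file against every other file for duplicates):

```python
def has_duplicate(files):
    """O(n^2): compare every file with every other file."""
    n = len(files)
    for i in range(n):                 # pick file i
        for j in range(i + 1, n):      # check it against each later file
            if files[i] == files[j]:
                return True
    return False

print(has_duplicate(["a", "b", "c", "a"]))  # True, after up to n*(n-1)/2 comparisons
```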

O(log n) — Logarithmic Time: Approach: assuming the files are sorted, we start in the middle, see which half our file should be in, go to the middle of that half, and repeat the steps until we find our file.
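The halving strategy described above is binary search; a minimal sketch, assuming the files are kept in sorted order:

```python
def binary_search(sorted_items, key):
    """O(log n): repeatedly halve the search range until the key is found or the range is empty."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2           # start in the middle
        if sorted_items[mid] == key:
            return mid
        elif sorted_items[mid] < key:  # the key must be in the upper half
            lo = mid + 1
        else:                          # the key must be in the lower half
            hi = mid - 1
    return -1

print(binary_search(["a", "c", "f", "k", "m", "z"], "k"))  # 3
```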

Time and space trade-offs:
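A time and space trade-off arises when an algorithm can be made to run faster by using more memory, or can be made to use less memory at the cost of more running time. As a small illustrative sketch (an added example, not taken from these notes), computing Fibonacci numbers by plain recursion uses only constant extra space but exponential time, while memoizing previously computed values reduces the time to linear at the cost of linear extra space.

```python
def fib_slow(n):
    """Exponential time, O(1) extra space (ignoring the call stack): recomputes subproblems."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n, memo=None):
    """O(n) time, O(n) extra space: trades memory for speed by caching results."""
    if memo is None:
        memo = {}
    if n < 2:
        return n
    if n not in memo:
        memo[n] = fib_fast(n - 1, memo) + fib_fast(n - 2, memo)
    return memo[n]

print(fib_slow(20), fib_fast(20))  # same answer, very different costs for large n
```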


Substitution method, Recursion tree method and Master theorem:

When we analyze recursive algorithms, we get a recurrence relation for time complexity: the running time on an input of size n is expressed as a function of n and of the running times on inputs of smaller sizes. For example, in Merge Sort, to sort a given array we divide it into two halves and recursively repeat the process for the two halves; finally, we merge the results. The time complexity of Merge Sort can be written as T(n) = 2T(n/2) + cn. There are many other such algorithms, like Binary Search, Tower of Hanoi, etc.
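A minimal Merge Sort sketch (added for illustration) shows where the recurrence comes from: the two recursive calls contribute the 2T(n/2) term and the linear-time merge contributes the cn term.

```python
def merge_sort(a):
    """T(n) = 2T(n/2) + cn: two recursive calls on halves plus a linear-time merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # T(n/2)
    right = merge_sort(a[mid:])     # T(n/2)
    # Merge the two sorted halves in linear time (the cn term).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```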
1) Substitution Method: We make a guess for the solution and then use mathematical induction to prove whether the guess is correct; if the induction fails, we refine the guess.
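For example (a standard worked sketch, not taken verbatim from these notes), for T(n) = 2T(n/2) + cn we can guess T(n) <= k·n·log n and check the guess by induction:

```latex
% Substitution method sketch for T(n) = 2T(n/2) + cn, guessing T(n) <= k n log n.
\[
T(n) = 2\,T(n/2) + cn
     \le 2\left(k\,\frac{n}{2}\log\frac{n}{2}\right) + cn
     = k\,n\log n - k\,n\log 2 + cn
     \le k\,n\log n \quad\text{whenever } k \ge c/\log 2,
\]
so the guess is confirmed and $T(n) = O(n \log n)$.
```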

2) Recursion Tree Method: In this method, we draw a recursion tree and calculate the time taken by every level of the tree. Finally, we sum the work done at all levels. To draw the recursion tree, we start from the given recurrence and keep expanding it until we find a pattern among the levels. The pattern is typically an arithmetic or geometric series.
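As a worked sketch (added for illustration), the recursion tree for T(n) = 2T(n/2) + cn does cn work on every level and has about log n levels, which sums to Θ(n log n):

```latex
% Recursion tree sketch for T(n) = 2T(n/2) + cn.
% Level i has 2^i subproblems of size n/2^i, each costing c(n/2^i), so every level costs cn.
\[
T(n) \;=\; \sum_{i=0}^{\log_2 n} 2^i \cdot c\,\frac{n}{2^i}
     \;=\; \sum_{i=0}^{\log_2 n} c\,n
     \;=\; c\,n\,(\log_2 n + 1)
     \;=\; \Theta(n \log n).
\]
```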
3) Master Method:
The Master Method is a direct way to get the solution. It works only for the following type of recurrences, or for recurrences that can be transformed into this type:
T(n) = aT(n/b) + f(n), where a >= 1 and b > 1
There are three cases:
1. If f(n) = Θ(n^c) where c < log_b(a), then T(n) = Θ(n^(log_b a))
2. If f(n) = Θ(n^c) where c = log_b(a), then T(n) = Θ(n^c log n)
3. If f(n) = Θ(n^c) where c > log_b(a), then T(n) = Θ(f(n))

How does this work?


The master method is mainly derived from the recursion tree method. If we draw the recursion tree of T(n) = aT(n/b) + f(n), we can see that the work done at the root is f(n) and the total work done at all the leaves is Θ(n^c) where c = log_b(a). The height of the recursion tree is log_b(n).
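As a small illustrative sketch (the helper name master_theorem is an assumption, not a standard library call), the three cases can be applied mechanically to recurrences of the form T(n) = aT(n/b) + Θ(n^c):

```python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) by comparing c with log_b(a)."""
    crit = math.log(a, b)                      # the critical exponent log_b(a)
    if math.isclose(c, crit):
        return f"Case 2: T(n) = Theta(n^{c} log n)"
    if c < crit:
        return f"Case 1: T(n) = Theta(n^{crit:.2f})"
    return f"Case 3: T(n) = Theta(n^{c})"

print(master_theorem(2, 2, 1))   # Merge Sort, T(n) = 2T(n/2) + cn: Case 2 -> Theta(n log n)
print(master_theorem(1, 2, 0))   # Binary Search, T(n) = T(n/2) + c: Case 2 -> Theta(log n)
print(master_theorem(8, 2, 2))   # Case 1 -> Theta(n^3)
```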
