Characteristics of an Algorithm
Not all procedures can be called an algorithm. An algorithm should have the following
characteristics −
Unambiguous − An algorithm should be clear and unambiguous. Each of its steps (or
phases) and its inputs/outputs must be clearly defined and lead to only one interpretation.
Input − An algorithm should have 0 or more well-defined inputs.
Output − An algorithm should have 1 or more well-defined outputs, which should match
the desired output.
Finiteness − Algorithms must terminate after a finite number of steps.
Feasibility − An algorithm should be feasible with the available resources.
Independent − An algorithm should have step-by-step directions, which should be
independent of any programming code.
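As a concrete illustration of these characteristics, here is a minimal sketch of Euclid's GCD algorithm (my own example, not from the text above):

```python
def gcd(a, b):
    """Euclid's algorithm, annotated with the characteristics above."""
    # Input: two well-defined non-negative integers (not both zero).
    while b != 0:        # Unambiguous: each step has exactly one meaning.
        a, b = b, a % b  # Finiteness: b strictly decreases, so the loop terminates.
    return a             # Output: exactly one well-defined result.
```

Each step is independent of any particular language: the same directions could be carried out by hand or in any programming code.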
Analysis of Algorithms
In theoretical analysis of algorithms, it is common to estimate their complexity in the
asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. The
term "analysis of algorithms" was coined by Donald Knuth.
Algorithm analysis is an important part of computational complexity theory, which
provides theoretical estimation for the required resources of an algorithm to solve a
specific computational problem. Most algorithms are designed to work with inputs of
arbitrary length. Analysis of algorithms is the determination of the amount of time and
space resources required to execute an algorithm.
Usually, the efficiency or running time of an algorithm is stated as a function relating the
input length to the number of steps, known as time complexity, or to the volume of memory
used, known as space complexity.
Analysis of an algorithm is the process of evaluating its problem-solving capability in
terms of the time and space required (the size of memory used during execution).
However, the main concern of algorithm analysis is the required time, or
performance. Generally, we perform the following types of analysis −
Worst-case − The maximum number of steps taken on any instance of size n.
Best-case − The minimum number of steps taken on any instance of size n.
Average-case − The average number of steps taken over all instances of size n.
Amortized − The cost of a sequence of operations applied to an input of size n, averaged over the sequence.
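These three cases can be seen in something as simple as linear search. The sketch below uses a hypothetical step-counting helper (not a library function) to make the best and worst cases concrete:

```python
def linear_search_steps(arr, target):
    """Count the comparisons linear search performs before stopping.
    A hypothetical helper for illustration only."""
    steps = 0
    for x in arr:
        steps += 1
        if x == target:
            return steps  # found: stop early
    return steps          # not found: scanned all n elements

arr = [3, 7, 1, 9, 5]
# Best case: target is the first element -> 1 step.
# Worst case: target is last or absent -> n steps.
# Average case (target present, uniformly placed) -> about n/2 steps.
```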
Asymptotic analysis of complexity bounds − best, average, and worst-case behavior:
By measuring the performance of an algorithm, we can determine which of two algorithms
is better. The performance of an algorithm is usually expressed in Big O notation.
Suppose we are searching for a file in a filing cabinet:
O(n) — Linear Time: We have to look at each and every file in the
cabinet until we find ours.
O(1) — Constant Time: Open the cabinet and pick the file from its known location. End of
story.
O(n²) — Quadratic Time: Pick the first file and check it against all the other files. Then take file
number 2 and repeat the same comparison, and so on until you get to the last file.
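The three cabinet strategies above can be sketched directly in code (my own illustrative functions, assuming the cabinet is a list of file names, or a dict for the constant-time case):

```python
# O(1): direct lookup by known location (a dict keyed by file name).
def find_constant(cabinet, name):
    return cabinet[name]

# O(n): scan every file until the one we want turns up.
def find_linear(files, name):
    for f in files:
        if f == name:
            return f
    return None

# O(n^2): compare every file against every other file,
# e.g. to find duplicates in the cabinet.
def find_duplicates(files):
    dups = []
    for i in range(len(files)):
        for j in range(i + 1, len(files)):
            if files[i] == files[j]:
                dups.append(files[i])
    return dups
```

Note that the quadratic version performs roughly n²/2 comparisons, which is still Θ(n²) since constant factors are dropped in Big O notation.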
When we analyze recursive algorithms, we get a recurrence relation for their time complexity:
the running time on an input of size n is expressed as a function of n and the running time on
inputs of smaller sizes. For example, in Merge Sort, to sort a given array we divide it into two
halves, recursively sort each half, and finally merge the results. The time complexity of Merge Sort
can be written as T(n) = 2T(n/2) + cn. Many other algorithms, such as Binary Search and Tower of
Hanoi, give rise to similar recurrences.
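The Merge Sort structure behind the recurrence T(n) = 2T(n/2) + cn can be sketched as follows, with each term of the recurrence marked in comments:

```python
def merge_sort(a):
    """Divide, recurse on both halves, merge: T(n) = 2T(n/2) + cn."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # T(n/2)
    right = merge_sort(a[mid:])   # T(n/2)
    # Merge step: linear in len(a) -- the cn term of the recurrence.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```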
1) Substitution Method: We make a guess for the solution and then use mathematical
induction to prove that the guess is correct.
2) Recurrence Tree Method: In this method, we draw a recurrence tree and calculate the time
taken by every level of the tree. Finally, we sum the work done at all levels. To draw the recurrence
tree, we start from the given recurrence and keep expanding until we find a pattern among levels.
The pattern is typically an arithmetic or geometric series.
3) Master Method:
The Master Method is a direct way to get the solution. It works only for recurrences of the
following form, or for recurrences that can be transformed into it:
T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1
There are three cases:
1. If f(n) = Θ(n^c) where c < log_b a, then T(n) = Θ(n^(log_b a))
2. If f(n) = Θ(n^c) where c = log_b a, then T(n) = Θ(n^c log n)
3. If f(n) = Θ(n^c) where c > log_b a, then T(n) = Θ(f(n))
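The three cases can be mechanized in a small classifier. This is a hypothetical helper of my own for illustration; it compares c against the critical exponent log_b a and returns a Θ-bound as a string:

```python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) by the three cases above.
    Illustrative helper only; assumes f(n) is a plain polynomial n^c."""
    crit = math.log(a, b)       # log_b(a), the critical exponent
    if abs(c - crit) < 1e-9:    # case 2: f(n) matches n^(log_b a)
        return f"Θ(n^{c} log n)" if c else "Θ(log n)"
    if c < crit:                # case 1: the recursion dominates
        return f"Θ(n^{round(crit, 2)})"
    return f"Θ(n^{c})"          # case 3: f(n) dominates

# Merge Sort:    T(n) = 2T(n/2) + Θ(n)  -> case 2 -> Θ(n log n)
# Binary Search: T(n) =  T(n/2) + Θ(1)  -> case 2 -> Θ(log n)
```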