
Advanced Analysis of Algorithms

Lecture # 2
MS (Artificial Intelligence)
Semester 1
Advanced Analysis of Algorithms

The ultimate goal of algorithm design is to complete tasks efficiently.

What does "efficiently" mean?

Efficiency depends upon running time.


Running Time of an Algorithm
What affects the running time of an algorithm?
• The computer used / hardware platform (single processor vs. multiprocessor, computer generation, 32-bit vs. 64-bit, etc.)
• Memory read/write speed and access type (sequential, random, indexed)
• Compiler / Interpreter efficiency.
• Programming Skills
• Complexity of underlying algorithm
• Size of the input

The most important factors are:

• Complexity of underlying algorithm


• Size of the input
Complexity of Algorithm
• Complexity of an algorithm is a measure of the
amount of time and/or space required by an
algorithm for an input of a given size (n).

• Analysis of an algorithm depends on

– How good is the algorithm?


• Correctness
• Time Efficiency
• Space Efficiency

– Does there exist a better algorithm?


• Lower bounds
• Optimality
• Algorithm Design Techniques

• Brute force
• Divide and conquer
• Decrease and conquer
• Transform and conquer
• Greedy approach
• Dynamic programming
• Backtracking and branch-and-bound
• Space and time tradeoffs
Time Complexity
Asymptotic Notations Properties

• Categorize algorithms based on asymptotic growth rate


– e.g. linear, quadratic, exponential

• Ignore small constants and small inputs

• Estimate upper and lower bounds on the growth rate of the time complexity function

• Describe the running time of an algorithm as n grows to ∞.

Limitations
• Not always useful for analysis of fixed-size inputs.
• All results hold only for sufficiently large inputs.
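As a minimal sketch (not from the slides), the growth classes named above — linear, quadratic, exponential — separate quickly even for small n, which is why asymptotic analysis ignores constants:

```python
# Tabulate the growth classes mentioned above: linear, quadratic,
# and exponential, for a few sample input sizes.
def growth_table(ns):
    rows = []
    for n in ns:
        rows.append((n, n, n * n, 2 ** n))
    return rows

for n, lin, quad, exp in growth_table([1, 5, 10, 20]):
    print(f"n={n:2d}  linear={lin:2d}  quadratic={quad:4d}  exponential={exp}")
```

Even at n = 20, the exponential column (over a million) dwarfs the quadratic one (400), illustrating why the growth class dominates any constant factor.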
Time Complexity of Algorithm
• Rate of Growth
– Rate of growth is the rate at which running time
increases as a function of input size.
• Lower-Order Terms
– When approximating the rate of growth of a
function, we tend to drop the lower-order terms,
as they are less significant than the higher-order
terms.
e.g., f(n) = n² + n
The lower-order term n is dropped, giving
O(n²)

– Here, what is O (Big-Oh)?
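A quick numerical sketch (an illustration, not from the slides) of why the lower-order term in f(n) = n² + n can be dropped: the ratio f(n) / n² approaches 1 as n grows.

```python
# For f(n) = n^2 + n, the lower-order term n contributes
# less and less as n grows, so f(n) behaves like n^2.
def f(n):
    return n * n + n

for n in (10, 100, 1000, 10000):
    ratio = f(n) / (n * n)
    print(n, ratio)  # ratio approaches 1 as n grows
```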


Time Complexity of Algorithm
• We can analyze algorithms in one of three ways
– Worst Case
• Longest time
• Represented by O (Big – Oh)
– Best Case
• Least time
• Represented by Ω (Big – Omega)
– Average Case
• Average time
• Represented by Θ (Big – Theta)

• These are known as Asymptotic Notations


Asymptotic notation of an algorithm is a
mathematical representation of its complexity
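The worst/best-case distinction above can be made concrete with linear search (a minimal sketch, assuming a simple comparison-counting implementation): the best case finds the target in the first position, while the worst case scans the entire input.

```python
# Linear search, counting comparisons to show best case
# (target first: 1 comparison) vs. worst case (target
# absent: n comparisons).
def linear_search(items, target):
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))  # best case: found at index 0, 1 comparison
print(linear_search(data, 4))  # worst case: not found, 5 comparisons
```

Here the best case is Ω(1) and the worst case is O(n), matching the notations introduced above.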
Time Complexity of Algorithm
• Big-Oh Notation (O)
Big-Oh notation is used to define the upper
bound of an algorithm in terms of time
complexity. It describes the worst case.
• Big-Omega Notation (Ω)
Big-Omega notation is used to define the
lower bound of an algorithm in terms of time
complexity. It describes the best case.
• Big-Theta Notation (Θ)
Big-Theta notation is used to define a tight
(average) bound of an algorithm in terms of time
complexity.
Time Complexity of Algorithm

The idea is to establish a relative order among
functions for large n:

f(n) = O(g(n)) means f(n) grows no faster
than g(n) for "large" n.
Time Complexity of Algorithm
• Asymptotic notation: Big-Oh, f(n) = O(g(n))

There are positive constants c and n0 such that
f(n) ≤ c · g(n) whenever n ≥ n0.

The growth rate of f(n) is less than or equal to the growth rate of g(n),
so g(n) is called an upper bound on f(n).
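The definition above can be checked numerically for the earlier example f(n) = n² + n. As a sketch (the witness constants c = 2 and n0 = 1 are chosen here for illustration), since n ≤ n² for all n ≥ 1, f(n) ≤ 2n² holds:

```python
# Verify the Big-Oh definition numerically: f(n) = n^2 + n is O(n^2)
# with witnesses c = 2 and n0 = 1, because n <= n^2 for n >= 1.
def f(n):
    return n * n + n

def g(n):
    return n * n

c, n0 = 2, 1
assert all(f(n) <= c * g(n) for n in range(n0, 1000))
print("f(n) <= 2*g(n) holds for all 1 <= n < 1000")
```

A finite check like this is not a proof, but it illustrates how the constants c and n0 act as witnesses in the definition.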
Thanks for Listening
