
DSAL-210 LECT 1: Understanding Asymptotic Notation

By
W.G. Gondwe
Lecturer, Computer Networks
CS&IT Department
Malawi University of Science & Technology
Overview
• Introduction
• Mathematical Functions & Growth Rates
• Defining Asymptotic Behaviour
• Useful Asymptotic Notations
1. Big-Oh
2. Big-Omega
3. Big-Theta
• Practice Exercises
Introduction
In this lecture, we take a detour and visit some mathematical concepts that
are required to understand later parts of the course. We will introduce the
asymptotic notation that will be used later to classify algorithm complexity.
This is the first of a 2-part series of lectures focusing on mathematical
concepts. The second one will revisit some selected topics from discrete
mathematics.
Background
• Recall: Analysis of algorithms aims to determine computational complexity
in terms of space and time
• In this module, we focus on time complexity of algorithms
• The aim is to learn how to design/choose, analyze, compare and reason
about algorithms
• How can we analyze, compare and reason about algorithms?
• Based on their complexity; hence,
• We can benchmark, i.e. run algorithms on the same computer and compare running
times (see the sketch after this list)
• However, benchmarking is a naïve approach, since actual running times depend on
the prevailing computing environment
• We need a better, more objective way of comparing complexities
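To illustrate the point about benchmarking, here is a minimal Python sketch (my addition, not part of the original slides). The wall-clock timings it prints depend entirely on the machine, interpreter and current system load, which is exactly why raw benchmarking is an unreliable way to compare algorithms:

```python
import time

def linear_sum(n):
    """O(n): sum 0..n-1 with an explicit loop."""
    total = 0
    for i in range(n):
        total += i
    return total

def constant_sum(n):
    """O(1): the same result via the closed-form formula n(n-1)/2."""
    return n * (n - 1) // 2

# Timings vary run to run and machine to machine -- the naive approach.
for fn in (linear_sum, constant_sum):
    start = time.perf_counter()
    fn(1_000_000)
    elapsed = time.perf_counter() - start
    print(f"{fn.__name__}: {elapsed:.6f} seconds")
```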
Background
• The recommended way is to use functions modelled from
algorithms and compare their asymptotic behaviours
• The asymptotic approach allows us to model the growth rate of an
algorithm as a function of input size (N) and inspect its limiting
(asymptotic) behaviour (this will become clear as we progress)
• With such an approach, we can objectively compare algorithm
complexities and be able to effectively choose the right one for
a given situation
Mathematical Functions and Growth Rates
• Before proceeding, we need to revisit mathematical functions
and understand their growth rates
• In basic terms, a function is a mapping from one set to another
set
• A function takes one or more (independent) variables and
produces a result (dependent variable) e.g. f(x) = x + 2
• Functions come in different types. Examples are linear,
quadratic, cubic, exponential, logarithmic, etc.
• The growth rate of a function is how fast it grows with increased
input values (determined by the highest order term in the
function)
Math functions – growth rates
• For instance, a linear function such as
f(x) = 2x + 1 has a lower growth rate
than a quadratic function such as f(x) =
x² + 5
• Generally speaking, the larger the power
of the highest order term, the higher the
growth rate
• On the right is a graphical visualization of
different function growth rates. The
steeper the function, the higher its growth
rate
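Since the growth-rate chart does not reproduce here, a small tabulation sketch (my addition, not from the slides) makes the same comparison numerically; the faster-growing columns correspond to the steeper curves:

```python
import math

# Tabulate common growth rates side by side for increasing n.
print(f"{'n':>6} {'log2 n':>8} {'n':>8} {'n log2 n':>10} {'n^2':>8} {'2^n':>12}")
for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>6} {math.log2(n):>8.1f} {n:>8} {n * math.log2(n):>10.1f} "
          f"{n**2:>8} {2**n:>12}")
```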
Math functions - growth rates
• It is also important to note that for functions of the same order or
same growth rate, the effect of lower order terms becomes less
important as the independent variable approaches infinity
• For instance, given two functions f(x) = 2x² + 5 and g(x)
= 3x² + 10x + 1, the effects of (+5) and (+10x + 1)
dwindle as the value of x approaches infinity
• This is because for large values of x, the growth rate of each
function depends largely on its highest order term, i.e. 2x²
and 3x²
• Mathematically, f(x) and g(x) are said to exhibit the same
asymptotic behaviour, i.e. they grow similarly as x grows infinitely
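A short added sketch (not from the slides) shows the lower order terms dwindling: each function's ratio to its highest order term approaches 1 as x grows:

```python
# As x grows, f(x) = 2x^2 + 5 is dominated by 2x^2 and
# g(x) = 3x^2 + 10x + 1 by 3x^2: both ratios approach 1,
# so the lower order terms (+5 and +10x + 1) stop mattering.
for x in (10, 100, 1_000, 10_000):
    f = 2 * x**2 + 5
    g = 3 * x**2 + 10 * x + 1
    print(f"x={x:>6}  f/(2x^2)={f / (2 * x**2):.5f}  g/(3x^2)={g / (3 * x**2):.5f}")
```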
Defining Asymptotic Behaviour
• In mathematics, asymptotic behaviour describes the limiting properties of a
function i.e. its behaviour as it approaches infinity
• In other words, asymptotic analysis examines upper or lower bounds of
functions as input values grow infinitely large
• Generally, a function whose highest order is k is observed to be limited by
a constant multiple of a function of the same order
• For instance, take the cubic function f(x) = x³ + 2x² - x (where order
k = 3). We can show that it never exceeds cx³ for some given constant c,
for infinitely large values of x
• In some cases, it is also possible to show that f(x) is always larger than
a constant multiple of x³ (lower bound)
• We will examine this behaviour graphically in the next slide
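Before the graphical view, a quick numeric spot check (my addition, not from the slides) of the upper-bound claim for the cubic above: for x ≥ 1 we have 2x² ≤ 2x³ and -x ≤ 0, so f(x) ≤ 3x³, i.e. c = 3 works:

```python
# Spot-check x^3 + 2x^2 - x <= 3*x^3 for x >= 1
# (valid because 2x^2 <= 2x^3 once x >= 1, and -x <= 0).
c = 3
for x in range(1, 1001):
    f = x**3 + 2 * x**2 - x
    assert f <= c * x**3, f"bound fails at x={x}"
print("x^3 + 2x^2 - x <= 3x^3 holds for all tested x >= 1")
```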
Asymptotic behaviour cont...
Asymptotic analysis for a logarithmic
function (upper bound)
• The diagram shows the upper bound (aka
asymptote) for a logarithmic function f(x)
• In the following slide, we will introduce
formal asymptotic notation for expressing
upper, lower and tight bounds
• Eventually, we will use this notation to
specify algorithm complexities
Useful Asymptotic Notation
• In this course, we will introduce three asymptotic notations for expressing
function limiting behaviour
• These are:
1. Big-Oh or Ο(f(n)) for upper bounds
2. Big-Omega or Ω(f(n)) for lower bounds
3. Big-Theta or Θ(f(n)) for both upper and lower bounds (tight bound)

(Oh, Omega and Theta are mathematical symbols borrowed from the Greek alphabet)
• We will now introduce each notation and explain its mathematical meaning
Big-Oh - Upper asymptotic bound
• A function f(n) is said to be Big-Oh of another function g(n) if and only if
f(n) is eventually upper-bounded by g(n)
• Mathematically:
f(n) = O(g(n)) if and only if there exists a constant c and a
positive value n₀, such that f(n) ≤ c·g(n) for all n > n₀

• More formally:

f(n) = O(g(n)) iff ∃ c ∈ ℝ⁺, n₀ ∈ ℕ such that f(n) ≤ c·g(n) for all n > n₀
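To make the definition tangible, here is a minimal Python spot check (my addition; the witness values c = 8 and n₀ = 1 are one valid choice, not the only one) for f(n) = 2n² + n + 5 against g(n) = n²:

```python
def holds_big_oh(f, g, c, n0, n_max=10_000):
    """Numerically spot-check f(n) <= c*g(n) for n0 < n <= n_max.
    A passing check is evidence, not a proof (which must cover all n > n0)."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 2 * n**2 + n + 5
g = lambda n: n**2

print(holds_big_oh(f, g, c=8, n0=1))  # True: 2n^2 + n + 5 <= 8n^2 for n > 1
print(holds_big_oh(f, g, c=2, n0=1))  # False: 2n^2 + n + 5 always exceeds 2n^2
```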
Big-Oh cont...
The diagram below shows that f(n) is eventually
upper-bounded by a constant multiple of g(n)

• Diagrammatically, c·g(n)
exceeds f(n) at the point
where n is n₀
• Thereafter, c·g(n) is always
larger than f(n), hence it forms
the upper bound
• This is the definition of Big-Oh
Big-Oh cont…
• Generally, we can make the following claim:
Given any polynomial f(n) = aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₁n + a₀
(where the aᵢ are constants, 0 ≤ i ≤ k), then f(n) = O(nᵏ)

Proof:
Let's choose n₀ = 1. Then:
f(n) = aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₁n + a₀
≤ |aₖnᵏ + aₖ₋₁nᵏ⁻¹ + … + a₁n + a₀| (absolute value)
≤ |aₖ|nᵏ + |aₖ₋₁|nᵏ + … + |a₁|nᵏ + |a₀|nᵏ (triangle inequality; replace each power with the largest, nᵏ)
= (|aₖ| + |aₖ₋₁| + … + |a₁| + |a₀|)nᵏ
If we set c = |aₖ| + |aₖ₋₁| + … + |a₁| + |a₀|,

then f(n) ≤ c·nᵏ for all n ≥ 1

Proven! (General strategy: find an n₀ and a c such that f(n) ≤ c·g(n))
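To make the proof concrete, here is a small Python spot check (my addition, not from the slides) using one of the polynomials from the next slide, f(n) = 3n⁴ + 2n - 5, with c chosen as the sum of the absolute coefficients:

```python
# Coefficients of f(n) = 3n^4 + 0n^3 + 0n^2 + 2n - 5, highest power first.
coeffs = [3, 0, 0, 2, -5]
k = len(coeffs) - 1
c = sum(abs(a) for a in coeffs)  # c = 10, exactly as in the proof

def f(n):
    return sum(a * n**(k - i) for i, a in enumerate(coeffs))

# The proof guarantees f(n) <= c * n^k for all n >= n0 = 1.
assert all(f(n) <= c * n**k for n in range(1, 10_001))
print(f"f(n) <= {c}*n^{k} holds for all tested n >= 1")
```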
Big-Oh cont...
• From the previous claim, we can generally say the following:
f(n) = 2n² + n + 5 is O(n²)
f(n) = n³ + 2n² + 1 is O(n³)
f(n) = 3n⁴ + 2n - 5 is O(n⁴)
etc.
-- Note that this only applies to polynomials (i.e. linear combinations of
constants and variables)
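As a small added illustration (a hypothetical helper, not from the slides), classifying a polynomial's Big-Oh reduces to finding its highest order term:

```python
def big_oh_of_polynomial(coeffs):
    """Given polynomial coefficients ordered highest power first,
    return the Big-Oh class determined by the leading non-zero term."""
    k = len(coeffs) - 1
    while k > 0 and coeffs[len(coeffs) - 1 - k] == 0:
        k -= 1  # skip leading zero coefficients
    return "O(1)" if k == 0 else "O(n)" if k == 1 else f"O(n^{k})"

print(big_oh_of_polynomial([2, 1, 5]))         # 2n^2 + n + 5   -> O(n^2)
print(big_oh_of_polynomial([1, 2, 0, 1]))      # n^3 + 2n^2 + 1 -> O(n^3)
print(big_oh_of_polynomial([3, 0, 0, 2, -5]))  # 3n^4 + 2n - 5  -> O(n^4)
```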
Big-Omega - lower asymptotic bound
• Big-Omega (Ω) is used to express the lower bound for a function
(the opposite of Big-Oh)
• A function f(n) is said to be Big-Omega of another function g(n) if
and only if f(n) is eventually lower-bounded by g(n)
• Mathematically:
f(n) = Ω(g(n)) if and only if there exists a constant c and a
positive value n₀, such that f(n) ≥ c·g(n) for all n > n₀

• More formally:

f(n) = Ω(g(n)) iff ∃ c ∈ ℝ⁺, n₀ ∈ ℕ such that f(n) ≥ c·g(n) for all n > n₀
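Mirroring the Big-Oh sketch above, here is a spot check for a lower bound (my addition, with an illustrative example chosen so as not to give away the exercises):

```python
def holds_big_omega(f, g, c, n0, n_max=10_000):
    """Numerically spot-check f(n) >= c*g(n) for n0 < n <= n_max (not a proof)."""
    return all(f(n) >= c * g(n) for n in range(n0 + 1, n_max + 1))

# n^2 + 3n >= 1*n^2 for all n >= 1, so n^2 + 3n is Omega(n^2).
print(holds_big_omega(lambda n: n**2 + 3 * n, lambda n: n**2, c=1, n0=1))  # True
```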
Big-Omega cont...
The diagram below shows that f(n) is eventually
lower-bounded by a constant multiple of g(n)

• Diagrammatically, f(n) exceeds c·g(n)
at the point where n is n₀
• Thereafter, f(n) is always larger
than c·g(n), hence g(n) forms the
lower bound for f(n)
• This is the definition of Big-Omega
• Note that we can make the same
conclusion about polynomials and
lower bounds (as we did with Big-Oh)
Big-Theta - Tight asymptotic bound
• Finally, let's look at a notation that expresses both an upper and a lower bound
• Big-Theta is used to show that a function is both upper- and lower-bounded by
another function
• A function f(n) is said to be Big-Theta of another function g(n) if and only if f(n) is
eventually sandwiched between constant multiples of g(n)
• Mathematically:
f(n) = Θ(g(n)) if and only if there exist two constants c₁ and c₂, and a
positive value n₀, such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n > n₀
• More formally:

f(n) = Θ(g(n)) iff ∃ c₁, c₂ ∈ ℝ⁺, n₀ ∈ ℕ such that

c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n > n₀
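And a sandwich check in the same style (my addition; c₁ = 2, c₂ = 8, n₀ = 1 are one workable choice for this f and g, not the only one):

```python
def holds_big_theta(f, g, c1, c2, n0, n_max=10_000):
    """Numerically spot-check c1*g(n) <= f(n) <= c2*g(n) for n0 < n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 2 * n**2 + n + 5
g = lambda n: n**2
print(holds_big_theta(f, g, c1=2, c2=8, n0=1))  # True: f(n) = Theta(n^2)
```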
Big-Theta cont...
The diagram below shows that f(n) is eventually
sandwiched between constant multiples of g(n)

• Diagrammatically, f(n) exceeds
c₁·g(n) and is less than c₂·g(n) at
the point where n is n₀
• Thereafter, f(n) is always
sandwiched between c₁·g(n) and
c₂·g(n)
• This is the definition of Big-Theta
Notations conclusion
• Going forward, we will learn how to use these notations to reason about
algorithm complexity, and why we analyze algorithms by asymptotic means
• The next two slides present exercises that you should attempt
before the next lecture
Practice Exercises
Brain Teaser!
Exercises cont…
1. Given f(n) = 2n + 5, show that f(n) is O(2n). Hint: find two
constants c and n₀ such that 2n + 5 ≤ c·2n, for all n ≥ n₀

2. Show that 5n² is Ω(n)
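If you want to sanity-check your candidate constants numerically before the next lecture, here is a small helper sketch (my addition; the function name and defaults are hypothetical). Remember that passing the check is evidence, not a proof:

```python
def check_bound(f, g, c, n0, kind="O", n_max=10_000):
    """Spot-check a claimed witness pair (c, n0):
    kind="O" tests f(n) <= c*g(n) for n0 <= n <= n_max;
    kind="Omega" tests f(n) >= c*g(n) over the same range."""
    for n in range(n0, n_max + 1):
        ok = f(n) <= c * g(n) if kind == "O" else f(n) >= c * g(n)
        if not ok:
            return False
    return True

# Example usage -- plug in your own candidate c and n0 for Exercise 1:
# print(check_bound(lambda n: 2*n + 5, lambda n: 2*n, c=..., n0=..., kind="O"))
```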
