
Algorithms: An Introduction

‘Algorithm’ is a distortion of Al-Khawarizmi, a Persian mathematician

Section 3.1 of Rosen


Outline
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Change

CSCE 235 Algorithms: An Introduction 2


Computer Science is About Problem Solving
• A Problem is specified by
1. The givens (a formulation)
• A set of objects
• Relations between them
2. The query
• The information one wants to extract from the formulation, the question to answer

Real World → Computing World
– Objects are represented by… data structures, ADTs, classes
– Relations are implemented with… relations & functions (e.g., predicates)
– Actions are implemented with… algorithms: a sequence of instructions

• An algorithm is a method or procedure that solves instances of a problem

CSCE 235 Algorithms: An Introduction 3


Algorithms: Formal Definition
• Definition: An algorithm is a sequence of unambiguous
instructions for solving a problem.
• Properties of an algorithm
– Finite: the algorithm must eventually terminate
– Complete: Always give a solution when one exists
– Correct (sound): Always give a correct solution
• For an algorithm to be an acceptable solution to a problem, it
must also be effective. That is, it must give a solution in a
‘reasonable’ amount of time
• Efficient = runs in polynomial time. Thus, efficient ⇒ effective
• There can be many algorithms to solve the same problem

CSCE 235 Algorithms: An Introduction 4


Outline
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Change

CSCE 235 Algorithms: An Introduction 5


Algorithms: General Techniques
• There are many broad categories of algorithms
– Deterministic versus Randomized (e.g., Monte Carlo)
– Exact versus Approximation
– Sequential/serial versus Parallel, etc.
• Some general styles of algorithms include
– Brute force (enumerative techniques, exhaustive search)
– Divide & Conquer
– Transform & Conquer (reformulation)
– Greedy Techniques

CSCE 235 Algorithms: An Introduction 6


Outline
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Change

CSCE 235 Algorithms: An Introduction 7


Good Pseudo-Code: Example
Intersection
Input: Two finite sets A, B
Output: A finite set C such that C = A ∩ B
1. C ← ∅
2. If |A| > |B| Then Swap(A,B)
3. For every x ∈ A Do
4.   If x ∈ B Then C ← C ∪ {x}   (Union(C,{x}))
5. End
6. Return C
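To make the pseudo-code concrete, here is a minimal Python sketch of the same procedure; the function name, the use of Python sets, and the example call are illustrative choices, not part of the slides.

```python
def intersection(A, B):
    """Return C = A intersect B, mirroring the Intersection pseudo-code above."""
    A, B = set(A), set(B)
    C = set()                       # 1. C <- empty set
    if len(A) > len(B):             # 2. iterate over the smaller set
        A, B = B, A
    for x in A:                     # 3. for every x in A
        if x in B:                  # 4. if x is also in B, C <- C U {x}
            C = C | {x}
    return C                        # 6. return C

# Example: intersection({1, 2, 3, 4}, {3, 4, 5}) == {3, 4}
```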

CSCE 235 Algorithms: An Introduction 8


Algorithms: Pseudo-Code
• Algorithms are usually presented using pseudo-code
• Bad pseudo-code
– Gives too many details or
– Is too implementation specific (i.e., actual C++ or Java code or giving
every step of a sub-process such as set union)
• Good pseudo-code
– Is a balance between clarity and detail
– Abstracts the algorithm
– Makes good use of mathematical notation
– Is easy to read and
– Facilitates implementation (reproducible, does not hide away
important information)

CSCE 235 Algorithms: An Introduction 9


Writing Pseudo-Code: Advice
• Input/output must be properly defined
• All your variables must be properly initialized and introduced
• Variables are instantiated and assigned using ←
• All ‘commands’ (While, If, Repeat, Begin, End) in boldface (\bf): For i ← 1 to n Do
• All functions in small caps: Union(s,t) (\sc)
• All constants in courier: pi ← 3.14 (\tt)
• All variables in italics: temperature ← 78 (\mathit{})
• LaTeX: Several algorithm-formatting packages exist on the WWW

CSCE 235 Algorithms: An Introduction 10


Outline
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Change

CSCE 235 Algorithms: An Introduction 11


Designing an Algorithm
• A general approach to designing algorithms is as follows
– Understand the problem, assess its difficulty
– Choose an approach (e.g., exact/approximate, deterministic/probabilistic)
– (Choose appropriate data structures)
– Choose a strategy
– Prove
1. Termination
2. Completeness
3. Correctness/soundness
– Evaluate the complexity
– Implement and test it
– Compare to other known approaches and algorithms

CSCE 235 Algorithms: An Introduction 12


Algorithm Example: Max
• When designing an algorithm, we usually give a
formal statement about the problem to solve
• Problem
– Given: a set A={a1,a2,…,an} of integers
– Question: find the index i of the maximum integer ai
• A straightforward idea is
– Simply store an initial maximum, say a1
– Compare the stored maximum to every other integer in A
– Update the stored maximum if a new maximum is ever
encountered
CSCE 235 Algorithms: An Introduction 13
Pseudo-code of Max
Max
Input: A finite set A={a1,a2,…,an} of integers
Output: The largest element in the set
1. temp ← a1
2. For i ← 2 to n Do
3.   If ai > temp
4.   Then temp ← ai
5.   End
6. End
7. Return temp
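For illustration, a direct Python rendering of Max (the variable and function names are mine, not the slide’s):

```python
def find_max(A):
    """Return the largest element of a non-empty list A."""
    temp = A[0]                  # 1. temp <- a1
    for i in range(1, len(A)):   # 2. for i <- 2 to n
        if A[i] > temp:          # 3. if ai > temp
            temp = A[i]          # 4. then temp <- ai
    return temp                  # 7. return temp

# Example: find_max([3, 9, 2, 7]) == 9
```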

CSCE 235 Algorithms: An Introduction 14


Algorithms: Other Examples
• Check Bubble Sort and Insertion Sort in your
textbooks
• … which you should have seen ad nauseam in
CSE 155 and CSE 156
• And which you will see again in CSE 310
• Let us know if you have any questions

CSCE 235 Algorithms: An Introduction 15


Outline
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Change

CSCE 235 Algorithms: An Introduction 16


Greedy Algorithms
• In many problems, we wish to not only find a solution, but to
find the best or optimal solution
• A simple technique that works for some optimization
problems is called the greedy technique
• As the name suggests, we solve a problem by being greedy
– Choose what appears now to be the best choice
– Choose the most immediate best solution (i.e., think locally)
• Greedy algorithms
– Work well on some (simple) problems
– Usually they are not guaranteed to produce the best globally optimal
solution

CSCE 235 Algorithms: An Introduction 17


Change-Making Problem
• We want to give change to a customer but we want to minimize the total number of coins we give them
• Problem
– Given: An integer n and a set of coin denominations (c1,c2,…,cr) with c1 > c2 > … > cr
– Query: Find a set of coins d1,d2,…,dr such that Σ_{i=1}^{r} di·ci = n and the total number of coins Σ_{i=1}^{r} di is minimized

CSCE 235 Algorithms: An Introduction 18


Greedy Algorithm: Change

Change
Input: An integer n and a set of coin denominations {c1,c2,…,cr}
with c1 > c2> … >cr
Output: A set of coins d1,d2,…,dr such that Σ_{i=1}^{r} di·ci = n and Σ_{i=1}^{r} di is minimized
1. For i ← 1 to r Do
2.   di ← 0
3.   While n ≥ ci Do
4.     di ← di + 1
5.     n ← n − ci
6.   End
7. End
8. Return {d1,d2,…,dr}
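A minimal Python sketch of the greedy Change algorithm above; the function name and the list-based return format are assumptions made for illustration.

```python
def greedy_change(n, denominations):
    """Make change for n greedily, with denominations sorted c1 > c2 > ... > cr.

    Returns a list d where d[i] is the number of coins of denominations[i] used.
    """
    d = []
    for c in denominations:      # 1. for i <- 1 to r
        di = 0                   # 2. di <- 0
        while n >= c:            # 3. while n >= ci
            di += 1              # 4. di <- di + 1
            n -= c               # 5. n <- n - ci
        d.append(di)
    return d                     # 8. return d1, d2, ..., dr

# Example (US coins): greedy_change(67, [25, 10, 5, 1]) == [2, 1, 1, 2]
```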

CSCE 235 Algorithms: An Introduction 19


Change: Analysis (1)
• Will the algorithm always produce an optimal
answer?
• Example
– Consider a coinage system where c1=20, c2=15, c3=7, c4=1
– We want to give 22 ‘cents’ in change
• What is the output of the algorithm?
• Is it optimal?
• It is not optimal: the algorithm returns one c1 and two c4, i.e., 20+1+1 (3 coins), whereas the optimal change is one c2 and one c3, i.e., 15+7 (2 coins)
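For concreteness, a tiny self-contained Python check of this example (the helper below is a compressed version of the greedy sketch from the previous slide, not code from the slides themselves):

```python
def greedy_change(n, denominations):
    """Greedy change using integer division; equivalent to the while-loop version."""
    counts = []
    for c in denominations:
        counts.append(n // c)   # take as many coins of value c as possible
        n %= c
    return counts

print(greedy_change(22, [20, 15, 7, 1]))   # -> [1, 0, 0, 2]: 20 + 1 + 1, three coins
# The optimal change is one 15 and one 7: only two coins.
```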
CSCE 235 Algorithms: An Introduction 20
Optimality of Change (1)
• How about the US currency: c1=25, c2=10, c3=5, c4=1. Is the algorithm correct in this case?
• Yes, in fact it is. We prove it by contradiction and need the following lemma
• Lemma: If n is a positive integer, then n cents in change using quarters, dimes, nickels, and pennies, using the fewest coins possible,
– Has at most two dimes
– Has at most one nickel
– Has at most four pennies, and
– Cannot have two dimes and a nickel
Consequently, the amount of change in dimes, nickels, and pennies cannot exceed 24 cents
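Before the formal proof, the claim can also be checked computationally; the sketch below (function names and the range of amounts are my choices) compares the greedy coin count against a dynamic-programming optimum for every amount from 1 to 99 cents.

```python
def greedy_count(n, coins=(25, 10, 5, 1)):
    """Number of coins the greedy algorithm uses for n cents."""
    total = 0
    for c in coins:
        total += n // c
        n %= c
    return total

def optimal_count(n, coins=(25, 10, 5, 1)):
    """Minimum number of coins for n cents, by dynamic programming."""
    best = [0] + [float("inf")] * n
    for amount in range(1, n + 1):
        for c in coins:
            if c <= amount:
                best[amount] = min(best[amount], best[amount - c] + 1)
    return best[n]

# For US denominations, greedy matches the optimum for every amount checked.
assert all(greedy_count(n) == optimal_count(n) for n in range(1, 100))
```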

CSCE 235 Algorithms: An Introduction 21


Optimality of Change (2)
• Assume:
– q is the number of quarters returned by the greedy algorithm
– q’ is the number of quarters used by an optimal solution C’
• Three cases:
1. q’ > q. The greedy algorithm chooses the largest number of quarters possible, by construction. Thus, it is impossible to have q’ > q.
2. q’ < q. Since the greedy algorithm uses as many quarters as possible, n = 25q + r where r < 25. If q’ < q, then n = 25q’ + r’ where r’ ≥ 25. C’ will have to use more smaller coins to make up for the larger r’ (see the Lemma). Thus, C’ is not the optimal solution.
3. q’ = q. Then we continue the same argument on the next smaller denomination (e.g., dimes).

Eventually, we reach a contradiction.

• Thus, the greedy algorithm gives the optimal solution

CSCE 235 Algorithms: An Introduction 22


Greedy Algorithm: Another Example
• Check the problem of Scenario I, page 25 in the slides IntroductiontoCSE235.ppt
• We discussed then (remember?) a greedy algorithm for accommodating the maximum number of customers. The algorithm
– terminates, is complete, sound, and satisfies the maximum number of customers (finds an optimal solution)
– runs in time linear in the number of customers

CSCE 235 Algorithms: An Introduction 23


Summary
• Introduction & definition
• Algorithms categories & types
• Pseudo-code
• Designing an algorithm
– Example: Max
• Greedy Algorithms
– Example: Change

CSCE 235 Algorithms: An Introduction 24


Algorithms Analysis

Section 3.3 of Rosen


Outline
• Introduction
• Input Size
• Order of Growth
• Intractability
• Worst, Best, and Average Cases
• Mathematical Analysis of Algorithms
– 3 Examples
• Summation tools
CSCE 235 Algorithms: An Introduction 26
Introduction
• How can we say that one algorithm performs better than another one?
• Quantify the resources needed to run it:
– Time
– Memory
– I/O, disk access
– Circuit, power, etc.
• We want to study algorithms independent of
– Implementations
– Platforms
– Hardware
• We need an objective point of reference
– For that we measure time by the number of operations as a function of the
size of the input to the algorithm
– Time is not merely CPU clock cycles
CSCE 235 Algorithms: An Introduction 27
Input Size
• For a given problem, we characterize the input size n
appropriately
– Sorting: The number of items to be sorted
– Graphs: The number of vertices and/or edges
– Matrix manipulation: The number of rows and columns
– Numerical operations: the number of bits needed to represent a number
• The choice of an input size greatly depends on the elementary
operation: the most relevant or important operation of an
algorithm
– Comparisons
– Additions
– Multiplications

CSCE 235 Algorithms: An Introduction 28


Outline
• Introduction
• Input Size
• Order of Growth
• Intractability
• Worst, Best, and Average Cases
• Mathematical Analysis of Algorithms
– 3 Examples
• Summation tools
CSCE 235 Algorithms: An Introduction 29
Order of Growth (of Algorithms)
• Small input sizes can usually be computed instantaneously, thus we are most interested in how an algorithm performs as n → ∞
• Indeed, for small values of n, most such functions will be very similar in running time.
• Only for sufficiently large n do differences in running time become apparent: as n → ∞ the differences become more and more stark

CSCE 235 Algorithms: An Introduction 30


Intractability (of Problems)
• Problems that we can solve (today) only with exponential or super-
exponential time algorithms are said to be (likely) intractable. That is,
though they may be solved in a reasonable amount of time for small n, for
large n, there is (likely) no hope for efficient execution. It may take
millions or billions of years.
• Tractable problems are problems that have efficient (read: polynomial)
algorithms to solve them.
• Polynomial order of magnitude usually means that there exists a
polynomial p(n)=nk for some constant k that always bounds the order of
growth. More on asymptotics in the next lecture
• (Likely) Intractable problems (may) need to be solved using approximation
or randomized algorithms (except for small size of input)

CSCE 235 Algorithms: An Introduction 31


Worst, Best, and Average Case (of Algorithms)

• Some algorithms perform differently on various inputs of similar size. It is sometimes helpful to consider
– the worst-case,
– the best-case, and
– the average-case
performance of the algorithm
• Example: Search an array A of size n for a given value k
– Worst-case: k ∉ A, so we must check every item. Cost = n comparisons
– Best-case: k is the first item in the array. Cost = 1 comparison
– Average-case: Probabilistic analysis
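As a concrete reference for these three cases, here is a small Python sketch of the linear search being analyzed (the function name is mine):

```python
def linear_search(A, k):
    """Return the index of k in A, or -1 if k is not present.

    Best case:  k == A[0]   -> 1 comparison.
    Worst case: k not in A  -> n comparisons.
    """
    for i, item in enumerate(A):
        if item == k:           # the elementary operation: one comparison
            return i
    return -1
```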

CSCE 235 Algorithms: An Introduction 32


Average-Case: Example
• Since any worthwhile algorithm will be used quite extensively, the average
running time is arguably the best measure of the performance of the
algorithm (if the worst case is not frequently encountered).
• For searching an array and assuming that p is the probability of a successful search, we have
– Average cost of success: (1 + 2 + … + n)/n operations
– Cost of failure: n operations
Caverage(n) = Cost(success)·Prob(success) + Cost(failure)·Prob(failure)
  = [(1 + 2 + … + i + … + n)/n]·p + n·(1−p)
  = p·(n+1)/2 + n·(1−p)
– If p = 0 (search fails), Caverage(n) = n
– If p = 1 (search succeeds), Caverage(n) = (n+1)/2 ≈ n/2
Intuitively, the algorithm must examine on average half of all the elements in A
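The average-case formula can be sanity-checked empirically; in the rough simulation below (the helper name, n, and the number of trials are arbitrary choices), a successful search key sits at a uniformly random position and the mean number of comparisons is compared with (n+1)/2.

```python
import random

def comparisons_until_found(n):
    """Comparisons made by a linear search whose key sits at a uniformly random position."""
    position = random.randrange(n)   # 0-based index of the key
    return position + 1              # the search examines position + 1 elements

n, trials = 1000, 100_000
average = sum(comparisons_until_found(n) for _ in range(trials)) / trials
print(average, (n + 1) / 2)          # the two values should be close (about 500.5)
```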

CSCE 235 Algorithms: An Introduction 33


Average-Case: Importance
• Average-case analysis of algorithms is
important in a practical sense
• Often Cavg and Cworst have the same order of
magnitude and thus from a theoretical point
of view, are no different from each other
• Practical implementations, however, require a
real-world examination and empirical analysis

CSCE 235 Algorithms: An Introduction 34


Outline
• Introduction
• Input Size
• Order of Growth
• Intractability
• Worst, Best, and Average Cases
• Mathematical Analysis of Algorithms
– 3 Examples
• Summation tools
CSCE 235 Algorithms: An Introduction 35
Mathematical Analysis of Algorithms
• After developing a pseudo-code for an algorithm, we wish to
analyze its performance
– as a function of the size of the input, n,
– in terms of how many times the elementary operation is performed.
• Here is a general strategy
1. Decide on the parameter(s) characterizing the input, n
2. Identify the elementary (basic) operation
3. Evaluate whether the elementary operation depends only on n
4. Set up a summation corresponding to the number of elementary operations
5. Simplify the summation to obtain as simple a function f(n) as possible

CSCE 235 Algorithms: An Introduction 36


Algorithm Analysis: Example 1 (1)
UniqueElements
Input: Integer array A of size n
Output: True if all elements a ∈ A are distinct
1. For i ← 1 to n−1 Do
2.   For j ← i+1 to n Do
3.     If ai = aj
4.     Then Return false
5.     End
6.   End
7. End
8. Return true
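A direct Python sketch of UniqueElements, using the same quadratic pairwise comparison analyzed on the next slides (names are my own):

```python
def unique_elements(A):
    """Return True if all elements of A are distinct, using pairwise comparisons."""
    n = len(A)
    for i in range(n - 1):           # outer loop: i = 1 .. n-1
        for j in range(i + 1, n):    # inner loop: j = i+1 .. n
            if A[i] == A[j]:         # elementary operation: one comparison
                return False
    return True

# Example: unique_elements([4, 1, 7]) is True; unique_elements([4, 1, 4]) is False
```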

CSCE 235 Algorithms: An Introduction 37


Algorithm Analysis: Example 1 (2)
• For this algorithm, what is
– The elementary operation? – Comparing ai and aj
– Input size? – n, size of A
– Does the elementary operation depend only on n?

• The outer for-loop runs n−1 times. More formally, it contributes: Σ_{i=1}^{n−1}
• The inner for-loop depends on the outer for-loop and contributes: Σ_{j=i+1}^{n}

CSCE 235 Algorithms: An Introduction 38


Algorithm Analysis: Example 1 (3)
• We observe that the elementary operation executes once in each iteration, thus we have
Cworst(n) = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} 1 = Σ_{i=1}^{n−1} (n−i) = n(n−1)/2

CSCE 235 Algorithms: An Introduction 39


Computing the Summations
• Σ_{j=i+1}^{n} 1 = n − (i+1) + 1 = n − i; for example, Σ_{j=2}^{5} 1 = 1+1+1+1 = 5−2+1 = 4
• Computing Σ_{i=1}^{n−1} (n−i)
– Check Table 2, page 157:
– Rewrite: Σ_{i=1}^{n−1} (n−i) = Σ_{i=1}^{n−1} n − Σ_{i=1}^{n−1} i = n(n−1) − n(n−1)/2
• Finally, Cworst(n) = n(n−1) − n(n−1)/2 = n(n−1)/2

CSCE 235 Algorithms: An Introduction 40


Algorithm Analysis: Example 2 (1)
• The parity of a bit string indicates whether the number of 1s in it is even or odd
• It is used as a simple form of error detection over communication networks

CSCE 235 Algorithms: An Introduction 41


Algorithm Analysis: ParityChecking
ParityChecking
Input: An integer n in binary (as an array b[])
Output: 0 if parity is even, 1 otherwise
1. parity ← 0
2. While n > 0 Do
3.   If b[0] = 1 Then
4.     parity ← (parity + 1) mod 2
5.   End
6.   LeftShift(n)
7. End
8. Return parity
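A Python sketch of the parity check; here the shift step is interpreted as discarding the bit just examined (a right shift on the integer), an assumption made so that the loop terminates.

```python
def parity(n):
    """Return 0 if the binary representation of n has an even number of 1s, 1 otherwise."""
    p = 0
    while n > 0:
        if n & 1 == 1:       # examine the lowest bit, b[0]
            p = (p + 1) % 2
        n >>= 1              # discard the examined bit
    return p

# Examples: parity(0b1011) == 1 (three 1-bits), parity(0b1001) == 0 (two 1-bits)
```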

CSCE 235 Algorithms: An Introduction 42


Algorithm Analysis: Example 2 (2)
• For this algorithm, what is
– The elementary operation?
– Input size, n?
– Does the elementary operation depend only on n?
• The number of bits required to represent an integer n is log n
• The while-loop will be executed as many times as there are 1-bits
in the binary representation.
• In the worst case we have a bit string of all 1s
• So the running time is simply log n

CSCE 235 Algorithms: An Introduction 43


Algorithm Analysis: Example 3 (1)
MyFunction
Input: Integers n,m,p such that n>m>p
Output: Some function f(n,m,p)

1. For i ← 0 to 10 Do
2.   For j ← 0 to n Do
3.     For k ← 0 to m Do
4.       (constant-time elementary operation)
5.     End
6.   End
7. End
8. Return
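Since only the loop bounds drive the cost, the sketch below counts how often the innermost operation executes, using the bounds reconstructed above (i from 0 to 10, j from 0 to n, k from 0 to m); the function name and the counting operation are illustrative assumptions.

```python
def my_function_cost(n, m, p):
    """Count executions of the innermost operation; p never affects the loop bounds."""
    count = 0
    for i in range(11):              # 11 iterations, independent of the input size
        for j in range(n + 1):       # n + 1 iterations
            for k in range(m + 1):   # m + 1 iterations
                count += 1           # the elementary operation
    return count

# my_function_cost(n, m, p) == 11 * (n + 1) * (m + 1) for any n > m > p
```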

CSCE 235 Algorithms: An Introduction 44


Algorithm Analysis: Example 3 (2)
• Outer for-loop: executes 11 times, but does
not depend on input size
• 2nd for-loop: executes n+1 times
• 3rd for-loop: executes m+1 times
• Thus, the cost is C(n,m,p)=11(n+1)(m+1)
• And we do NOT need to consider p

CSCE 235 Algorithms: An Introduction 45


Outline
• Introduction
• Input Size
• Order of Growth
• Intractability
• Worst, Best, and Average Cases
• Mathematical Analysis of Algorithms
– 3 Examples
• Summation tools
CSCE 235 Algorithms: An Introduction 46
Summation Tools
• Table 2, Section 2.4 (page 166) has more summation rules, which will be useful
• You can always use Maple to evaluate and simplify complex expressions
– But you should know how to do them by hand!
• To use Maple on cse you can use the command-line interface by typing maple
• Under Unix (GNOME or KDE) or via an X-windows interface, you can use the graphical version via xmaple

CSCE 235 Algorithms: An Introduction 47


Summation Tools: Maple
> simplify(sum(i, i=0..n));
      (1/2)·n^2 + (1/2)·n
> Sum(Sum(j, j=i..n), i=0..n);
      Σ_{i=0}^{n} ( Σ_{j=i}^{n} j )

CSCE 235 Algorithms: An Introduction 48


Summary
• Introduction
• Input Size
• Order of Growth
• Intractability
• Worst, Best, and Average Cases
• Mathematical Analysis of Algorithms
– 3 Examples
• Summation tools
CSCE 235 Algorithms: An Introduction 49
Asymptotics

Section 3.2 of Rosen


Outline
• Introduction
• Asymptotic
– Definitions (Big O, Omega, Theta), properties
• Proof techniques
– 3 examples, trick for polynomials of degree 2,
– Limit method (l’Hôpital Rule), 2 examples
• Limit Properties
• Complexity of algorithms
• Conclusions

CSCE 235 Algorithms: An Introduction 51


Introduction (1)
• In practice, specific hardware, implementations, languages, etc. greatly affect how an algorithm behaves
• Our goal is to study and analyze the behavior of
algorithms in and of themselves, independently of
such factors
• We have seen how to mathematically evaluate the
cost functions of algorithms with respect to
– their input size and
– their elementary operations
CSCE 235 Algorithms: An Introduction 52
Introduction (2)
• However, it suffices to simply measure a cost function’s
asymptotic behavior
• We are interested only in the Order of Growth of an
algorithm’s complexity
• How well does the algorithm perform as the size of the input grows: n → ∞
• For example
– An algorithm that executes its elementary operation 10000n times is
better than one that executes it 5n2 times
– Also, algorithms that have running time n2 and 2000n2 are considered
asymptotically equivalent

CSCE 235 Algorithms: An Introduction 53


Introduction (3): Magnitude Graph

CSCE 235 Algorithms: An Introduction 54


Outline
• Introduction
• Asymptotic
– Definitions (Big-O, Omega, Theta), properties
• Proof techniques
• Limit Properties
• Efficiency classes
• Conclusions

CSCE 235 Algorithms: An Introduction 55


Big-O Definition
• Definition: Let f and g be two functions f,g: N → R+. We say that
f(n) ∈ O(g(n))
(read: f is Big-O of g) if there exist a constant c ∈ R+ and an n0 ∈ N such that for every integer n ≥ n0 we have
f(n) ≤ c·g(n)
• Big-O is actually Omicron, but it suffices to write “O”
• Intuition: f is asymptotically less than or equal to g
• Big-O gives an asymptotic upper bound   \mathcal{O}

CSCE 235 Algorithms: An Introduction 56


Big-Omega Definition
• Definition: Let f and g be two functions f,g: N → R+. We say that
f(n) ∈ Ω(g(n))
(read: f is Big-Omega of g) if there exist a constant c ∈ R+ and an n0 ∈ N such that for every integer n ≥ n0 we have
f(n) ≥ c·g(n)
• Intuition: f is asymptotically greater than or equal to g
• Big-Omega gives an asymptotic lower bound   \Omega()
CSCE 235 Algorithms: An Introduction 57
Big-Theta Definition
• Definition: Let f and g be two functions f,g: N → R+. We say that
f(n) ∈ Θ(g(n))
(read: f is Big-Theta of g) if there exist constants c1, c2 ∈ R+ and an n0 ∈ N such that for every integer n ≥ n0 we have
c1·g(n) ≤ f(n) ≤ c2·g(n)
• Intuition: f is asymptotically equal to g
• f is bounded above and below by g
• Big-Theta gives an asymptotic equivalence   \Theta()

CSCE 235 Algorithms: An Introduction 58


Asymptotic Properties (1)
• Theorem: For f1(n) ∈ O(g1(n)) and f2(n) ∈ O(g2(n)), we have
f1(n) + f2(n) ∈ O( max{g1(n), g2(n)} )
• This property implies that we can ignore lower-order terms. In particular, for any polynomial of degree k such as
p(n) = a·n^k + b·n^(k−1) + c·n^(k−2) + …,
p(n) ∈ O(n^k)
More accurately, p(n) ∈ Θ(n^k)
• In addition, this theorem gives us a justification for ignoring constant coefficients. That is, for any function f(n) and a positive constant c,
c·f(n) ∈ Θ(f(n))
CSCE 235 Algorithms: An Introduction 59
Asymptotic Properties (2)
• Some obvious properties also follow from the
definitions
• Corollary: For positive functions f(n) and g(n) the following hold:
– f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) ⋀ f(n) ∈ Ω(g(n))
– f(n) ∈ O(g(n)) ⇔ g(n) ∈ Ω(f(n))
The proof is obvious and left as an exercise

CSCE 235 Algorithms: An Introduction 60


Outline
• Introduction
• Asymptotic
– Definitions (big O, Omega, Theta), properties
• Proof techniques
– 3 examples, trick for polynomials of degree 2
– Limit method (l’Hôpital Rule), 2 examples
• Limit Properties
• Efficiency classes
• Conclusions

CSCE 235 Algorithms: An Introduction 61


Asymptotic Proof Techniques
• Proving an asymptotic relationship between two given functions f(n) and g(n) can be done intuitively for most of the functions you will encounter; all polynomials, for example
• However, this does not suffice as a formal proof
• Proving a relationship of the form f(n) ∈ ☐(g(n)), where ☐ is O, Ω, or Θ, can be done using the definitions, that is
– Find a value for c (or c1 and c2)
– Find a value for n0
(But the above is not the only way.)

CSCE 235 Algorithms: An Introduction 62


Asymptotic Proof Techniques: Example A

Example: Let f(n) = 21n^2 + n and g(n) = n^3
• Our intuition should tell us that f(n) ∈ O(g(n))
• Simply using the definition confirms this:
21n^2 + n ≤ c·n^3
holds for, say, c = 3 and for all n ≥ n0 = 8
• So we found a pair c = 3 and n0 = 8 that satisfies the conditions required by the definition. QED
• In fact, an infinite number of pairs can satisfy this inequality
CSCE 235 Algorithms: An Introduction 63
Asymptotic Proof Techniques: Example B (1)

• Example: Let f(n) = n^2 + n and g(n) = n^3. Find a tight bound of the form f(n) ∈ ☐(g(n))
• Our intuition tells us that f(n) ∈ O(g(n))
• Let’s prove it formally

CSCE 235 Algorithms: An Introduction 64


Example B: Proof
• If n1 it is clear that
1. n  n3 and
2. n2  n3
• Therefore, we have, as 1. and 2.:
n2+n  n3 + n3 = 2n3
• Thus, for n0=1 and c=2, by the definition of
Big-O we have that f(n)=n2+n  O(g(n3))

CSCE 235 Algorithms: An Introduction 65


Asymptotic Proof Techniques: Example C (1)

• Example: Let f(n) = n^3 + 4n^2 and g(n) = n^2. Find a tight bound of the form f(n) ∈ ☐(g(n))
• Here, our intuition tells us that f(n) ∈ Ω(g(n))
• Let’s prove it formally

CSCE 235 Algorithms: An Introduction 66


Example C: Proof
• For n1, we have n2  n3
• For n0, we have n3  n3 + 4n2

• Thus n1, we have n2  n3  n3 + 4n2


• Thus, by the definition of Big- , for n0=1 and
c=1 we have that f(n)=n3+4n2  (g(n2))

CSCE 235 Algorithms: An Introduction 67


Asymptotic Proof Techniques:
Trick for polynomials of degree 2
• If you have a polynomial of degree 2, say p(n) = a·n^2 + b·n + c with a > 0, you can prove that p(n) ∈ Θ(n^2) by exhibiting the following values required by the definition:
1. the constants c1 and c2, and
2. the threshold n0
(one standard choice is sketched below)
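One standard choice of witnesses, easily verified from the Big-Theta definition (the specific constants are my suggestion, not recovered from the slide):

```latex
% For p(n) = an^2 + bn + c with a > 0, the choice
\[
  c_1 = \frac{a}{4}, \qquad
  c_2 = \frac{7a}{4}, \qquad
  n_0 = 2\max\!\left(\frac{|b|}{a}, \sqrt{\frac{|c|}{a}}\right)
\]
% works, because for all n \ge n_0 we have |b|\,n \le \frac{a}{2}n^2 and |c| \le \frac{a}{4}n^2, hence
\[
  \frac{a}{4}\,n^2 \;\le\; a n^2 + b n + c \;\le\; \frac{7a}{4}\,n^2 .
\]
```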

CSCE 235 Algorithms: An Introduction 68


Outline
• Introduction
• Asymptotic
– Definitions (big O, Omega, Theta), properties
• Proof techniques
– 3 examples, trick for polynomials of degree 2,
– Limit method (l’Hôpital Rule), 2 examples
• Limit Properties
• Efficiency classes
• Conclusions

CSCE 235 Algorithms: An Introduction 69


Limit Method: Motivation
• Now try this one:
f(n) = n^50 + 12n^3·log^4 n − 1243n^12 + 245n^6·log n + 12log^3 n − log n
g(n) = 12n^50 + 24·log^14 n^43 − (log n)/n^5 + 12
• Using the formal definitions can be very tedious, especially when one has very complex functions
• It is much better to use the Limit Method, which uses
concepts from Calculus

CSCE 235 Algorithms: An Introduction 70


Limit Method: The Process
• Say we have functions f(n) and g(n). We set up a limit quotient between f and g as follows:
lim_{n→∞} f(n)/g(n) = 0       Then f(n) ∈ O(g(n))
lim_{n→∞} f(n)/g(n) = c > 0   Then f(n) ∈ Θ(g(n))
lim_{n→∞} f(n)/g(n) = ∞       Then f(n) ∈ Ω(g(n))
• The above can be proven using calculus, but for our purposes, the limit method is sufficient for showing asymptotic inclusions
• Always try to look for algebraic simplifications first
• If f and g both converge to zero or both diverge to infinity, then you need to apply the l’Hôpital Rule
CSCE 235 Algorithms: An Introduction 71
(Guillaume de) L’Hôpital Rule
• Theorem (L’Hôpital Rule):
– Let f and g be two functions (whose limits both go to 0 or both diverge),
– if the limit of their quotient exists,
– then it is equal to the limit of the quotient of the derivatives of the numerator and the denominator:
lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n)

CSCE 235 Algorithms: An Introduction 72


Useful Derivatives
• Some useful derivatives that you should memorize, including the product rule, are listed below
• Be careful with the derivative of a logarithm: the base matters
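A representative selection, written here from standard calculus (the particular entries are my choice):

```latex
\[
  (\ln x)' = \frac{1}{x}, \qquad
  (\log_b x)' = \frac{1}{x \ln b} \ \text{(careful: the base matters!)}, \qquad
  (x^k)' = k\,x^{k-1},
\]
\[
  (c^x)' = c^x \ln c, \qquad
  \bigl(f(x)\,g(x)\bigr)' = f'(x)\,g(x) + f(x)\,g'(x) \ \text{(product rule)}.
\]
```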

CSCE 235 Algorithms: An Introduction 73


Useful log Identities
• Change of base formula: log_b x = (log_c x) / (log_c b)

CSCE 235 Algorithms: An Introduction 74


L’Hôpital Rule: Justification (1)
• Why do we have to use L’Hôpital’s Rule?
• Consider the following function: f(x) = (sin x)/x
• Clearly sin 0 = 0, so you may say that as x → 0, f(x) → 0
• However, the denominator also → 0, so you may say that f(x) → ∞
• Both are wrong
CSCE 235 Algorithms: An Introduction 75
L’Hôpital Rule: Justification (2)
• Observe the graph of f(x)= (sin x)/x = sinc x

CSCE 235 Algorithms: An Introduction 76


L’Hôpital Rule: Justification (3)
• Clearly, though f(x) is undefined at x=0, the
limit still exists
• Applying the L’Hôpital Rule gives us the correct answer:
lim_{x→0} (sin x)/x = lim_{x→0} (sin x)′/(x)′ = lim_{x→0} (cos x)/1 = 1

CSCE 235 Algorithms: An Introduction 77


Limit Method: Example 1
• Example: Let f(n) = 2^n, g(n) = 3^n. Determine a tight inclusion of the form f(n) ∈ ☐(g(n))
• What is your intuition in this case? Which function grows quicker?

CSCE 235 Algorithms: An Introduction 78


Limit Method: Example 1—Proof A
• Proof using limits
• We set up our limit: lim_{n→∞} f(n)/g(n) = lim_{n→∞} 2^n/3^n
• Using the L’Hôpital Rule gets you nowhere
• Both the numerator and the denominator still diverge. We’ll have to use an algebraic simplification

CSCE 235 Algorithms: An Introduction 79


Limit Method: Example 1—Proof B
• Using algebra: lim_{n→∞} 2^n/3^n = lim_{n→∞} (2/3)^n
• Now we use the following theorem (without proof):
lim_{n→∞} α^n = 0 if α < 1
             = 1 if α = 1
             = ∞ if α > 1
• Therefore we conclude that lim_{n→∞} (2/3)^n converges to zero, thus 2^n ∈ O(3^n)
CSCE 235 Algorithms: An Introduction 80
Limit Method: Example 2 (1)
• Example: Let f(n) = log2 n, g(n) = log3 n^2. Determine a tight inclusion of the form f(n) ∈ ☐(g(n))
• What is your intuition in this case?

CSCE 235 Algorithms: An Introduction 81


Limit Method: Example 2 (2)
• We prove it using limits
• We set up our limit:
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (log2 n)/(log3 n^2) = lim_{n→∞} (log2 n)/(2·log3 n)
• Here we use the change of base formula for logarithms: log_x n = (log_y n)/(log_y x)
• Thus: log3 n = (log2 n)/(log2 3)

CSCE 235 Algorithms: An Introduction 82


Limit Method: Example 2 (3)
• Computing our limit:
lim_{n→∞} (log2 n)/(2·log3 n) = lim_{n→∞} (log2 n)·(log2 3)/(2·log2 n)
  = lim_{n→∞} (log2 3)/2
  = (log2 3)/2 ≈ 0.7924, which is a positive constant
• So we conclude that f(n) ∈ Θ(g(n))

CSCE 235 Algorithms: An Introduction 83


Outline
• Introduction
• Asymptotic
– Definitions (big O, Omega, Theta), properties
• Proof techniques
– 3 examples, trick for polynomials of degree 2,
– Limit method (l’Hôpital Rule), 2 examples
• Limit Properties
• Efficiency classes
• Conclusions

CSCE 235 Algorithms: An Introduction 84


Limit Properties
• A useful property of limits is that the composition of functions is preserved
• Lemma: For a composition ∘ that is addition, subtraction, multiplication, or division, if the limits exist (that is, they converge), then
lim_{n→∞} f1(n) ∘ lim_{n→∞} f2(n) = lim_{n→∞} ( f1(n) ∘ f2(n) )

CSCE 235 Algorithms: An Introduction 85


Complexity of Algorithms—Table 1, page 226

• Constant          O(1)
• Logarithmic       O(log n)
• Linear            O(n)
• Polylogarithmic   O(log^k n)
• Quadratic         O(n^2)
• Cubic             O(n^3)
• Polynomial        O(n^k) for any k > 0
• Exponential       O(k^n), where k > 1
• Factorial         O(n!)
CSCE 235 Algorithms: An Introduction 86
Conclusions
• Evaluating asymptotics is easy, but remember:
– Always look for algebraic simplifications
– You must always give a rigorous proof
– Using the limit method is (almost) always the best
– Use L’Hôpital Rule if need be
– Give as simple and tight expressions as possible

CSCE 235 Algorithms: An Introduction 87


Summary
• Introduction
• Asymptotic
– Definitions (big O, Omega, Theta), properties
• Proof techniques
– 3 examples, trick for polynomials of degree 2,
– Limit method (l’Hôpital Rule), 2 examples
• Limit Properties
• Efficiency classes
• Conclusions

CSCE 235 Algorithms: An Introduction 88
