
Order of a function

Let f and g be two functions defined on the positive integers. Then f(n) ∈ O(g(n)) if there exist positive constants C and n0 such that f(n) ≤ C · g(n) for all n ≥ n0. We say f is in big-Oh of g.

Example 1. To prove 10n^2 ∈ O(2^n / 10) it suffices to prove that 10n^2 ≤ 100 · (2^n / 10) for all n ≥ 4 (apply the above definition with C = 100 and n0 = 4). Equivalently, we prove n^2 ≤ 2^n for n ≥ 4.
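As a quick sanity check (not a proof, and not part of the original notes), the inequality n^2 ≤ 2^n can be verified numerically; a minimal Python sketch:

    # Sanity check (not a proof): verify n^2 <= 2^n on a finite range,
    # starting at the claimed threshold n0 = 4.
    for n in range(4, 65):
        assert n * n <= 2 ** n, f"inequality fails at n = {n}"
    print("n^2 <= 2^n holds for all n in 4..64")

(An actual proof goes by induction on n: assuming n^2 ≤ 2^n, we get (n+1)^2 ≤ 2n^2 ≤ 2 · 2^n = 2^(n+1), where (n+1)^2 ≤ 2n^2 holds for all n ≥ 3.)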
log n ∈ O(n) and n ∈ O(n log n), but n ∉ O(log n).

Hierarchy of increasing orders:

O(1) ⊆ O(log n) ⊆ O(n) ⊆ O(n log n) ⊆ O(n^2) ⊆ O(n^3) ⊆ O(2^n) ⊆ O(n!)

f(n) is of polynomial order if f(n) ∈ O(n^k) for some k ≥ 0. Algorithms whose complexity is of polynomial order are considered to be efficient. An algorithm is called polynomial if its complexity is of polynomial order.

P = NP and NP-completeness

P = { decision problems that can be solved in polynomial time }, e.g. "Is G bipartite?" ∈ P.

NP = { decision problems whose yes-instances can be verified in polynomial time }, e.g. "Does G have a Hamiltonian cycle?" ∈ NP.

Note: P ⊆ NP. Big open problem: does P = NP?

NP-complete = { X ∈ NP : if X has a polynomial-time algorithm, then every problem in NP has a polynomial-time algorithm }, e.g. "Does G have a Hamiltonian cycle?" is NP-complete. NP-complete problems are the hardest problems in NP.
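To make "verified in polynomial time" concrete, here is a minimal Python sketch of a certificate verifier for the Hamiltonian cycle problem (the adjacency-set graph representation and the function name are illustrative assumptions, not from the notes): given a graph and a claimed cycle, it checks the claim in polynomial time.

    # NP verifier sketch: check whether `cycle` (a claimed ordering of
    # the vertices) really is a Hamiltonian cycle of `graph`.
    def verify_hamiltonian_cycle(graph, cycle):
        # The cycle must visit every vertex exactly once.
        if sorted(cycle) != sorted(graph):
            return False
        # Consecutive vertices (wrapping around) must be adjacent.
        n = len(cycle)
        return all(cycle[(i + 1) % n] in graph[cycle[i]] for i in range(n))

    # Example: the 4-cycle on vertices 0..3.
    G = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
    print(verify_hamiltonian_cycle(G, [0, 1, 2, 3]))  # True
    print(verify_hamiltonian_cycle(G, [0, 2, 1, 3]))  # False: 0 and 2 not adjacent

The asymmetry is the point: no polynomial-time algorithm is known for finding a Hamiltonian cycle, but checking a proposed one is easy, and that ease of checking is exactly what membership in NP requires.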

Searching and sorting

Sequential searching algorithm

Problem 2. Given a list of words w(1), ..., w(n), find if a given word KEY is in the list.

Algorithm SequentialSearch(KEY, w(1), ..., w(n))
    for i = 1, 2, ..., n do
        if KEY = w(i) then output YES and exit
    end-for
    output NO

If there are n words, this takes up to n comparisons, so the time complexity is O(n), assuming the words have O(1) length. This is called worst-case complexity. (If we are lucky we might stop much earlier.) This subject will only consider worst-case complexity. Other notions of complexity exist, e.g. average-case complexity.
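For reference, the pseudocode translates directly into Python; a minimal sketch (the function name and YES/NO return values are illustrative, not from the notes):

    # Sequential search: compare KEY with each word in turn;
    # the worst case makes n comparisons, i.e. O(n) time.
    def sequential_search(key, words):
        for w in words:
            if w == key:
                return "YES"
        return "NO"

    print(sequential_search("cat", ["dog", "cat", "bird"]))  # YES
    print(sequential_search("fox", ["dog", "cat", "bird"]))  # NO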
