Algorithms

Fundamentals

Input → Algorithm → Output


An algorithm is a step-by-step procedure for
solving a problem in a finite amount of time.

Properties an algorithm must satisfy:
• Input
• Output
• Definiteness
• Finiteness
• Correctness
First example: LinearSearch(A, n)
Input: A – array of integers, n – an integer
Output: location of n if n is present in the array A, otherwise -1

i = 0
while i < size(A) && A[i] != n
    i = i + 1
if i < size(A) return i
else return -1

Note: the bounds check i < size(A) must come before the access A[i]; otherwise the loop reads past the end of the array when n is absent.
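A runnable Python sketch of this linear search (using len() in place of size(); the function name is my own choice):

```python
def linear_search(A, n):
    """Return the index of n in A, or -1 if n is not present."""
    i = 0
    # Test the bound before reading A[i], so an absent n
    # cannot cause an out-of-range access.
    while i < len(A) and A[i] != n:
        i += 1
    return i if i < len(A) else -1
```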
Second example: BinarySearch(A, n)
Input: A – sorted array of integers, n – an integer
Output: location of n if n is present in the array A, otherwise -1

i = 0, j = size(A)
while i < j
    mid = (i + j) / 2
    if A[mid] >= n
        j = mid
    else
        i = mid + 1
if i < size(A) && A[i] == n return i
else return -1

Note two fixes: the comparison must be A[mid] >= n (with <=, the half that contains n is discarded), and A[i] may only be read after checking i < size(A), since i equals size(A) when n is larger than every element.
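A Python sketch of this binary search (a lower-bound variant: the loop keeps the half that can still contain n, and the final read of A[i] happens only after a bounds check):

```python
def binary_search(A, n):
    """Return an index of n in the sorted array A, or -1 if absent."""
    i, j = 0, len(A)
    while i < j:
        mid = (i + j) // 2
        if A[mid] >= n:
            j = mid          # n is at mid or to its left
        else:
            i = mid + 1      # A[mid] < n, so n must be to the right
    return i if i < len(A) and A[i] == n else -1
```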
Third example: BubbleSort(A, n)
Input: A – array of integers, n – size of the array
Output: array A with the numbers arranged in ascending order

for (i = 0; i < n; i++)
    for (j = 0; j < n - i - 1; j++)
        if A[j] > A[j+1]
            swap A[j] and A[j+1]

Note: the inner bound must be n - i - 1, not n - i; otherwise A[j+1] indexes past the end of the array on the first pass.
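A Python sketch of the bubble sort above (with the inner bound n - i - 1 keeping A[j + 1] inside the array):

```python
def bubble_sort(A):
    """Sort the list A in place in ascending order and return it."""
    n = len(A)
    for i in range(n):
        # After pass i, the largest i elements sit at the end,
        # so the inner loop can stop i positions earlier.
        for j in range(n - i - 1):
            if A[j] > A[j + 1]:
                A[j], A[j + 1] = A[j + 1], A[j]
    return A
```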
The Random Access Machine (RAM) Model
• A CPU
• A potentially unbounded bank of memory cells, each of which can hold an arbitrary number or character

Memory cells are numbered, and accessing any cell in memory takes unit time.
Primitive Operations
• Basic computations performed by an algorithm
• Largely independent from the programming language
• Assumed to take a constant amount of time in the RAM model

Examples:
• Evaluating an expression
• Assigning a value to a variable
• Indexing into an array
• Calling a method
• Returning from a method
Counting Primitive Operations
• By inspecting the pseudocode, we can determine the
maximum number of primitive operations executed by an
algorithm, as a function of the input size

Algorithm arrayMax(A, n)                    # operations
    currentMax ← A[0]                        1
    for i ← 1 to n − 1 do                    1 + n
        if A[i] > currentMax then            n − 1
            currentMax ← A[i]                n − 1
        { increment counter i }              n − 1
    return currentMax                        1
                                    Total:   4n
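The counts above can be read off a direct Python sketch of arrayMax (function name mine; the worst case for the inner assignment is a strictly increasing input, where it runs on every iteration):

```python
def array_max(A):
    """Return the maximum element of a non-empty array A."""
    current_max = A[0]              # 1 operation
    for i in range(1, len(A)):      # loop control: about n operations
        if A[i] > current_max:      # n - 1 comparisons
            current_max = A[i]      # up to n - 1 assignments
                                    # (all n - 1 on increasing input)
    return current_max              # 1 operation
```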
Growth Rates
Growth rates of functions:
• Linear ≈ n
• Quadratic ≈ n²
• Cubic ≈ n³

In a log-log chart, the slope of the line corresponds to the growth rate of the function.

[Figure: log-log plot of T(n) versus n; the cubic, quadratic, and linear functions appear as straight lines of slope 3, 2, and 1.]
Constant Factors
The growth rate is not affected by
• constant factors or
• lower-order terms

Examples:
• 10²n + 10⁵ is a linear function
• 10⁵n² + 10⁸n is a quadratic function

[Figure: log-log plot of T(n) versus n; each example function runs parallel to its base function (linear or quadratic).]
Goal: to simplify analysis by discarding unneeded information (a kind of "rounding": 1,000,001 ≈ 1,000,000).
Asymptotic Notation
Big-Oh Notation

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n₀ such that f(n) ≤ c·g(n) for n ≥ n₀.

Example: 2n + 10 is O(n)
    2n + 10 ≤ cn
    (c − 2)n ≥ 10
    n ≥ 10/(c − 2)
Pick c = 3 and n₀ = 10.
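As an informal numerical spot-check (not a proof), the constants c = 3 and n₀ = 10 from the example can be tested in Python; the helper name witnesses_big_oh is my own:

```python
def witnesses_big_oh(f, g, c, n0, limit=100_000):
    """Spot-check f(n) <= c * g(n) for every n in [n0, limit).

    This samples only a finite range, so it can refute a claimed
    (c, n0) pair but never prove one.
    """
    return all(f(n) <= c * g(n) for n in range(n0, limit))

# 2n + 10 <= 3n holds for all n >= 10 ...
print(witnesses_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, n0=10))
# ... but 2n + 10 <= 2n holds for no n, so c = 2 cannot work.
print(witnesses_big_oh(lambda n: 2 * n + 10, lambda n: n, c=2, n0=1))
```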
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors

Use the smallest possible class of functions:
    "2n is O(n)" instead of "2n is O(n²)"

Use the simplest expression of the class:
    "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
Asymptotic Notation
Big-Omega Notation

Given functions f(n) and g(n), we say that f(n) is Ω(g(n)) if there are positive constants c and n₀ such that f(n) ≥ c·g(n) for n ≥ n₀.

Example: 2n + 10 is Ω(n)
    2n + 10 ≥ cn
Pick c = 1 and n₀ = 1.
Asymptotic Notation
Theta Notation

Given functions f(n) and g(n), we say that f(n) is Θ(g(n)) if there are positive constants c₁, c₂ and n₀ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for n ≥ n₀.

Example: 2n + 10 is Θ(n).
Problem
Order the following functions by their asymptotic growth rates:
• n log n
• log n³
• n²
• n^(2/5)
• 2^(log n)
• log(log n)
• √(log n)
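One way to build intuition (not a proof) is to evaluate each function at a single large n, where the values already line up with the asymptotic ordering. This sketch assumes base-2 logarithms, reads the fourth function as n raised to the 2/5 power, and the last as the square root of log n:

```python
import math

n = 2 ** 20  # one large input size
funcs = {
    "log(log n)":  math.log2(math.log2(n)),
    "sqrt(log n)": math.sqrt(math.log2(n)),
    "log n^3":     3 * math.log2(n),    # log n^3 = 3 log n, still logarithmic
    "n^(2/5)":     n ** (2 / 5),
    "2^(log n)":   2 ** math.log2(n),   # with base-2 logs this is just n
    "n log n":     n * math.log2(n),
    "n^2":         n ** 2,
}
ordered = sorted(funcs, key=funcs.get)
print(ordered)
```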
Solvable problems:
1. Tractable problems:
Can be solved by an algorithm with polynomial worst-case time complexity.

2. Intractable problems:
Cannot be solved by an algorithm with polynomial worst-case time complexity.

Unsolvable problems:
-- the Halting Problem
Worst-case complexity
The number of operations the algorithm needs to guarantee an output in the worst case.

Average-case complexity
The average, over the different inputs, of the number of operations the algorithm needs to guarantee an output.
LinearSearch(A, n)
Input: A – array of integers, n – an integer
Output: location of n if n is present in the array A, otherwise -1

i = 0
while i < size(A) && A[i] != n
    i = i + 1
if i < size(A) return i
else return -1
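A small empirical sketch contrasting the two measures for this linear search, under the assumptions that cost is counted in loop iterations and a present target is equally likely to be at any position:

```python
def linear_search_steps(A, n):
    """Linear search that also reports the number of loop iterations."""
    i = 0
    while i < len(A) and A[i] != n:
        i += 1
    return (i if i < len(A) else -1), i

A = list(range(100))

# Worst case: the target is absent, so every cell is examined.
_, worst = linear_search_steps(A, -1)

# Average case over all present targets, each equally likely.
avg = sum(linear_search_steps(A, k)[1] for k in A) / len(A)

print(worst)  # n iterations
print(avg)    # (n - 1) / 2 iterations
```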
