
Legend: Good · Fair · Poor (the color key from the original chart).

## Searching

| Algorithm | Data Structure | Time Complexity (Worst) | Space Complexity (Worst) |
|---|---|---|---|
| Depth First Search (DFS) | Graph of \|V\| vertices and \|E\| edges | O(\|E\| + \|V\|) | O(\|V\|) |
| Breadth First Search (BFS) | Graph of \|V\| vertices and \|E\| edges | O(\|E\| + \|V\|) | O(\|V\|) |
| Binary search | Sorted array of n elements | O(log(n)) | O(1) |
| Linear (Brute Force) | Array | O(n) | O(1) |
| Shortest path by Dijkstra, using a Min-heap as priority queue | Graph with \|V\| vertices and \|E\| edges | O((\|V\| + \|E\|) log(\|V\|)) | O(\|V\|) |
| Shortest path by Dijkstra, using an unsorted array as priority queue | Graph with \|V\| vertices and \|E\| edges | O(\|V\|^2) | O(\|V\|) |
| Shortest path by Bellman-Ford | Graph with \|V\| vertices and \|E\| edges | O(\|V\| \|E\|) | O(\|V\|) |
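As a runnable sketch of two rows in the table above, here are a minimal iterative binary search and an iterative DFS in Python (function names and the dict-of-lists graph encoding are illustrative assumptions, not part of the chart):

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent.
    O(log(n)) time, O(1) auxiliary space: the search range is halved
    on every iteration instead of recursing."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


def dfs_order(adj, start):
    """Iterative depth-first traversal of an adjacency-list graph.
    Each vertex and edge is handled once: O(|V| + |E|) time; the
    seen set and stack give the O(|V|) space bound from the table."""
    seen, stack, order = set(), [start], []
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        order.append(v)
        stack.extend(adj[v])
    return order
```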

## Sorting

| Algorithm | Data Structure | Time: Best | Time: Average | Time: Worst | Worst Case Auxiliary Space |
|---|---|---|---|---|---|
| Quicksort | Array | O(n log(n)) | O(n log(n)) | O(n^2) | O(n) |
| Mergesort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(n) |
| Heapsort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(1) |
| Bubble Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Insertion Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Selection Sort | Array | O(n^2) | O(n^2) | O(n^2) | O(1) |
| Bucket Sort | Array | O(n+k) | O(n+k) | O(n^2) | O(nk) |
| Radix Sort | Array | O(nk) | O(nk) | O(nk) | O(n+k) |
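As a concrete instance of the O(n log(n))-in-every-case row, a minimal mergesort sketch in Python (the function name and list-returning interface are illustrative):

```python
def merge_sort(a):
    """Stable mergesort: O(n log(n)) time best, average, and worst,
    at the cost of O(n) auxiliary space for the merge buffers."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    # Merge the two sorted halves; <= keeps equal keys stable.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```

Unlike quicksort, there is no adversarial input that degrades it to O(n^2); the halving is unconditional.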

## Data Structures

Time Complexity (Average):

| Data Structure | Indexing | Search | Insertion | Deletion |
|---|---|---|---|---|
| Basic Array | O(1) | O(n) | – | – |
| Dynamic Array | O(1) | O(n) | O(n) | O(n) |
| Singly-Linked List | O(n) | O(n) | O(1) | O(1) |
| Doubly-Linked List | O(n) | O(n) | O(1) | O(1) |
| Skip List | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Hash Table | – | O(1) | O(1) | O(1) |
| Binary Search Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Cartesian Tree | – | O(log(n)) | O(log(n)) | O(log(n)) |
| B-Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Red-Black Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Splay Tree | – | O(log(n)) | O(log(n)) | O(log(n)) |
| AVL Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |

Time Complexity (Worst) and Space Complexity (Worst):

| Data Structure | Indexing | Search | Insertion | Deletion | Space |
|---|---|---|---|---|---|
| Basic Array | O(1) | O(n) | – | – | O(n) |
| Dynamic Array | O(1) | O(n) | O(n) | O(n) | O(n) |
| Singly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) |
| Doubly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) |
| Skip List | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
| Hash Table | – | O(n) | O(n) | O(n) | O(n) |
| Binary Search Tree | O(n) | O(n) | O(n) | O(n) | O(n) |
| Cartesian Tree | – | O(n) | O(n) | O(n) | O(n) |
| B-Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Red-Black Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Splay Tree | – | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| AVL Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
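To make the Binary Search Tree rows concrete, a minimal unbalanced BST sketch in Python (class and function names are illustrative). Average insert/search is O(log(n)), but inserting already-sorted keys degenerates the tree into a linked list, which is exactly the O(n) worst-case column:

```python
class BSTNode:
    """One node of an unbalanced binary search tree."""
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None


def bst_insert(root, key):
    """Insert key and return the (possibly new) root.
    O(log(n)) average; O(n) worst case on a degenerate tree."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root


def bst_search(root, key):
    """Walk down one branch per comparison; depth bounds the cost."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None
```

Self-balancing variants (Red-Black, AVL, B-Tree rows) pay extra work per insert to keep the depth at O(log(n)) even in the worst case.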

## Heaps

Time Complexity:

| Heaps | Heapify | Find Max | Extract Max | Increase Key | Insert | Delete | Merge |
|---|---|---|---|---|---|---|---|
| Linked List (sorted) | – | O(1) | O(1) | O(n) | O(n) | O(1) | O(m+n) |
| Linked List (unsorted) | – | O(n) | O(n) | O(1) | O(1) | O(1) | O(1) |
| Binary Heap | O(n) | O(1) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(m+n) |
| Binomial Heap | – | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Fibonacci Heap | – | O(1) | O(log(n))\* | O(1)\* | O(1) | O(log(n))\* | O(1) |

\* Amortized time.
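The Binary Heap column can be exercised directly with Python's standard `heapq` module. Note that `heapq` is a min-heap, so for the table's max-heap operations you would push negated keys; the asymptotics are identical:

```python
import heapq


def heap_demo(values):
    """Binary-heap operations from the table, via heapq (a min-heap).
    heapify: O(n); peek at h[0]: O(1); push/pop: O(log(n))."""
    h = list(values)
    heapq.heapify(h)           # build-heap in O(n), the Heapify row
    smallest = h[0]            # find-min, O(1)
    heapq.heappush(h, -1)      # insert, O(log(n))
    popped = heapq.heappop(h)  # extract-min, O(log(n))
    return smallest, popped
```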

## Graphs

Node / Edge Management:

| Operation | Adjacency list | Incidence list | Adjacency matrix | Incidence matrix |
|---|---|---|---|---|
| Storage | O(\|V\|+\|E\|) | O(\|V\|+\|E\|) | O(\|V\|^2) | O(\|V\| \|E\|) |
| Add Vertex | O(1) | O(1) | O(\|V\|^2) | O(\|V\| \|E\|) |
| Add Edge | O(1) | O(1) | O(1) | O(\|V\| \|E\|) |
| Remove Vertex | O(\|V\|+\|E\|) | O(\|E\|) | O(\|V\|^2) | O(\|V\| \|E\|) |
| Remove Edge | O(\|E\|) | O(\|E\|) | O(1) | O(\|V\| \|E\|) |
| Query | O(\|V\|) | O(\|E\|) | O(1) | O(\|E\|) |
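A small sketch of the storage/query trade-off between the two most common columns, for an undirected graph (the function name and encodings are illustrative):

```python
def build_representations(n, edges):
    """Build an adjacency list and an adjacency matrix for the same
    undirected graph on vertices 0..n-1.
    Adjacency list: O(|V|+|E|) storage, edge query scans neighbors.
    Adjacency matrix: O(|V|^2) storage, O(1) edge query."""
    adj = {v: [] for v in range(n)}
    matrix = [[False] * n for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
        matrix[u][v] = matrix[v][u] = True
    return adj, matrix
```

Sparse graphs favor the list (less storage, fast iteration over neighbors); dense graphs or query-heavy workloads favor the matrix.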

## Notation for asymptotic growth

| letter | bound | growth |
|---|---|---|
| Θ (theta) | upper and lower, tight [1] | equal [2] |
| O (big-oh) | upper, tightness unknown | less than or equal [3] |
| o (small-oh) | upper, not tight | less than |
| Ω (big omega) | lower, tightness unknown | greater than or equal |
| ω (small omega) | lower, not tight | greater than |

[1] Big O is the upper bound, while Omega is the lower bound. Theta requires both Big O and Omega, which is why it is referred to as a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n (Omega(n log n)) and no more than n log n (O(n log n)). SO

[2] f(x) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(x) is asymptotically proportional to g(n).

[3] Same thing. Here the growth rate is no faster than g(n). Big-oh is the most useful notation because it represents the worst-case behavior.

In short, if an algorithm is __ then its performance is __:

| algorithm | performance |
|---|---|
| o(n) | < n |
| O(n) | ≤ n |
| Θ(n) | = n |
| Ω(n) | ≥ n |
| ω(n) | > n |
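The "tight bound" idea in note [1] can be checked empirically by counting operations rather than timing. A minimal sketch (the function name is illustrative): a doubly nested loop performs exactly n·n basic steps, so the count is Θ(n^2) — both O(n^2) and Ω(n^2) — and doubling n quadruples it:

```python
def ops_quadratic(n):
    """Count the basic operations of a doubly nested loop.
    The count is exactly n * n, hence Theta(n^2): the same n^2
    serves as both the upper (O) and lower (Omega) bound."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count
```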