## Friday, 28 June 2013

### Complexities: a bird's-eye view


## Searching

| Algorithm | Data Structure | Time: Average | Time: Worst | Space: Worst |
| --- | --- | --- | --- | --- |
| Depth First Search (DFS) | Graph of \|V\| vertices and \|E\| edges | `-` | `O(\|E\| + \|V\|)` | `O(\|V\|)` |
| Breadth First Search (BFS) | Graph of \|V\| vertices and \|E\| edges | `-` | `O(\|E\| + \|V\|)` | `O(\|V\|)` |
| Binary search | Sorted array of n elements | `O(log(n))` | `O(log(n))` | `O(1)` |
| Linear (Brute Force) | Array | `O(n)` | `O(n)` | `O(1)` |
| Shortest path by Dijkstra, using a min-heap as priority queue | Graph with \|V\| vertices and \|E\| edges | `O((\|V\| + \|E\|) log \|V\|)` | `O((\|V\| + \|E\|) log \|V\|)` | `O(\|V\|)` |
| Shortest path by Dijkstra, using an unsorted array as priority queue | Graph with \|V\| vertices and \|E\| edges | `O(\|V\|^2)` | `O(\|V\|^2)` | `O(\|V\|)` |
| Shortest path by Bellman-Ford | Graph with \|V\| vertices and \|E\| edges | `O(\|V\|\|E\|)` | `O(\|V\|\|E\|)` | `O(\|V\|)` |
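To make the `O(log(n))` / `O(1)` row for binary search concrete, here is a minimal Python sketch (the function name and sample values are my own, chosen for illustration):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent.

    Each iteration halves the search interval, which gives the
    O(log(n)) time and O(1) extra space from the table above."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Note that the `O(log(n))` bound only holds because the input is sorted; on an unsorted array you are back to the linear scan in the next row.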

## Sorting

| Algorithm | Data Structure | Time: Best | Time: Average | Time: Worst | Worst-Case Auxiliary Space |
| --- | --- | --- | --- | --- | --- |
| Quicksort | Array | `O(n log(n))` | `O(n log(n))` | `O(n^2)` | `O(n)` |
| Mergesort | Array | `O(n log(n))` | `O(n log(n))` | `O(n log(n))` | `O(n)` |
| Heapsort | Array | `O(n log(n))` | `O(n log(n))` | `O(n log(n))` | `O(1)` |
| Bubble Sort | Array | `O(n)` | `O(n^2)` | `O(n^2)` | `O(1)` |
| Insertion Sort | Array | `O(n)` | `O(n^2)` | `O(n^2)` | `O(1)` |
| Selection Sort | Array | `O(n^2)` | `O(n^2)` | `O(n^2)` | `O(1)` |
| Bucket Sort | Array | `O(n+k)` | `O(n+k)` | `O(n^2)` | `O(nk)` |
| Radix Sort | Array | `O(nk)` | `O(nk)` | `O(nk)` | `O(n+k)` |
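As a sketch of why mergesort is `O(n log(n))` in every case with `O(n)` auxiliary space: the array is split in half (about log n levels of recursion) and each level does O(n) merging work into temporary lists. A minimal top-down version in Python (my own illustrative code):

```python
def merge_sort(items):
    """Sort a list, returning a new sorted list.

    log n levels of splitting, O(n) merge work per level -> O(n log n);
    the temporary `merged` lists account for the O(n) auxiliary space."""
    if len(items) <= 1:
        return items[:]
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Using `<=` in the merge keeps equal elements in their original order, so this sketch is also stable.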

## Heaps

Time complexity per operation:

| Heap | Heapify | Find Max | Extract Max | Increase Key | Insert | Delete | Merge |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Linked List (sorted) | `-` | `O(1)` | `O(1)` | `O(n)` | `O(n)` | `O(1)` | `O(m+n)` |
| Linked List (unsorted) | `-` | `O(n)` | `O(n)` | `O(1)` | `O(1)` | `O(1)` | `O(1)` |
| Binary Heap | `O(n)` | `O(1)` | `O(log(n))` | `O(log(n))` | `O(log(n))` | `O(log(n))` | `O(m+n)` |
| Binomial Heap | `-` | `O(log(n))` | `O(log(n))` | `O(log(n))` | `O(log(n))` | `O(log(n))` | `O(log(n))` |
| Fibonacci Heap | `-` | `O(1)` | `O(log(n))`\* | `O(1)`\* | `O(1)` | `O(log(n))`\* | `O(1)` |

\* amortized
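One way to exercise the binary-heap row is Python's standard-library `heapq`, which implements a binary min-heap on a plain list. Negating the keys is a common workaround (my choice here, not part of the table) to get the max-heap operations listed:

```python
import heapq

# heapq is a binary min-heap; storing negated keys simulates a max-heap.
data = [3, 1, 4, 1, 5, 9, 2, 6]
heap = [-x for x in data]
heapq.heapify(heap)                  # Heapify: O(n)

find_max = -heap[0]                  # Find Max: O(1), peek at the root
extract_max = -heapq.heappop(heap)   # Extract Max: O(log(n))
heapq.heappush(heap, -7)             # Insert: O(log(n))
```

Merge in the table is `O(m+n)` for binary heaps: concatenating two backing lists and re-running `heapify` touches every element.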

## Graphs

| Data Structure | Storage | Add Vertex | Add Edge | Remove Vertex | Remove Edge | Query |
| --- | --- | --- | --- | --- | --- | --- |
| Adjacency list | `O(\|V\|+\|E\|)` | `O(1)` | `O(1)` | `O(\|V\| + \|E\|)` | `O(\|E\|)` | `O(\|V\|)` |
| Incidence list | `O(\|V\|+\|E\|)` | `O(1)` | `O(1)` | `O(\|E\|)` | `O(\|E\|)` | `O(\|E\|)` |
| Adjacency matrix | `O(\|V\|^2)` | `O(\|V\|^2)` | `O(1)` | `O(\|V\|^2)` | `O(1)` | `O(1)` |
| Incidence matrix | `O(\|V\| ⋅ \|E\|)` | `O(\|V\| ⋅ \|E\|)` | `O(\|V\| ⋅ \|E\|)` | `O(\|V\| ⋅ \|E\|)` | `O(\|V\| ⋅ \|E\|)` | `O(\|E\|)` |
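A minimal adjacency-list sketch in Python (my own illustrative code; it backs each vertex with a set, so the edge query is O(1) on average rather than the O(|V|) worst case the table lists for plain linked lists):

```python
from collections import defaultdict

# Undirected graph as an adjacency list: storage O(|V| + |E|).
graph = defaultdict(set)

def add_edge(u, v):
    """Add Edge: O(1) — record each endpoint in the other's set."""
    graph[u].add(v)
    graph[v].add(u)

def has_edge(u, v):
    """Query: average O(1) with sets (O(|V|) with plain lists)."""
    return v in graph[u]

add_edge("a", "b")
add_edge("b", "c")
```

The trade-off against the adjacency matrix is visible here: the matrix answers `has_edge` in guaranteed O(1) but pays `O(|V|^2)` storage even for sparse graphs.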

## Notation for asymptotic growth

| letter | bound | growth |
| --- | --- | --- |
| Θ (theta) | upper and lower, tight [1] | equal [2] |
| O (big-oh) | upper, tightness unknown | less than or equal [3] |
| o (small-oh) | upper, not tight | less than |
| Ω (big-omega) | lower, tightness unknown | greater than or equal |
| ω (small-omega) | lower, not tight | greater than |

[1] Big O is an upper bound, while Omega is a lower bound. Theta requires both Big O and Omega, which is why it is called a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Ω(n log n) takes at least n log n time but has no upper limit. An algorithm taking Θ(n log n) is far preferable, since it takes at least n log n (Ω(n log n)) and no more than n log n (O(n log n)).

[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(n) is asymptotically proportional to g(n).

[3] Same idea, but here the growth rate is no faster than g(n). Big-oh is the most useful notation because it represents worst-case behavior.
In short, if an algorithm is __ then its performance is __:

| algorithm | performance |
| --- | --- |
| o(n) | < n |
| O(n) | ≤ n |
| Θ(n) | = n |
| Ω(n) | ≥ n |
| ω(n) | > n |
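The Θ definition can be sanity-checked numerically. Taking the hypothetical running time f(n) = 2n² + 3n (my example, not from the table), f(n) = Θ(n²) because constants c1 = 2 and c2 = 3 sandwich it for all n ≥ 3:

```python
# Numeric check that f(n) = 2n^2 + 3n is Θ(n^2):
# for c1 = 2, c2 = 3 and every n >= 3,
#   c1 * n^2 <= f(n) <= c2 * n^2.
def f(n):
    return 2 * n**2 + 3 * n

c1, c2 = 2, 3
ok = all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(3, 10_000))
```

The lower inequality is Ω(n²), the upper one is O(n²); holding simultaneously is exactly what makes the bound tight.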