- A heap is a binary tree H that stores a collection of keys at its internal nodes and satisfies two additional properties:
- Heap-Order Property
- Complete Binary Tree Property
- Heap-Order Property (relational): In a heap H, for every node v except the root, the key stored in v is greater than or equal to the key stored in v’s parent.
- Complete Binary Tree Property (structural): A binary tree H is complete if each level but the last is full and, in level h-1, all of the internal nodes are to the left of the external nodes.
- In a complete binary tree of height h, levels i = 0, 1, …, h-1 each have 2^i nodes, the maximum possible number of nodes for those levels.
- An example
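To make the two properties concrete, here is a hedged Java sketch (the HeapOrderCheck class and isMinHeap method are illustrative names, not from the original notes): it takes the keys of a heap listed level by level, left to right, and checks that every key is greater than or equal to its parent's key.

```java
import java.util.List;

// Hedged sketch: the keys of a heap listed in level order satisfy the
// heap-order property exactly when every key is >= the key of its parent.
public class HeapOrderCheck {
    static boolean isMinHeap(List<Integer> levelOrderKeys) {
        for (int v = 1; v < levelOrderKeys.size(); v++) {
            int parent = (v - 1) / 2;          // parent of position v in level order
            if (levelOrderKeys.get(v) < levelOrderKeys.get(parent))
                return false;                  // child smaller than parent: violation
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isMinHeap(List.of(4, 5, 6, 15, 9, 7, 20)));  // true
        System.out.println(isMinHeap(List.of(4, 3, 6, 15, 9, 7, 20)));  // false: 3 < 4
    }
}
```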
Height of a Heap
- Proposition: A heap H storing n keys has height h = ⌈log(n + 1)⌉.
- Justification: Because H is complete, we know:
- The number of internal nodes n is at least 1 + 2 + 4 + … + 2^(h-2) + 1 = 2^(h-1).
- The number of internal nodes n is at most 1 + 2 + 4 + … + 2^(h-1) = 2^h - 1.
- Therefore, 2^(h-1) ≤ n ≤ 2^h - 1,
- or, taking logarithms, log(n + 1) ≤ h ≤ log n + 1,
- which in turn implies h = ⌈log(n + 1)⌉.
Height of a Heap (cont.)
- Let’s look at that graphically:
- Consider this heap, which has height h = 4 and n = 13.
- Suppose that two more nodes are added. To maintain completeness of the tree, the two external nodes in level 3 (= h - 1) become internal nodes, i.e., n = 15, h = 4 = log(15 + 1).
- Add one more node: now n = 16 and h = 5 = ⌈log(16 + 1)⌉.
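The heights above can be checked numerically. The helper below is a hypothetical illustration of the formula h = ⌈log(n + 1)⌉, not part of the original code.

```java
public class HeapHeight {
    // Smallest h with 2^h >= n + 1, i.e. h = ceil(log2(n + 1)).
    static int heapHeight(int n) {
        int h = 0;
        while ((1 << h) < n + 1) h++;
        return h;
    }

    public static void main(String[] args) {
        System.out.println(heapHeight(13));  // 4
        System.out.println(heapHeight(15));  // 4
        System.out.println(heapHeight(16));  // 5
    }
}
```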
Implementation of a Heap
public class HeapSimplePriorityQueue implements SimplePriorityQueue {
    BinaryTree T;           // complete binary tree storing the key-element pairs
    Position last;          // position of the last (bottom-right-most) internal node
    Comparator comparator;  // comparator defining the total order on the keys
    …
}
Implementation of a Heap (cont.)
- Two cases arise when finding the insertion position z (the first external node in level order) in a heap; both are handled by the algorithm below:
- Special case: the last position w is the right-most node in its level, so z is the left-most position of the next level.
- General case: z is in the same level as w.
Finding the Insertion Position
- Algorithm findInsertionPosition(w)
  Input: last position w
  Output: insertion position z
  if T is empty
    z ← root
  else
    u ← w (the last position)
    while u is neither the root nor a left child
      u ← parent of u
    if u is a left child
      u ← sibling of u
    while u is not external
      u ← left child of u
    z ← u
  return z
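A hedged Java sketch of the same navigation on a linked binary tree whose leaves are placeholder external nodes; the Node class and its helper methods are illustrative assumptions, not the book's interfaces.

```java
// Hedged sketch: findInsertionPosition on a linked binary tree with
// placeholder "external" nodes. Class and method names are illustrative.
class Node {
    Integer key;                 // null marks an external (placeholder) node
    Node parent, left, right;

    boolean isExternal()  { return key == null; }
    boolean isRoot()      { return parent == null; }
    boolean isLeftChild() { return parent != null && parent.left == this; }
    Node sibling()        { return isLeftChild() ? parent.right : parent.left; }
}

class InsertionPositionFinder {
    // Returns the external node where the next key will be stored, given the
    // current last (bottom-right-most internal) position w.
    static Node findInsertionPosition(Node w) {
        if (w.isExternal()) return w;          // empty heap: insert at the root
        Node u = w;
        // climb while u is a right child, i.e. neither the root nor a left child
        while (!u.isRoot() && !u.isLeftChild())
            u = u.parent;
        if (u.isLeftChild())
            u = u.sibling();                   // step over to the right sibling
        // descend along left children to the first external position
        while (!u.isExternal())
            u = u.left;
        return u;
    }
}
```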
- Algorithm upHeapBubble(z)
  z.setElement(key-element pair to be inserted)
  done ← false
  while not done
    if z = root
      done ← true
    else
      u ← parent of z
      if key(z) < key(u)
        swap the key-element pairs of u and z
        z ← u
      else
        done ← true
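On the sequence-based (array) representation discussed at the end of these notes, up-heap bubbling reduces to index arithmetic. Below is a hedged sketch (the ArrayMinHeap class and its method names are illustrative): the new key is appended at the next free cell and swapped upward while it is smaller than its parent, mirroring the pseudocode above.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of up-heap bubbling on the array (sequence) representation.
class ArrayMinHeap {
    private final List<Integer> a = new ArrayList<>();  // keys in level order

    void insert(int key) {
        a.add(key);                     // insertion position = next free cell
        int z = a.size() - 1;
        while (z > 0) {                 // stop at the root
            int u = (z - 1) / 2;        // parent index
            if (a.get(z) < a.get(u)) {  // heap-order violated: swap and continue up
                swap(z, u);
                z = u;
            } else {
                break;                  // heap-order restored
            }
        }
    }

    private void swap(int i, int j) {
        int t = a.get(i); a.set(i, a.get(j)); a.set(j, t);
    }
}
```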
Insertion into a Heap
Down-Heap Bubbling
- Algorithm downHeapBubble(r)
  done ← false
  while not done
    if both children of r are external
      done ← true
    else
      if the right child of r is external
        s ← left child of r
      else
        s ← child of r with the smaller key
      if key(s) < key(r)
        swap the key-element pairs of r and s
        r ← s
      else
        done ← true
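The same idea on the array representation; a hedged sketch with illustrative names, following the pseudocode above (choose the smaller child, swap while heap-order is violated).

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hedged sketch of down-heap bubbling on the array (sequence) representation.
class DownHeap {
    // Restores heap-order below index r in a heap of the given size.
    static void downHeapBubble(List<Integer> a, int r, int size) {
        while (true) {
            int left = 2 * r + 1, right = 2 * r + 2;
            if (left >= size) return;            // both children are "external"
            int s = left;                        // assume left child is smaller
            if (right < size && a.get(right) < a.get(left))
                s = right;                       // right child has the smaller key
            if (a.get(s) < a.get(r)) {           // heap-order violated: swap, go down
                int t = a.get(r); a.set(r, a.get(s)); a.set(s, t);
                r = s;
            } else {
                return;                          // heap-order holds: done
            }
        }
    }

    public static void main(String[] args) {
        // Example: 99 was just moved to the root (as in remove-min).
        List<Integer> heap = new ArrayList<>(Arrays.asList(99, 5, 6, 12, 8, 11, 7));
        downHeapBubble(heap, 0, heap.size());
        System.out.println(heap);  // a valid min-heap again: [5, 8, 6, 12, 99, 11, 7]
    }
}
```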
- Algorithm findLastPosition(w)
  Input: w, the old last position, about to become the insertion position
  Output: the new last position
  if w = root
    return w
  while w is neither the root nor a right child
    w ← parent of w
  if w is a right child
    w ← sibling of w
  while the right child of w is not external
    w ← right child of w
  return w
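A hedged sketch of findLastPosition, reusing the illustrative Node class from the findInsertionPosition sketch above; it mirrors that method with the roles of left and right exchanged.

```java
// Hedged sketch of findLastPosition on the linked representation.
class LastPositionFinder {
    static Node findLastPosition(Node w) {
        if (w.isRoot()) return w;              // removing the only key: last = root
        // climb while w is a left child, i.e. neither the root nor a right child
        while (!w.isRoot() && w.isLeftChild())
            w = w.parent;
        // if w is a right child, step over to its left sibling
        if (!w.isRoot() && !w.isLeftChild())
            w = w.sibling();
        // descend along right children while the right child is internal
        while (!w.right.isExternal())
            w = w.right;
        return w;
    }
}
```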
Removal from a Heap
| Method | Unsorted Sequence | Sorted Sequence | Heap |
|---|---|---|---|
| size, isEmpty | O(1) | O(1) | O(1) |
| minElement, minKey | Θ(n) | O(1) | O(1) |
| insertItem | O(1) | O(n) | O(log n) |
| removeMinElement | Θ(n) | O(1) | O(log n) |
- Height of heap: h = O(log n)
- upHeapBubble ∈ O(h) = O(log n)
- downHeapBubble ∈ O(h) = O(log n)
- findInsertionPosition ∈ O(2h) = O(2 log n) = O(log n)
- findLastPosition ∈ O(2h) = O(2 log n) = O(log n)
- All heap methods run in logarithmic time or better.
- If we implement PriorityQueueSort using a heap for our priority queue, insertItem and removeMinElement each take O(log k) time, where k is the number of elements in the heap at the time of the operation.
- We always have n or fewer elements in the heap, so the worst-case time complexity of these methods is O(log n).
- Thus each phase takes O(n log n) time, so the overall algorithm runs in O(n log n) + O(n log n) = O(n log n) time.
- This algorithm is known as heap-sort.
- The O(n log n) run-time performance of heap-sort is much better than the O(n^2) run-time performance of selection sort and insertion sort.
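As an illustration of the two phases, the sketch below sorts the example key sequence used later in these notes with java.util.PriorityQueue, a heap-based priority queue from the standard library (the class name HeapSortDemo is ours).

```java
import java.util.Arrays;
import java.util.PriorityQueue;

// Hedged illustration of PriorityQueueSort with a heap-based priority queue.
// Phase 1 inserts all n keys (O(n log n)); phase 2 removes the minimum n times (O(n log n)).
public class HeapSortDemo {
    public static void main(String[] args) {
        int[] keys = {14, 9, 25, 16, 15, 5, 4, 12, 8, 11, 6, 7, 27, 23, 20};

        PriorityQueue<Integer> pq = new PriorityQueue<>();
        for (int k : keys) pq.offer(k);              // phase 1: insertItem

        int[] sorted = new int[keys.length];
        for (int i = 0; i < sorted.length; i++)
            sorted[i] = pq.poll();                   // phase 2: removeMinElement

        System.out.println(Arrays.toString(sorted)); // keys in nondecreasing order
    }
}
```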
- If all the keys to be stored are given in advance we can build a heap bottom-up in O(n) time.
- Note: For simplicity we shall describe bottom-up heap construction for the case n = 2^h - 1, i.e., the case in which every level of internal nodes is completely full.
- Steps:
- Construct (n+1)/2 elementary heaps composed of one key each.
- Construct (n+1)/4 heaps, each with 3 keys, by joining pairs of elementary heaps and adding a new key as the root. The new key may be swapped with a child in order to preserve the heap-order property.
- Construct (n+1)/8 heaps, each with 7 keys, by joining pairs of 3-key heaps and adding a new key. Again, swaps may occur.
- In general, in the i-th step, 2 ≤ i ≤ h, we form (n+1)/2^i heaps, each storing 2^i - 1 keys, by joining pairs of heaps storing 2^(i-1) - 1 keys and adding a new key. Swaps may occur.
- Algorithm bottomUpHeap(S)
  Input: a sequence S storing n = 2^h - 1 keys
  Output: a heap H storing the keys in S
  if S is empty
    return an empty heap (a single external node)
  remove the first key, k, from S
  split S into subsequences S1 and S2, each of size (n - 1)/2
  H1 ← bottomUpHeap(S1)
  H2 ← bottomUpHeap(S2)
  create a binary tree H with k at the root, H1 as the left subtree, and H2 as the right subtree
  perform down-heap bubbling from the root of H, if necessary
  return H
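A hedged Java sketch of the recursive construction above on a small linked-node tree; TreeNode, build, and downHeap are illustrative names, the input size must be of the form 2^h - 1 as in the note above, and the down-heap step is the linked-tree analogue of the downHeapBubble pseudocode.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;

// Hedged sketch of recursive bottom-up heap construction for n = 2^h - 1 keys.
class TreeNode {
    int key;
    TreeNode left, right;        // null means "external"
    TreeNode(int key) { this.key = key; }
}

class BottomUpHeap {
    // Builds a heap from the first n keys of the deque (n must be 2^h - 1).
    static TreeNode build(Deque<Integer> s, int n) {
        if (n == 0) return null;                  // empty heap
        int k = s.removeFirst();                  // first key becomes the root
        TreeNode h = new TreeNode(k);
        h.left = build(s, (n - 1) / 2);           // heaps for the two halves
        h.right = build(s, (n - 1) / 2);
        downHeap(h);                              // restore heap-order at the root
        return h;
    }

    // Down-heap bubbling on the linked representation.
    static void downHeap(TreeNode r) {
        while (r != null) {
            TreeNode s = r.left;                                      // smaller child
            if (r.right != null && (s == null || r.right.key < s.key)) s = r.right;
            if (s == null || s.key >= r.key) return;                  // heap-order holds
            int t = r.key; r.key = s.key; s.key = t;                  // swap keys
            r = s;
        }
    }

    public static void main(String[] args) {
        Deque<Integer> keys = new ArrayDeque<>(Arrays.asList(
            14, 9, 25, 16, 15, 5, 4, 12, 8, 11, 6, 7, 27, 23, 20));   // n = 15
        TreeNode heap = build(keys, 15);
        System.out.println(heap.key);   // 4, the minimum key, ends up at the root
    }
}
```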
Original sequence in the example (n=15)
14,9,25,16,15,5,4,12,8,11,6,7,27,23,20
Remove first key
14,{9,25,16,15,5,4,12,8,11,6,7,27,23,20}
Enter next level in recursion (n=7)
14,{9,[25,16,15,5,4,12],8,[11,6,7,27,23,20]}
Enter next level in recursion (n=3)
14,{9,[25,(16,15),5,(4,12)],8,[11,(6,7),27,(23,20)]}
Trivial binary trees, n=1
16 15 4 12 6 7 23 20
Bottom-Up Heap Construction (cont.)
Analysis of Bottom-Up Heap Construction
- Proposition: Bottom-up heap construction with n keys takes O(n) time.
- At level h-1 (height 1), we perform (n+1)/2 insertions, each of worst-case cost O(1).
- At level h-2 (height 2), we perform (n+1)/4 insertions, each of worst-case cost O(2), i.e., O(1) for the insertion and O(1) for down-heap bubbling one level.
- …
- At height h (the root) we perform 1 insertion, where at most h-1 levels are involved in the down-heap bubbling.
- One worst-case realization: each newly inserted key performs its down-heap bubbling along the path toward its inorder successor (one step right, then repeatedly left until an external node is reached).
- Every edge in the tree is used at most once by such an inorder-successor path, which assures that the total number of swaps is bounded above by the number of edges.
- Recall that for a binary tree the total number of nodes is n_e + n_i (external plus internal), so the number of edges is e = n_e + n_i - 1.
- But since in a heap n equals the number of key-element pairs, i.e., n = n_i and n_e = n_i + 1, we therefore have e = (n + 1) + n - 1 = 2n.
- The complexity of bottom-up heap construction is thus O(2n) = O(n).
Sequence-based Implementation of a Binary Tree (cont.)
Best case (space): a complete binary tree fills the array with no gaps, so N = O(n).
Worst case (space): a highly skewed tree leaves most of the array cells unused, so N can be much larger than n.
Summary
Implications for heaps:
1. The complete binary tree property implies that a sequence-based heap is space-efficient: the array size is N = O(n).
2. The algorithms findInsertionPosition and findLastPosition become O(1), since the insertion position is simply the next free cell and the last position is the last occupied cell (see the sketch below).
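Both implications follow from simple index arithmetic on the level-order array. A hedged sketch using 0-based indices (the class and method names are illustrative):

```java
// Hedged sketch of the sequence-based (array) heap layout: positions are array
// indices in level order, so the structure needs only N = O(n) cells and the
// insertion/last positions are found in O(1) time.
class SequenceHeapIndexing {
    static int parent(int i)     { return (i - 1) / 2; }
    static int leftChild(int i)  { return 2 * i + 1; }
    static int rightChild(int i) { return 2 * i + 2; }

    // With n keys stored at indices 0..n-1:
    static int lastPosition(int n)      { return n - 1; }  // findLastPosition in O(1)
    static int insertionPosition(int n) { return n; }      // findInsertionPosition in O(1)
}
```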