"""Heap queue algorithm (a.k.a. priority queue).

Heaps are arrays for which a[k] <= a[2*k+1] and a[k] <= a[2*k+2] for
all k, counting elements from 0.  For the sake of comparison,
non-existing elements are considered to be infinite.  The interesting
property of a heap is that a[0] is always its smallest element.

Usage:

heap = []            # creates an empty heap
heappush(heap, item) # pushes a new item on the heap
item = heappop(heap) # pops the smallest item from the heap
item = heap[0]       # smallest item on the heap without popping it

Our API differs from textbook heap algorithms as follows:

- We use 0-based indexing.  This makes the relationship between the
  index for a node and the indexes for its children slightly less
  obvious, but is more suitable since Python uses 0-based indexing.

- Our heappop() method returns the smallest item, not the largest.

These two make it possible to view the heap as a regular Python list
without surprises: heap[0] is the smallest item, and heap.sort()
maintains the heap invariant!
"""

__about__ = """Heap queues

[explanation by François Pinard]

Heaps are arrays for which a[k] <= a[2*k+1] and a[k] <= a[2*k+2] for
all k, counting elements from 0.  For the sake of comparison,
non-existing elements are considered to be infinite.  The interesting
property of a heap is that a[0] is always its smallest element.

The strange invariant above is meant to be an efficient memory
representation for a tournament.  The numbers below are `k', not a[k]:

                                   0

                  1                                 2

          3               4                5               6

      7       8       9       10      11      12      13      14

    15 16   17 18   19 20   21 22   23 24   25 26   27 28   29 30


In the tree above, each cell `k' is topping `2*k+1' and `2*k+2'.  In
a usual binary tournament we see in sports, each cell is the winner
over the two cells it tops, and we can trace the winner down the tree
to see all opponents s/he had.  However, in many computer applications
of such tournaments, we do not need to trace the history of a winner.
To be more memory efficient, when a winner is promoted, we try to
replace it by something else at a lower level, and the rule becomes
that a cell and the two cells it tops contain three different items,
but the top cell "wins" over the two topped cells.

If this heap invariant is protected at all times, index 0 is clearly
the overall winner.  The simplest algorithmic way to remove it and
find the "next" winner is to move some loser (let's say cell 30 in the
diagram above) into the 0 position, and then percolate this new 0 down
the tree, exchanging values, until the invariant is re-established.
This is clearly logarithmic in the total number of items in the tree.
By iterating over all items, you get an O(n ln n) sort.

A nice feature of this sort is that you can efficiently insert new
items while the sort is going on, provided that the inserted items are
not "better" than the last 0'th element you extracted.  This is
especially useful in simulation contexts, where the tree holds all
incoming events, and the "win" condition means the smallest scheduled
time.  When an event schedules other events for execution, they are
scheduled into the future, so they can easily go into the heap.  So, a
heap is a good structure for implementing schedulers (this is what I
used for my MIDI sequencer :-).

Various structures for implementing schedulers have been extensively
studied, and heaps are good for this, as they are reasonably speedy,
the speed is almost constant, and the worst case is not much different
from the average case.  However, there are other representations which
are more efficient overall, yet the worst cases might be terrible.

Heaps are also very useful in big disk sorts.  You most probably all
know that a big sort implies producing "runs" (which are pre-sorted
sequences, whose size is usually related to the amount of CPU memory),
followed by merging passes for these runs, and the merging is often
very cleverly organised[1].  It is very important that the initial
sort produces the longest runs possible.  Tournaments are a good way
to achieve that.  If, using all the memory available to hold a
tournament, you replace and percolate items that happen to fit the
current run, you'll produce runs which are twice the size of the
memory for random input, and much longer for fuzzily ordered input.

Moreover, if you output the 0'th item on disk and get an input which
may not fit in the current tournament (because the value "wins" over
the last output value), it cannot fit in the heap, so the size of the
heap decreases.  The freed memory could be cleverly reused immediately
for progressively building a second heap, which grows at exactly the
same rate the first heap is melting.  When the first heap completely
vanishes, you switch heaps and start a new run.  Clever and quite
effective!

In a word, heaps are useful memory structures to know.  I use them in
a few applications, and I think it is good to keep a `heap' module
around. :-)

--------------------
[1] The disk balancing algorithms which are current, nowadays, are
more annoying than clever, and this is a consequence of the seeking
capabilities of the disks.  On devices which cannot seek, like big
tape drives, the story was quite different, and one had to be very
clever to ensure (far in advance) that each tape movement would be the
most effective possible (that is, would best contribute to
"progressing" the merge).  Some tapes were even able to read
backwards, and this was also used to avoid the rewinding time.
Believe me, real good tape sorts were quite spectacular to watch!
From all times, sorting has always been a Great Art! :-)
"""

def heappush(heap, item):
    """Push item onto heap, maintaining the heap invariant."""
    pos = len(heap)
    heap.append(None)           # make room; the hole starts at the new leaf
    # Sift the hole up toward the root until the parent is <= item.
    while pos:
        parentpos = (pos - 1) / 2
        parent = heap[parentpos]
        if item >= parent:
            break
        heap[pos] = parent      # move the larger parent down into the hole
        pos = parentpos
    heap[pos] = item            # the hole is where the new item belongs

def heappop(heap):
    """Pop the smallest item off the heap, maintaining the heap invariant."""
    endpos = len(heap) - 1
    if endpos <= 0:
        # Zero or one item; pop() does the job (and raises IndexError
        # on an empty heap).
        return heap.pop()
    returnitem = heap[0]
    item = heap.pop()           # last leaf; sift it down from the root
    pos = 0
    while 1:
        child2pos = (pos + 1) * 2
        child1pos = child2pos - 1
        if child2pos < endpos:
            # Both children exist.
            child1 = heap[child1pos]
            child2 = heap[child2pos]
            if item <= child1 and item <= child2:
                break
            # Promote the smaller child and keep sifting down.
            if child1 < child2:
                heap[pos] = child1
                pos = child1pos
                continue
            heap[pos] = child2
            pos = child2pos
            continue
        if child1pos < endpos:
            # Only the left child exists.
            child1 = heap[child1pos]
            if child1 < item:
                heap[pos] = child1
                pos = child1pos
        break
    heap[pos] = item
    return returnitem
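
# The function below is an illustrative sketch, not part of the original
# module: it spells out the "replacement selection" run generation that
# the __about__ text describes for big disk sorts.  An item that still
# fits the current run goes back into the tournament; an item that is
# too small is held aside for the next run.  The name _sorted_runs and
# the `memory' parameter are assumptions made for illustration only.

def _sorted_runs(iterable, memory=8):
    """Split iterable into sorted runs, using at most `memory' held items."""
    runs = []                   # finished runs, each one already sorted
    run = []                    # the run currently being written out
    heap = []                   # the tournament
    nextrun = []                # items frozen out of the current run
    for item in iterable:
        if len(heap) < memory and not nextrun:
            heappush(heap, item)          # still filling the tournament
            continue
        smallest = heappop(heap)
        run.append(smallest)              # "output the 0'th item"
        if item < smallest:
            nextrun.append(item)          # too small for this run; freeze it
        else:
            heappush(heap, item)          # it still fits the current run
        if not heap:
            # The tournament has melted away: close this run and rebuild
            # the heap from the frozen items to start the next run.
            runs.append(run)
            run = []
            for held in nextrun:
                heappush(heap, held)
            nextrun = []
    # Drain whatever is left.
    while heap:
        run.append(heappop(heap))
    if run:
        runs.append(run)
    if nextrun:
        nextrun.sort()
        runs.append(nextrun)
    return runs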

if __name__ == "__main__":
    # Simple sanity test
    heap = []
    data = [1, 3, 5, 7, 9, 2, 4, 6, 8, 0]
    for item in data:
        heappush(heap, item)
    sort = []
    while heap:
        sort.append(heappop(heap))
    print sort
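
    # Illustrative addition, not part of the original test: the scheduler
    # pattern described in __about__.  Events pushed as (time, name) pairs
    # come back off the heap in time order; the event data here is made up.
    events = []
    for event in [(30, 'note off'), (10, 'note on'), (20, 'pitch bend')]:
        heappush(events, event)
    while events:
        print heappop(events)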