Why is a Fibonacci heap called a Fibonacci heap?
The Fibonacci heap is made up of a collection of smaller heap-ordered trees of different "orders" that obey certain structural constraints. The Fibonacci sequence arises because these trees are constructed in a way such that a tree of order n has at least Fn+2 nodes in it, where Fn+2 is the (n + 2)nd Fibonacci number.
To see why this result is true, let's begin by seeing how the trees in the Fibonacci heap are constructed. Initially, whenever a node is put into a Fibonacci heap, it is put into a tree of order 0 that contains just that node. Whenever a value is removed from the Fibonacci heap, some of the trees in the Fibonacci heap are coalesced together such that the number of trees doesn't grow too large.
When combining trees together, the Fibonacci heap only combines together trees of the same order. To combine two trees of order n into a tree of order n + 1, the Fibonacci heap takes whichever of the two trees has a greater root value than the other, then makes that tree a child of the other tree. One consequence of this fact is that trees of order n always have exactly n children.
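If it helps to see that linking step in code, here is a minimal Python sketch; the Node class and link function are just my own illustration, not code from any particular implementation.

    class Node:
        def __init__(self, key):
            self.key = key
            self.children = []       # subtrees hanging off this node

        @property
        def order(self):
            # The order of a tree is the number of children of its root.
            return len(self.children)

    def link(a, b):
        # Combine two trees of the same order into one tree of the next order:
        # the tree with the larger root key becomes a child of the other one,
        # which preserves the heap property.
        assert a.order == b.order
        if a.key > b.key:
            a, b = b, a              # keep the smaller root on top
        a.children.append(b)
        return a

    # Linking two order-0 trees produces an order-1 tree:
    print(link(Node(3), Node(7)).order)   # 1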
The main attraction of the Fibonacci heap is that it supports the decrease-key operation efficiently (in amortized O(1) time). In order to support this, the Fibonacci heap implements decrease-key as follows: to decrease the key of a value stored in some node, that node is cut from its parent tree and treated as its own separate tree. When this happens, the order of its old parent node is decreased by one. For example, if an order 4 tree has a child cut from it, it shrinks to an order 3 tree, which makes sense because the order of a tree is supposed to be the number of children it contains.
The problem with doing this is that if too many trees get cut off from the same tree, we might have a tree with a large order but which contains a very small number of nodes. The time guarantees of a Fibonacci heap are only possible if trees with large orders contain a huge number of nodes, and if we could just cut any nodes we'd like from trees, we could easily get into a situation where a tree with a huge order contains only a small number of nodes.
To address this, Fibonacci heaps make one requirement - if you cut two children from a tree, you have to in turn cut that tree from its parent. This means that the trees that form a Fibonacci heap won't be too badly damaged by decrease-key.
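In actual implementations this rule is usually enforced with a "marked" flag on each node: losing a first child marks the node, and losing a second child cuts it. Here is a rough Python sketch of that idea; the class, field, and method names are my own and the rest of the heap machinery is simplified away.

    class Node:
        def __init__(self, key):
            self.key = key
            self.parent = None
            self.children = []
            self.marked = False          # has this node already lost a child?

    class ToyFibHeap:
        def __init__(self):
            self.roots = []              # the forest of heap-ordered trees
            self.min = None

        def _cut(self, node):
            # Detach node from its parent and move it to the root list.
            node.parent.children.remove(node)
            node.parent = None
            node.marked = False
            self.roots.append(node)

        def _cascading_cut(self, parent):
            # Called after 'parent' has lost a child: mark it, or cut it too.
            if parent.parent is None:
                return                       # roots are never marked or cut
            if not parent.marked:
                parent.marked = True         # first child lost: just remember it
            else:
                grandparent = parent.parent
                self._cut(parent)            # second child lost: cut the parent too
                self._cascading_cut(grandparent)

        def decrease_key(self, node, new_key):
            node.key = new_key
            parent = node.parent
            if parent is not None and node.key < parent.key:
                self._cut(node)              # heap order violated: cut the node out
                self._cascading_cut(parent)  # and enforce the rule up the tree
            if self.min is None or node.key < self.min.key:
                self.min = node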
And now we can get to the part about Fibonacci numbers. At this point, we can say the following about the trees in a Fibonacci heap:
- A tree of order n has exactly n children.
- Trees of order n are formed by taking two trees of order n - 1 and making one the child of another.
- If a tree loses two children, that tree is cut away from its parent.
So now we can ask - what are the smallest possible trees that you can make under these assumptions?
Let's try out some examples. There is only one possible tree of order 0, which is just a single node:
Smallest possible order 0 tree: *
The smallest possible tree of order 1 has to be a node with at least one child, and the smallest child we could give it is the smallest possible order-0 tree, which is a single node. That gives us this tree:
Smallest possible order 1 tree:

    *
    |
    *
What about the smallest tree of order 2? This is where things get interesting. This tree certainly has to have two children, and it would be formed by merging together two trees of order 1. Consequently, the tree would initially have two children - a tree of order 0 and a tree of order 1. But remember - we can cut away children from trees after merging them! In this case, if we cut away the child of the tree of order 1, we would be left with a tree with two children, both of which are trees of order 0:
Smallest possible order 2 tree:

      *
     / \
    *   *
How about order 3? As before, this tree would be made by merging together two trees of order 2. We would then try to cut away as much of the subtrees of this order-3 tree as possible. When it's created, the tree has subtrees of orders 2, 1, and 0. We can't cut anything away from the order-0 tree, but we can cut a single child from each of the order-2 and order-1 trees. If we do this, we're left with a tree with three children, one of order 1 and two of order 0:
Smallest possible order 3 tree:

      *
     /|\
    * * *
    |
    *
Now we can spot a pattern. The smallest possible order-(n + 2) tree would be formed as follows: start by creating a normal order-(n + 2) tree, which has children of orders n + 1, n, n - 1, ..., 2, 1, 0. Then, make those subtrees as small as possible by cutting away nodes from them without cutting two children from the same node. This leaves a tree with children of orders n, n - 1, ..., 1, 0, and 0.
We can now write a recurrence relation to try to determine how many nodes are in these trees. If we do this, we get the following, where NC(n) represents the smallest number of nodes that could be in a tree of order n:
NC(0) = 1
NC(1) = 2
NC(n + 2) = NC(n) + NC(n - 1) + ... + NC(1) + NC(0) + NC(0) + 1
Here, the final +1 accounts for the root node itself.
If we expand out these terms, we get the following:
NC(0) = 1
NC(1) = 2
NC(2) = NC(0) + NC(0) + 1 = 3
NC(3) = NC(1) + NC(0) + NC(0) + 1 = 5
NC(4) = NC(2) + NC(1) + NC(0) + NC(0) + 1 = 8
If you'll notice, this is exactly the Fibonacci series offset by two positions. In other words, each of these trees has to have at least Fn+2 nodes in it, where Fn+2 is the (n + 2)nd Fibonacci number.
So why is it called a Fibonacci heap? Because each tree of order n has to have at least Fn+2 nodes in it!
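If you want to double-check this, here is a tiny Python script (my own, just for verification) that evaluates the NC recurrence above and compares it against the Fibonacci numbers:

    def NC(n):
        # Smallest possible number of nodes in a tree of order n,
        # using the recurrence derived above.
        if n == 0:
            return 1
        if n == 1:
            return 2
        # children of orders n-2, n-3, ..., 1, 0, plus one extra order-0 child,
        # plus the root itself
        return sum(NC(k) for k in range(n - 1)) + NC(0) + 1

    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a                     # fib(1) = 1, fib(2) = 1, fib(3) = 2, ...

    for n in range(10):
        assert NC(n) == fib(n + 2)
    print([NC(n) for n in range(10)])    # [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]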
If you're curious, the original paper on Fibonacci heaps has pictures of these smallest possible trees. It's pretty nifty to see! Also, check out this CS Theory Stack Exchange Post for an alternative explanation as to why Fibonacci heap trees have the sizes they do.
Hope this helps!
I want to give an intuitive explanation that I myself had an "Aha!" moment with.
Tree structures achieve O(log n) runtimes because they are able to store an exponential number of items in terms of their height. Binary trees can store 2^h items, ternary trees can store 3^h, and so on, with x^h as the generic case.
Can x be less than 2? Sure it can! As long as x > 1, we still achieve logarithmic runtimes and the ability to store an exponentially large number of items for a given height. But how do we build such a tree? The Fibonacci heap is a data structure where x ≈ 1.62, the golden ratio. Whenever we encounter the golden ratio, there are Fibonacci numbers lurking somewhere...
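To make that concrete, here is a tiny throwaway Python check (mine, not part of the argument) showing that consecutive Fibonacci numbers grow by a factor approaching the golden ratio, which is why the largest tree order stays logarithmic in the number of items:

    import math

    phi = (1 + math.sqrt(5)) / 2           # the golden ratio, ~1.618

    fibs = [1, 1]
    while len(fibs) < 30:
        fibs.append(fibs[-1] + fibs[-2])

    print(fibs[-1] / fibs[-2])             # ~1.618..., essentially phi already
    print(math.log(1_000_000, phi))        # roughly the largest tree order you
                                           # could see with a million items (~29)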
The Fibonacci heap is actually a forest of trees. After a process called "consolidation", each of these trees holds a distinct number of items, each an exact power of 2. For example, if your Fibonacci heap has 15 items, it would have 4 trees holding 1, 2, 4, and 8 items respectively.
Details of "consolidation" process is not relevant but in essence it basically keeps unioning trees in the forest of same degree until all trees have distinct degree (try a cool visualization to see how these trees get built). As you can write any n as sum of exact powers of 2, it's easy to imagine how consolidated trees for Fibonacci heap would look like for any n.
OK, so far we still don't see any direct connection to Fibonacci numbers. Where do they come into the picture? They actually appear in a very special case, and this is also key to why a Fibonacci heap can offer O(1) time for the DECREASE-KEY operation. When we decrease a key, if the new key is still larger than the parent's key, then we don't need to do anything else because the min-heap property is not violated. But if it isn't, then we can't leave a smaller child buried under a larger parent, so we need to cut its subtree out and make a new tree out of it. Obviously we can't keep doing this for every decrease, because otherwise we would end up with trees that no longer hold an exponential number of items for their height, which means no more O(log n) time for the extract operation. So the question is: what rule can we set up so that trees still hold an exponential number of items for their height? The clever insight is this:
We will allow each parent to lose at most one child. If there is a subsequent attempt to remove another child from the same parent, then we cut that parent out of the tree as well and put it in the root list as a tree of its own.
The above rule makes sure trees still hold an exponential number of items for their height, even in the worst case. What is the worst case? The worst case occurs when we make each parent lose one child. If a parent has more than one child, we have a choice of which one to get rid of; in that case, let's get rid of the child with the largest subtree. So if you do that for each parent in the four trees above (the ones holding 1, 2, 4, and 8 items), you will be left with trees of size 1, 1, 2, and 3. See a pattern here? Just for fun, add a new tree of order 4 (i.e. 16 items) and guess what you would be left with after running this rule for each parent: 5. We have a Fibonacci sequence here!
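You can check this worst case mechanically. The little Python script below (my own illustration) builds binomial-style trees of each order, removes the largest child subtree from every parent, and counts what survives:

    def binomial(order):
        # A tree represented as the list of its child subtrees; a tree of order k
        # is built by linking two trees of order k - 1, just like consolidation does.
        if order == 0:
            return []
        smaller = binomial(order - 1)
        return smaller + [binomial(order - 1)]

    def size(tree):
        return 1 + sum(size(child) for child in tree)

    def prune(tree):
        # Worst case: every parent loses the child with the largest subtree.
        if not tree:
            return []
        kept = sorted(tree, key=size)[:-1]      # drop the largest child subtree
        return [prune(child) for child in kept]

    for order in range(6):
        print(order, size(prune(binomial(order))))
    # 0 1
    # 1 1
    # 2 2
    # 3 3
    # 4 5
    # 5 8   <- the Fibonacci sequence again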
As the Fibonacci sequence grows exponentially, each tree still retains an exponential number of items and thus manages to give O(log n) runtime for the EXTRACT-MIN operation. However, notice that DECREASE-KEY now takes only O(1). One other cool thing is that you can represent any number as a sum of Fibonacci numbers. For example, 32 = 21 + 8 + 3, which means that if you needed to hold 32 items in a Fibonacci heap, you could do so using 3 trees holding 21, 8, and 3 items respectively. However, the "consolidation" process does not produce trees with Fibonacci numbers of nodes; those only occur when this worst case of DECREASE-KEY or DELETE happens.
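For the curious, that decomposition into Fibonacci numbers (like 32 = 21 + 8 + 3) can be found greedily; here is a tiny Python sketch of it, purely as an aside:

    def fib_parts(n):
        # Greedily write n as a sum of Fibonacci numbers, largest first
        # (Zeckendorf's theorem guarantees this always works).
        fibs = [1, 2]
        while fibs[-1] < n:
            fibs.append(fibs[-1] + fibs[-2])
        parts = []
        for f in reversed(fibs):
            if f <= n:
                parts.append(f)
                n -= f
        return parts

    print(fib_parts(32))   # [21, 8, 3]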
Further reading
- Binomial Heap - Fibonacci heaps are essentially lazy binomial heaps. It's a cool data structure because it shows a different way, other than the binary heap, of storing an exponential number of items in a tree for its height.
- Intuition behind Fibonacci Heaps - Required reading for anyone who wants to understand Fibonacci heaps at their core. Textbooks are often either too shallow or too cluttered on this subject, but the author of this SO answer has nailed it hands down.