Heap sort, quicksort, and merge sort all have time complexity O(n log n). Among these three efficient sorting mechanisms, heap sort is built on the heap, a special kind of tree-based data structure. There are many kinds of heaps, such as the binary heap, binomial heap, and pairing heap.
Here we experiment with the binary heap.
A binary heap is a heap built on a binary tree, with two additional properties beyond an ordinary binary tree. First, the tree is complete: all levels are fully filled except possibly the last, and if the last level is incomplete, its nodes are filled from left to right. Second, in a max-heap, each node is greater than or equal to each of its children.
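As a small sketch of the second property, a binary heap is usually stored in an array (an assumption here, not stated above), where the children of the node at index i sit at indices 2i + 1 and 2i + 2, so the complete-tree shape holds automatically. The helper below, with a name of my own choosing, checks the max-heap ordering in that encoding:

```python
def is_max_heap(a):
    """Check the max-heap property on an array-encoded binary heap.

    In the common array encoding, the children of the node at index i
    sit at indices 2*i + 1 and 2*i + 2, so only the ordering property
    needs to be verified.
    """
    n = len(a)
    for i in range(n):
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and a[i] < a[child]:
                return False  # a parent is smaller than a child
    return True
```

For example, `is_max_heap([9, 5, 8, 1, 2, 7])` holds, while `is_max_heap([1, 9, 2])` does not, since the root 1 is smaller than its child 9.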
Heapsort primarily competes with quicksort, another very efficient, general-purpose, nearly-in-place comparison-based sorting algorithm. Quicksort is typically somewhat faster, due to better cache behavior and other factors, but its worst-case running time is O(n²), which is unacceptable for large data sets.
Other than sorting, the two main operations performed on a heap are insertion and removal of elements. Both must preserve the properties above, whatever values are added or removed.
During insertion we add the new value as a child of the bottom-most node that still has room; if the last level is incomplete, the new value fills the next free position from the left. If the new element is larger than its parent, the binary heap property is violated, so we compare the new element with its parent and swap them if the parent is smaller. This process continues until the new element reaches its correct position. This is called siftUp.
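The insertion steps above can be sketched as follows (a minimal sketch assuming the array encoding of the heap; the function names are my own, not from the text):

```python
def sift_up(heap, i):
    """Restore the max-heap property after a new value lands at index i:
    swap the element with its parent while the parent is smaller."""
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:
            break  # the parent is not smaller: placement is correct
        heap[parent], heap[i] = heap[i], heap[parent]
        i = parent


def heap_insert(heap, value):
    """Append at the next free position on the last level, then sift up."""
    heap.append(value)
    sift_up(heap, len(heap) - 1)
```

Inserting 3, 9, 2, 7 in that order yields the array `[9, 7, 2, 3]`: each append goes to the next free slot, and siftUp bubbles the larger values toward the root.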
Deletion must also be done carefully in order to maintain the tree structure. The procedure for deleting the root from the heap and restoring the properties is called down-heap. First, replace the root of the heap with the last element on the last level. Then compare the new root with its children; if they are not in the correct order, swap the element with the larger of its children and repeat at the new position.
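The down-heap removal can be sketched the same way (again assuming the array encoding; the names `sift_down` and `extract_max` are my own):

```python
def sift_down(heap, i, n):
    """Restore the max-heap property downward from index i (down-heap):
    swap with the larger child until both children are in order."""
    while True:
        largest = i
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and heap[child] > heap[largest]:
                largest = child
        if largest == i:
            break  # both children are in correct order
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest


def extract_max(heap):
    """Remove and return the root: move the last element on the last
    level to the root, then down-heap from there."""
    root = heap[0]
    heap[0] = heap[-1]
    heap.pop()
    if heap:
        sift_down(heap, 0, len(heap))
    return root
```

Starting from the heap `[9, 7, 2, 3]`, `extract_max` returns 9 and leaves `[7, 3, 2]`: the last element 3 moves to the root, then swaps with its larger child 7.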
Building the heap by repeated siftUp, as described above, has O(n log n) time complexity because it is equivalent to inserting each element, one at a time, into an empty heap. The number of swaps that may occur during any one siftUp call increases with the depth of the node on which the call is made. The crux is that there are many (exponentially many) more "deep" nodes than "shallow" nodes in a heap, so siftUp may take its full logarithmic running time on the approximately linear number of calls made on the nodes at or near the "bottom" of the heap.
So heap sort built on this mechanism is efficient, with an overall time complexity of O(n log n).
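Putting the pieces together, a complete heap sort can be sketched as: build the max-heap with the siftUp-style heapify discussed above, then repeatedly swap the root with the last unsorted slot and down-heap the new root. This is a sketch under my own naming, not a definitive implementation:

```python
def heap_sort(values):
    """Heap sort: siftUp-style build phase, then repeated root extraction.
    Both phases are O(n log n), so the whole sort is O(n log n)."""
    a = list(values)
    n = len(a)
    # Build phase: siftUp each element, as if inserting one at a time.
    for i in range(1, n):
        j = i
        while j > 0 and a[(j - 1) // 2] < a[j]:
            a[(j - 1) // 2], a[j] = a[j], a[(j - 1) // 2]
            j = (j - 1) // 2
    # Sort phase: move the current maximum to the end, then down-heap
    # the new root within the shrinking unsorted prefix a[:end].
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        i = 0
        while True:
            largest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < end and a[child] > a[largest]:
                    largest = child
            if largest == i:
                break
            a[i], a[largest] = a[largest], a[i]
            i = largest
    return a
```

For example, `heap_sort([5, 1, 9, 3, 7])` returns `[1, 3, 5, 7, 9]`. (A production build phase would normally use bottom-up siftDown instead, which is O(n); the siftUp build is shown here because it is the version the text analyzes.)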