Time and Space Complexity of Heap data structure operations

In this article, we have explored the Time and Space Complexity of Heap data structure operations, including the Worst, Average and Best cases. At the end, we have added a table that summarizes all the complexities.

List of Operations

Prerequisites:

Note:
1. All listed operations apply to both Min Heap and Max Heap.
2. The Space Complexity of every listed operation is O(1); if it isn't, it will be mentioned.
3. Every logN you see here is log2N, because in a Heap the number of nodes at each level grows by a power of 2.

Insertion

In a Heap, when we insert an element, we add it to the very end of the heap and then check and shift the inserted element upwards until the heap property is satisfied.
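The insert-then-shift-up procedure above can be sketched for a MinHeap stored in a Python list (the function name `heap_insert` and the sample values are illustrative, not from any specific library):

```python
def heap_insert(heap, value):
    """Insert `value` into a min-heap stored as a list, shifting it up."""
    heap.append(value)                       # add at the very end
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[i] < heap[parent]:           # min-heap property violated
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
        else:
            break                            # heap property satisfied

heap = []
for v in [5, 3, 8, 1]:
    heap_insert(heap, v)
print(heap)  # [1, 3, 8, 5] — the root heap[0] is the minimum
```

For a MaxHeap, only the comparison flips: shift up while the child is *greater* than its parent.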

Insertion Best Case O(1)

The best case for inserting an element into a heap depends on what kind of heap we have. If the Heap is a MinHeap and the added value is bigger than its parent's value, then no further checks or shifts are required.
For a MaxHeap, the inserted value has to be less than its parent.

And since the operations performed here are limited to one check and no shift, the time complexity for this case is O(1).

Insertion Worst Case O(logN)

Now, the worst case occurs when the inserted value has to be shifted all the way up to the root, which means the value is the largest for a MaxHeap or the smallest for a MinHeap.

To reach the root we go through one parent on each level, and there are logN levels.
Therefore, the time complexity for the worst case is O(logN).

Insertion Average Case O(logN)

While inserting into a Heap, the best case is when a single check shows that the child and parent already satisfy the heap property; the second best case is when we have to shift the child once; and the number of checks and shifts keeps growing until, at worst, we reach the root after logN of them.

That is,

1, 2, 3, 4, ..., logN

That's why the average will be,

$$O(ave) = O\bigg( \frac{ 1 + 2 + 3 + ... + logN } {logN} \bigg)$$
$$O(ave) = O\bigg( \frac{(\frac{logN(logN + 1)}{2})}{logN} \bigg)$$
$$O(ave) = O\bigg( \frac{logN + 1}{2} \bigg)$$

Therefore,

$$O(average) = O(logN)$$

Deletion

When deletion is done in a Heap, the element that gets deleted is the root element; the last element in the Heap takes the root's place and is then, of course, checked and shifted down until the Heap properties are satisfied.
As a result, the next largest or smallest value ends up at the root.

So, first, we do the swap, which takes O(1).

The best, worst and average cases depend on what happens after the swap: how far the new root value has to be shifted down until it finds its right place, according to whether it's a MinHeap or a MaxHeap.
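A minimal sketch of this deletion for a MinHeap, again using a plain Python list (the name `heap_delete_root` is illustrative):

```python
def heap_delete_root(heap):
    """Remove and return the root of a min-heap; the last element
    takes its place and is shifted down until the property holds."""
    root = heap[0]
    heap[0] = heap[-1]                  # last element takes the root's place
    heap.pop()
    i, n = 0, len(heap)
    while True:
        left, right, smallest = 2 * i + 1, 2 * i + 2, i
        if left < n and heap[left] < heap[smallest]:
            smallest = left
        if right < n and heap[right] < heap[smallest]:
            smallest = right
        if smallest == i:
            break                       # heap property restored
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest
    return root

heap = [1, 3, 8, 5]
print(heap_delete_root(heap))           # 1
print(heap)                             # [3, 5, 8] — next smallest is at the root
```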

Deletion Best Case O(1)

The best case here occurs when the value that was moved to the root is already the largest value for a MaxHeap, or the smallest for a MinHeap.

So there will be no shifting; therefore the best case time complexity is O(1).

Deletion Worst Case O(logN)

Since we are shifting down, the worst case is to shift down all the way to the bottom of the Heap.
And since the number of levels in the Heap is logN, we will need to check and shift at most logN times.
So, the time complexity for worst-case deletion will be O(logN).

Deletion Average Case O(logN)

Here the cost starts from the best case of 1 and increases by 1 each time, that is 1, 2, 3, ..., up to the maximum of logN.

which means,

1 + 2 + 3 + 4 + ... + logN

That's why the average time complexity for deletion will be,

$$O(ave) = O\bigg( \frac{1 + 2 + 3 + ... + logN } {logN} \bigg)$$
$$O(ave) = O\bigg( \frac{(\frac{logN(logN + 1)}{2})}{logN} \bigg)$$
$$O(ave) = O\bigg( \frac{logN + 1}{2} \bigg)$$

Therefore,

$$O(average) = O(logN)$$

Searching

In a Heap, searching is performed linearly, which results in a time complexity of O(N) for N elements.

And, this is the case for both MinHeap and MaxHeap.
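Since the heap is stored as an array, the linear search is just a scan over that array; a small sketch (the function name is illustrative):

```python
def heap_search(heap, target):
    """Linear search over the heap's underlying array: O(N)."""
    for i, value in enumerate(heap):
        if value == target:
            return i          # best case: target at the root, index 0
    return -1                 # worst case: scanned all N elements

heap = [1, 3, 8, 5]
print(heap_search(heap, 1))   # 0  (root — best case)
print(heap_search(heap, 5))   # 3  (last node — worst case)
print(heap_search(heap, 7))   # -1 (not present)
```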

Searching Best Case O(1)

The best case for searching is to find the element we're searching for at the very first Node of the Heap, that is, the root Node.

For a MinHeap, we will find the smallest value at the root, and for a MaxHeap, the largest value.

And since we had to look at only one Node to find our element, the time complexity is O(1).

Searching Worst Case O(N)

When searching linearly, the worst case occurs when we find the element at the very end of the search, that is, at the last Node of the Heap.
And since there are N elements, the resulting time complexity for the worst-case search is O(N).

Searching Average Case O(N)

Since searching happens linearly, the whole sequence of possible costs starts at 1 and keeps increasing by one, that is 1, 2, 3, 4, ..., until the worst case of finding the element at the Nth position.

So, the number of cases here is N.

Therefore, the Complexity will be,

$$O(ave) = O\bigg(\frac{1 + 2 + 3 + 4 + ... + N}{N}\bigg)$$
$$O(ave) = O\bigg( \frac{(\frac{N( N + 1 )}{2})}{N} \bigg)$$
$$O(ave) = O\bigg( \frac{N + 1}{2} \bigg)$$

And since N is the highest-order term here, the average case time complexity will be,

$$O(N)$$

Note:
If you have a sorted Heap, then you should be able to perform a binary search on the heap, but since we're just looking at the tree structure of the Heap, therefore, we're not analysing that case here.

Getting Max Value & Min Value

The best and trivial case is to ask for the max value in a MaxHeap or the min value in a MinHeap, which requires O(1) time in both cases, since one can directly return the root value.

But if we try to find the largest value in a MinHeap or the smallest value in a MaxHeap, then we need a linear search to find it.

The only best case for this linear search occurs when the size of the Heap is 1, so that the smallest value equals the largest value.
Otherwise it requires searching the whole Heap, making the search O(N).
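As a sketch, the O(1) and O(N) cases for a MinHeap look like this (function names are illustrative; the O(N) case simply scans the whole array):

```python
def min_value(min_heap):
    """O(1): the root of a min-heap is its smallest element."""
    return min_heap[0]

def max_value_in_min_heap(min_heap):
    """O(N): the maximum sits somewhere among the leaves,
    so we scan the whole underlying array."""
    return max(min_heap)

heap = [1, 3, 8, 5]
print(min_value(heap))               # 1
print(max_value_in_min_heap(heap))   # 8
```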

Sorting

Sorting with a Heap is done by repeatedly performing the Delete operation until all N elements have been removed. Each deleted element gets stored at the end of the array, in the memory that the Heap no longer uses.
Completing this whole procedure produces a sorted array.

Since the complexity of deleting an element from any Heap is O(logN), the time complexity for sorting comes to N times O(logN).
That is,

$$O(NlogN)$$

Want a best case while sorting a heap? Well then maybe have a Heap of length 1; then you will get O(1) complexity. LOL.
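The whole procedure can be sketched as an in-place heapsort on a MaxHeap (a standard textbook formulation, not code from this article): build the heap, then repeatedly swap the root with the last unsorted slot and shift it down.

```python
def sift_down(a, i, n):
    """Shift a[i] down within a[:n] so the max-heap property holds."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heap_sort(a):
    """N deletions at O(logN) each: O(NlogN) time, O(1) extra space."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build the max-heap first
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # "deleted" root goes to the end
        sift_down(a, 0, end)

data = [5, 3, 8, 1, 9, 2]
heap_sort(data)
print(data)  # [1, 2, 3, 5, 8, 9]
```

Note that using a MaxHeap here sorts ascending, because each deleted maximum lands in the last free slot of the array.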

Creating a Heap

To create a Heap from an array of N elements, we insert every single element into the Heap one by one. And as we have seen, inserting an element into a heap takes O(logN) in both the worst and average cases.

Taking the worst-case time of O(logN) per insertion, and since we perform this insertion for all N elements, the whole time complexity will be,

$$O( NlogN )$$

A time complexity of O(N) can occur here, but only when the given array is already sorted in ascending or descending order: for a MaxHeap, a descending array makes every insertion a best case, and for a MinHeap, an ascending one does.

Heapify

Unlike deletion in a Heap, when we heapify we use downShift on every Node except the leaves, and we do not always downShift from the root. This gives us an advantage in time complexity.

How? So, let's calculate what downShift on all nodes will result to.

Consider the leaf nodes' level as zero and the root node's level as logN.

The maximum complexity for down shift on level zero will be O(0) as they have no child nodes.

And then maximum complexity for down shift on level one will be O(1) as they have only one level where they can down shift to.

And as we go up the heap, this keeps increasing by 1, until we reach the root, where the maximum complexity for a down shift is O(logN).
So the progression looks like,

O(1), O(2), O(3), O(4), ... O(logN)

Now, for every level we go up, the number of nodes decreases by a power of 2: level 0 has about N/2 nodes, and level 1 has N/4 nodes.
So the total number of check-and-swap operations over the whole Heap will be,

$$O(0)\frac{N}{2} + O(1)\frac{N}{4} + O(2)\frac{N}{8} + ... + O(logN)\frac{N}{2^{logN}}$$

And since in the denominator we have $2^{level}$, let's reformat,

  • also removing 0th level.
  • also taking out N as common.

$$O\biggl( N\bigg(\frac{1}{2^1} + \frac{2}{2^2} + \frac{3}{2^3} + \frac{4}{2^4} + ... + \frac{logN}{2^{logN}}\bigg) \biggl)$$

Here you can see that the numerator increases by 1 while the denominator increases by a power of 2, so the terms look like,

  • $\frac{1}{2} = 0.5$
  • $\frac{2}{4} = 0.5$
  • $\frac{3}{8} = 0.375$
  • $\frac{4}{16} = 0.25$
  • $\frac{5}{32} = 0.15625$

Because of this, the whole sum $\frac{1}{2^1} + \frac{2}{2^2} + \frac{3}{2^3} + \frac{4}{2^4} + ... + \frac{logN}{2^{logN}}$ converges to a constant (it never exceeds 2) rather than growing anywhere near N or logN, and since we're looking for an asymptotic result, we can treat the whole bracket as a constant.

Which gives us,

$$O(N)$$
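A sketch of this heapify for a MinHeap, shifting down every non-leaf node from the bottom up (the function name is illustrative):

```python
def heapify(a):
    """Build a min-heap in place by shifting down every non-leaf node,
    from the last internal node up to the root: O(N) total."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # leaves need no down shift
        j = i
        while True:
            left, right, smallest = 2 * j + 1, 2 * j + 2, j
            if left < n and a[left] < a[smallest]:
                smallest = left
            if right < n and a[right] < a[smallest]:
                smallest = right
            if smallest == j:
                break                     # heap property holds here
            a[j], a[smallest] = a[smallest], a[j]
            j = smallest

data = [9, 4, 7, 1, 2, 6, 3]
heapify(data)
print(data[0])  # 1 — the minimum is now at the root
```

Nodes near the bottom (the majority) do almost no work, while only the single root may shift logN times, which is exactly why the total comes out linear.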

Summary

Here's the list of all the Time and Space Complexities we looked at:

| Operation | Time Complexity | Space Complexity |
|---|---|---|
| Insertion | Best Case: O(1), Worst Case: O(logN), Average Case: O(logN) | O(1) |
| Deletion | Best Case: O(1), Worst Case: O(logN), Average Case: O(logN) | O(1) |
| Searching | Best Case: O(1), Worst Case: O(N), Average Case: O(N) | O(1) |
| Max Value | In MaxHeap: O(1), In MinHeap: O(N) | O(1) |
| Min Value | In MinHeap: O(1), In MaxHeap: O(N) | O(1) |
| Sorting | All Cases: O(NlogN) | O(1) |
| Creating a Heap | By inserting all elements: O(NlogN) | O(N) |
| Creating a Heap | Using Heapify: O(N) | O(1) |