Time and Space Complexity of Comb Sort

In this article, we will learn about the Time Complexity and Space Complexity of the Comb Sort algorithm, with an in-depth mathematical analysis of the different cases. Comb Sort is also known as Dobosiewicz Sort.

In short,

  • Time Complexity:
    • O(n²) for the Worst Case
    • O(n²/2ᵖ) for the Average Case, where p is the number of increments
    • O(n log n) for the Best Case
  • Space Complexity: O(1)

Comb Sort Algorithm

The comb sort algorithm is an improved version of the bubble sort algorithm; the underlying idea of both algorithms is the same. In bubble sort, each element is compared with the element immediately following it, i.e., with a gap of 1. In comb sort, elements that are a certain gap apart are compared and swapped, and the gap is reduced after every pass by dividing it by a decrease factor (also called the shrink factor), which is 1.3.
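
For example, with 10 elements the gap starts at 10 and takes the values floor(10/1.3) = 7, then 5, 3, 2 and finally 1 on successive passes; once the gap reaches 1, the algorithm behaves like bubble sort and keeps making passes until a full pass performs no swap.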

The pseudo-code for the comb sort algorithm is as follows:

function combsort(array input) is

    gap := input.size
    shrink := 1.3
    sorted := false

    loop while sorted = false
        gap := floor(gap / shrink)
        if gap ≤ 1 then
            gap := 1
            sorted := true 
        end if
        i := 0
        loop while i + gap < input.size
            if input[i] > input[i+gap] then
                swap(input[i], input[i+gap])
                sorted := false
            end if
            i := i + 1
        end loop
    end loop
end function
  • Python Code
    def getNextGap(gap):
        # Shrink the gap by the factor 1.3, implemented with integer arithmetic
        gap = (gap * 10) // 13
        if gap < 1:
            return 1
        return gap

    def combSort(arr):
        n = len(arr)

        # Start with a gap equal to the array size
        gap = n

        # swapped records whether the previous pass performed any swap
        swapped = True

        # Keep making passes until the gap is 1 and a full pass makes no swap
        while gap != 1 or swapped:

            gap = getNextGap(gap)

            swapped = False

            # Compare and swap elements that are `gap` positions apart
            for i in range(0, n - gap):
                if arr[i] > arr[i + gap]:
                    arr[i], arr[i + gap] = arr[i + gap], arr[i]
                    swapped = True
        return arr

    array = [int(i) for i in input().split()]
    print(*combSort(array))
    
  • C++ Code
    #include <bits/stdc++.h>
    using namespace std;

    // Shrink the gap by the factor 1.3, implemented with integer arithmetic
    int getNextGap(int gap) {
        gap = (10 * gap) / 13;
        if (gap < 1)
            return 1;
        return gap;
    }

    // Sorts the first n elements of a in place and returns a pointer to it
    int* combSort(int a[], int n) {
        int gap = n;
        bool swapped = true;
        // Keep making passes until the gap is 1 and a full pass makes no swap
        while (swapped || gap != 1) {
            gap = getNextGap(gap);
            swapped = false;
            // Compare and swap elements that are `gap` positions apart
            for (int i = 0; i < n - gap; i++) {
                if (a[i] > a[i + gap]) {
                    swap(a[i], a[i + gap]);
                    swapped = true;
                }
            }
        }
        return a;
    }

    int main() {
        int n;
        cin >> n;
        vector<int> arr(n);   // std::vector instead of a variable-length array
        for (int i = 0; i < n; i++) cin >> arr[i];

        int* sortedArr = combSort(arr.data(), n);
        for (int i = 0; i < n; i++) {
            cout << sortedArr[i] << ' ';
        }
        return 0;
    }
    

Analysing the Time Complexity

In each phase the inner loop runs while i + gap < n, so a phase with gap g performs n − g comparisons. Let aᵢ denote the number of comparisons at the ith phase and gapᵢ the gap used in that phase; then

aᵢ = n − gapᵢ --> (1)

where gapᵢ follows from the shrink step of the algorithm, starting from gap₀ = n:

gapᵢ = floor( 10 · gapᵢ₋₁ / 13 ) ≈ gapᵢ₋₁ / 1.3 --> (2)

This implies that at the ith phase, the algorithm traverses aᵢ elements.
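
As a quick illustration of (1) and (2), the short sketch below (using the same integer shrink, gap = (gap * 10) // 13, as the implementations above) prints the gap and the comparison count aᵢ = n − gapᵢ for each phase when n = 20:

    # Illustrate recurrences (1) and (2) for n = 20: print the gap used in
    # each phase and the number of comparisons a_i = n - gap_i of that phase.
    n = 20
    gap = n                             # gap_0 = n
    phase = 1
    while gap > 1:
        gap = max(1, (gap * 10) // 13)  # (2): gap_i = floor(10 * gap_{i-1} / 13)
        print(f"phase {phase}: gap = {gap}, comparisons = {n - gap}")  # (1)
        phase += 1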

Worst Case
The worst-case configuration is when the elements are already (or nearly) sorted, but in reverse order. Calculating an upper bound on the total work from equation (1) and recurrence (2) with the help of mathematical software yields an expression with second and first degree terms of n plus a constant, i.e., an upper bound of order n². Thus, the time complexity of the worst case is

O(n²)
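
A rough way to see where the quadratic term comes from (a sketch of the bound only, not the exact expression mentioned above): once the gap has shrunk to 1, comb sort degenerates into bubble sort, which needs at most n further passes of at most n − 1 comparisons each before a full pass makes no swap. This tail alone contributes at most

n · (n − 1) = n² − n

comparisons, which already accounts for the second and first degree terms in n.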

Average Case
The average-case configuration is a random ordering of the elements. This reduces the work, because elements that are already in relative order neither toggle the swapped flag nor cause a swap. Let p be the number of increments (phases) and let X = ∑aᵢ be the total number of comparisons. Estimating asymptotically,

n·p + (n − 1)·log((X + n − 1)/(n − 1)) + O(n) ≥ n·log(n) − Θ(n)

Therefore,

X ≥ (n²/2ᵖ)·Θ(1) = Ω(n²/2ᵖ)

Hence, the time complexity of the average case is

O(n²/2ᵖ)
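
The quantities in this bound can be observed empirically. The sketch below (the counting wrapper is written for this article and is not part of the implementations above) measures the number of increments p and the average number of comparisons over a few random permutations:

    import random

    # Count the comparisons and the number of distinct increments (gaps) used
    # by comb sort on a given array.
    def comb_sort_stats(arr):
        n = len(arr)
        gap = n
        swapped = True
        comparisons = 0
        increments = set()
        while gap != 1 or swapped:
            gap = max(1, (gap * 10) // 13)
            increments.add(gap)
            swapped = False
            for i in range(0, n - gap):
                comparisons += 1
                if arr[i] > arr[i + gap]:
                    arr[i], arr[i + gap] = arr[i + gap], arr[i]
                    swapped = True
        return comparisons, len(increments)

    n, trials = 1000, 20
    results = [comb_sort_stats(random.sample(range(n), n)) for _ in range(trials)]
    p = results[0][1]   # number of increments, roughly log(n)/log(1.3)
    avg = sum(c for c, _ in results) / trials
    print(f"n = {n}, p = {p}, average comparisons ≈ {avg:.0f}")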

Best Case
The best-case configuration occurs when the elements are already sorted or nearly sorted. In this case no swaps are performed, so every gap value, including gap = 1, is used for exactly one pass.
The gap values form a (roughly) geometric sequence:

gap₀ = n
gap₁ = n/1.3
gap₂ = n/(1.3)²
gap₃ = n/(1.3)³
gap₄ = n/(1.3)⁴
. . .
gapᵢ = n/(1.3)ⁱ

so the gap reaches 1 after about log(n)/log(1.3) = O(log n) passes. By equation (1), the pass with gap gapᵢ performs aᵢ = n − gapᵢ < n comparisons, so the total number of comparisons is

Sₙ = ∑ aᵢ = ∑ (n − gapᵢ) ≤ n · log(n)/log(1.3)

Therefore, the time complexity of the best case is

O(n log n)
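
This bound can be sanity-checked directly: since no swaps happen on sorted input, each gap contributes exactly n − gap comparisons, so the best-case comparison count can be computed from the gap sequence alone (a minimal sketch, assuming the same integer shrink gap = (gap * 10) // 13 as above):

    import math

    # Best-case comparison count: on an already-sorted array every gap value
    # is used exactly once and a pass with gap g performs n - g comparisons.
    def best_case_comparisons(n):
        gap = n
        total = 0
        while gap > 1:
            gap = max(1, (gap * 10) // 13)
            total += n - gap
        return total

    for n in (1_000, 10_000, 100_000):
        bound = n * math.log(n) / math.log(1.3)   # n * log base 1.3 of n
        print(f"n = {n}: comparisons = {best_case_comparisons(n)}, bound ≈ {bound:.0f}")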

Analysing the Space Complexity

The comb sort algorithm sorts the array in place: it uses only a fixed number of extra variables (the gap, the swapped flag and a loop index), no buffer or auxiliary data structure whose size depends on the input, and no recursion. Therefore, it takes O(1) extra space in all cases.

Conclusion

The comb sort algorithm is an improved version of the bubble sort algorithm that starts with a large gap and shrinks it by a factor of 1.3 after every pass. Its time complexity is O(n²) for the worst case, O(n²/2ᵖ) for the average case, and O(n log n) for the best case, with a constant space complexity of O(1). Comb sort performs better than bubble sort because a single swap across a large gap can remove more than one inversion.

With this article at OpenGenus, you must have the complete idea of Time and Space Complexity of Comb Sort.
