In this article, we will learn about the Time Complexity and Space Complexity of the Comb Sort algorithm, with an in-depth mathematical analysis of the different cases. Comb Sort is also known as Dobosiewicz Sort.
In short,
 Time Complexity:
 O(n^{2}) for the Worst Case
 O(n^{2}/2^{p}) for the Average Case, where p is the number of increments (gap phases).
 O(nlog(n)) for the Best Case
 Space Complexity: O(1)
Index
 Comb Sort Algorithm in brief
 Analysing the Time Complexity
 Analysing the Space Complexity
 Conclusion
Comb Sort Algorithm
The comb sort algorithm is an improved version of the bubble sort algorithm; the underlying idea of the two algorithms is the same. In bubble sort, each element is compared with its immediate successor. In comb sort, elements a certain gap apart are compared, and after each pass the gap is shrunk by dividing it by a decrease factor (also called the shrink factor), commonly 1.3.
The pseudocode for comb sort Algorithm is as follows:
function combsort(array input) is
    gap := input.size
    shrink := 1.3
    sorted := false
    loop while sorted = false
        gap := floor(gap / shrink)
        if gap ≤ 1 then
            gap := 1
            sorted := true
        end if
        i := 0
        loop while i + gap < input.size
            if input[i] > input[i+gap] then
                swap(input[i], input[i+gap])
                sorted := false
            end if
            i := i + 1
        end loop
    end loop
end function

Python Code
def getNextGap(gap):
    # Shrink the gap by the factor 1.3 (equivalently, multiply by 10/13)
    gap = (gap * 10) // 13
    if gap < 1:
        return 1
    return gap

def combSort(arr):
    n = len(arr)
    gap = n
    swapped = True
    # Keep passing over the array until the gap is 1 and no swaps occurred
    while gap != 1 or swapped:
        gap = getNextGap(gap)
        swapped = False
        for i in range(0, n - gap):
            if arr[i] > arr[i + gap]:
                arr[i], arr[i + gap] = arr[i + gap], arr[i]
                swapped = True
    return arr

array = [int(i) for i in input().split()]
print(*combSort(array))

C++ Code
#include <bits/stdc++.h>
using namespace std;

int getNextGap(int gap) {
    // Shrink the gap by the factor 1.3 (equivalently, multiply by 10/13)
    gap = (10 * gap) / 13;
    if (gap < 1)
        return 1;
    return gap;
}

int* combSort(int a[], int n) {
    int gap = n;
    bool swapped = true;
    // Keep passing over the array until the gap is 1 and no swaps occurred
    while (swapped || gap != 1) {
        gap = getNextGap(gap);
        swapped = false;
        for (int i = 0; i < n - gap; i++) {
            if (a[i] > a[i + gap]) {
                swap(a[i], a[i + gap]);
                swapped = true;
            }
        }
    }
    return a;
}

int main() {
    int n;
    cin >> n;
    int arr[n];
    for (int i = 0; i < n; i++)
        cin >> arr[i];
    int* sortedArr = combSort(arr, n);
    for (int i = 0; i < n; i++)
        cout << sortedArr[i] << ' ';
    return 0;
}
Analysing the Time Complexity
At the i^{th} phase, the algorithm traverses elements 0^{th} to (n - gap_{i})^{th}. Let a_{i} be the number of elements traversed at the i^{th} phase:
a_{i} = n - floor( 10 * gap_{i-1} / 13 ) → (1)
where gap_{i} is the gap at the i^{th} phase, given by the recurrence (with gap_{0} = n)
gap_{i} = floor( 10 * gap_{i-1} / 13 ) → (2)
This implies that at the i^{th} phase, the algorithm performs a_{i} = n - gap_{i} comparisons.
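The recurrence above can be tabulated directly. This minimal sketch (the helper name gap_sequence is ours, not from the implementations above) computes the gap sequence of equation (2) and the per-phase traversal counts a_{i} = n - gap_{i} for n = 100:

```python
def gap_sequence(n):
    """Gap values gap_i = floor(10 * gap_{i-1} / 13), starting from gap_0 = n,
    down to the final gap of 1."""
    gaps = []
    gap = n
    while gap > 1:
        gap = max(1, (gap * 10) // 13)
        gaps.append(gap)
    return gaps

n = 100
gaps = gap_sequence(n)
traversals = [n - g for g in gaps]  # a_i = n - gap_i
print(gaps)        # geometric-like shrinking sequence ending at 1
print(traversals)  # a_i approaches n as the gap shrinks
```

Note how a_{i} approaches n as the gap approaches 1, which is why the late phases dominate the running time.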
Worst Case
The worst-case configuration occurs when the elements are sorted (or nearly sorted) in reverse order. Calculating the upper bound of a_{i} from equation (1) and recurrence relation (2) with mathematical software for a worst-case input gives an expression containing second- and first-degree terms of n plus a constant, i.e. an upper bound of order n^{2}. Thus, the time complexity of the worst case is,
O(n^{2})
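One way to probe this bound empirically is to instrument the sort to count comparisons. The sketch below (comb_sort_count is a hypothetical helper following the Python implementation above) counts every comparison made while sorting a reverse-sorted input, the worst-case configuration described here:

```python
def comb_sort_count(arr):
    """Comb sort that returns the number of comparisons performed."""
    n = len(arr)
    gap = n
    swapped = True
    comparisons = 0
    while gap != 1 or swapped:
        gap = max(1, (gap * 10) // 13)
        swapped = False
        for i in range(n - gap):
            comparisons += 1  # one comparison per loop iteration
            if arr[i] > arr[i + gap]:
                arr[i], arr[i + gap] = arr[i + gap], arr[i]
                swapped = True
    return comparisons

# Reverse-sorted inputs of growing size
for n in (100, 200, 400):
    print(n, comb_sort_count(list(range(n, 0, -1))))
```

The O(n^{2}) expression is an upper bound; measured counts for a given input family may grow more slowly than n^{2}.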
Average Case
The average-case configuration is a random order of elements in the array. This configuration reduces the work, since elements that are already in order neither toggle the swapped variable's bool value nor trigger a swap. Let p be the number of increment phases and X be the sum of all the a_{i}, i.e. the total number of comparisons. Thus, X = Σ a_{i}. Estimating asymptotically,
np + (n-1)·log((X+n-1)/(n-1)) + O(n) ≥ n·log(n) - Θ(n)
Therefore,
X ≥ (n^{2}/2^{p}) · Θ(1) = Ω(n^{2}/2^{p})
Hence, the time complexity of an average case is,
O(n^{2}/2^{p})
Best Case
The best-case configuration occurs when all the elements are already sorted or nearly sorted. In this case, the final pass with gap = 1 performs no swaps, so it runs only once, like every other pass.
The gap values form a (roughly) geometric sequence:
gap_{0} = n
gap_{1} = n/1.3
gap_{2} = n/(1.3)^{2}
gap_{3} = n/(1.3)^{3}
gap_{4} = n/(1.3)^{4}
. . .
gap_{i} = n/(1.3)^{i}
The gap reaches 1 after about log_{1.3}(n) = O(log(n)) passes. In the best case no pass has to be repeated, and each pass performs a_{i} = n - gap_{i} < n comparisons, so the total number of comparisons is
S = Σ a_{i} ≤ n · log_{1.3}(n)
Therefore, the time complexity of the best case is,
O(nlog(n))
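The logarithmic pass count can be checked numerically. This sketch (count_passes is our helper, using the integer gap update from the implementations above) counts how many times the gap shrinks before reaching 1 and compares it with log_{1.3}(n):

```python
import math

def count_passes(n):
    """Number of gap updates until the gap reaches 1, starting from gap = n.
    In the best case this equals the number of passes over the array."""
    passes = 0
    gap = n
    while gap > 1:
        gap = max(1, (gap * 10) // 13)
        passes += 1
    return passes

for n in (10**3, 10**4, 10**5):
    # The integer update floor(10*gap/13) shrinks slightly faster than
    # division by 1.3, so the measured count sits a bit below log_1.3(n).
    print(n, count_passes(n), round(math.log(n, 1.3)))
```

Since each of these O(log n) passes costs at most n comparisons, the best-case total is O(n log(n)).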
Analysing the Space Complexity
The comb sort algorithm does not use any variable-sized data structure or buffer, nor does it make any recursive calls; therefore, it takes O(1) space in all cases.
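A quick way to see the in-place behaviour: the algorithm only ever swaps elements inside the input array, so the very same list object comes back sorted. A minimal check, restating the Python implementation from above:

```python
def comb_sort(arr):
    """In-place comb sort; only a few scalar variables are allocated."""
    n = len(arr)
    gap = n
    swapped = True
    while gap != 1 or swapped:
        gap = max(1, (gap * 10) // 13)
        swapped = False
        for i in range(n - gap):
            if arr[i] > arr[i + gap]:
                # Swap inside the array: O(1) extra space
                arr[i], arr[i + gap] = arr[i + gap], arr[i]
                swapped = True
    return arr

data = [5, 1, 4, 2, 8]
result = comb_sort(data)
print(result is data, result)  # True [1, 2, 4, 5, 8] - same object, sorted
```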
Conclusion
The Comb sort algorithm is an improved version of the bubble sort algorithm that shrinks the gap by a factor of 1.3 after every pass. This algorithm has a time complexity of O(n^{2}) for the worst case, O(n^{2}/2^{p}) for the average case, and O(nlog(n)) for the best case, with a constant space complexity of O(1). Comb sort is better than bubble sort because a single swap can remove more than one inversion.
With this article at OpenGenus, you must have the complete idea of Time and Space Complexity of Comb Sort.