Time and Space Complexity of Stooge Sort


In this article, we will be discussing the time and space complexity of Stooge Sort covering various cases like Worst, Best and Average Case.

Table of contents:

  1. Introduction to Sorting
  2. Introduction to Stooge Sort
  3. Analysis of Stooge Sort
  4. Master Theorem
  5. Time Complexity of Stooge Sort
  6. Space Complexity of Stooge Sort

Pre-requisites: familiarity with recursion and Big-O (asymptotic) notation.

As a summary, Time Complexity of Stooge Sort:

| Sorting | Average Case | Worst Case | Best Case |
|---|---|---|---|
| Stooge Sort | O(N^2.709) | O(N^2.709) | O(N^2.709) |

Space Complexity of Stooge Sort:

| Sorting | Space Complexity |
|---|---|
| Stooge Sort | O(N) |

We will derive this result and explain the different cases further in this article. Before we start discussing Stooge Sort, let us briefly discuss sorting.

Introduction to Sorting

Sorting is the process of arranging elements in either ascending (increasing) or descending (decreasing) order according to some criterion. Some real-world examples of sorting include:

  • Sorting test papers based on roll number.
  • Sorting the contact list on a phone by contact name.
  • Sorting products in applications like Amazon or Flipkart.

Introduction to Stooge Sort

Stooge sort is a recursive sorting algorithm. It first swaps the first and last elements of the array if the first element is greater than the last. It then calls itself recursively three times: first on the initial 2/3rd of the array, then on the last 2/3rd, and finally on the initial 2/3rd again to confirm the order.

Algorithm StoogeSort(array, left, right) {
    if left >= right then
        return

    if array[left] > array[right] then
        swap(array[left], array[right])

    // Check if the sub-array has more than 2 elements
    if right - left + 1 > 2 then
        t = floor((right - left + 1) / 3)
        StoogeSort(array, left, right - t)   // initial 2/3rd
        StoogeSort(array, left + t, right)   // last 2/3rd
        StoogeSort(array, left, right - t)   // initial 2/3rd again
}
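The pseudocode above can be translated into a runnable sketch in Python (the function name `stooge_sort` and the in-place list interface are our choices, not part of the original pseudocode):

```python
def stooge_sort(arr, left=0, right=None):
    """Sort arr[left..right] in place using Stooge sort."""
    if right is None:
        right = len(arr) - 1
    if left >= right:
        return
    # Swap the first and last elements if they are out of order
    if arr[left] > arr[right]:
        arr[left], arr[right] = arr[right], arr[left]
    # Only recurse when the sub-array has more than 2 elements
    if right - left + 1 > 2:
        t = (right - left + 1) // 3
        stooge_sort(arr, left, right - t)   # initial 2/3rd
        stooge_sort(arr, left + t, right)   # last 2/3rd
        stooge_sort(arr, left, right - t)   # initial 2/3rd again

data = [7, 5, 1, 4, 0]
stooge_sort(data)
print(data)  # [0, 1, 4, 5, 7]
```

Note that the two recursive sub-ranges overlap by design; that overlap is exactly what makes the third call necessary.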

Example:
Consider the array arr = [7, 5, 1, 4, 0].

  • First, we check whether the first element (7 in this case) is larger than the last element (0). It is, so we swap them:
    arr = [0, 5, 1, 4, 7]

  • Next, we run the algorithm recursively on the initial 2/3rd of the array:
    arr = [[0, 5, 1], 4, 7]
    With [0, 5, 1] as the sub-array, we again check whether the first element is larger than the last; here it is not, so no swap happens. The sub-array is divided further into overlapping parts of size 2: [0, 5] and [5, 1]. In [5, 1], 5 and 1 are swapped, giving arr = [0, 1, 5, 4, 7].

  • The same procedure continues until we obtain the sorted array arr = [0, 1, 4, 5, 7].

Analysis of Stooge Sort

Since Stooge sort is a recursive algorithm, we can formulate a recurrence relation for it.

T(N) = 3 * T(2N/3) + O(1), where N is the size of the array.

Each call does a constant amount of work (one comparison and possibly one swap) and makes three recursive calls, each on a sub-array of 2/3rd the size. Written in the standard form T(N) = a * T(N/b) + f(N), this is

T(N) = 3 * T(N / (3/2)) + O(1)

We can use the Master Theorem to solve this recurrence relation.

Master Theorem

The Master Theorem is used to find the asymptotic behaviour of recurrence relations that arise in many divide-and-conquer algorithms.
It can be used to solve recurrence relations of the form:

T(N) = aT(N/b) + f(N) 

where a >= 1 and b > 1.
Let f(N) = c * N^k and T(1) = c.
Then for a >= 1, b > 1 and k >= 0, the Master Theorem gives three cases:

  • when a < b^k, the solution is Θ(N^k)
  • when a = b^k, the solution is Θ(N^k log N)
  • when a > b^k, the solution is Θ(N^(log_b a))
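The three cases above can be captured in a small sketch (the helper `master_bound` is ours, purely for illustration) that picks the right bound for a recurrence T(N) = a*T(N/b) + c*N^k:

```python
import math

def master_bound(a, b, k):
    """Return the Master Theorem bound for T(N) = a*T(N/b) + c*N^k
    (illustrative helper, assumes a >= 1, b > 1, k >= 0)."""
    if a < b ** k:
        return f"Theta(N^{k})"
    if a == b ** k:
        return f"Theta(N^{k} log N)"
    # a > b^k: the recursion tree is dominated by the leaves
    return f"Theta(N^{round(math.log(a, b), 4)})"

print(master_bound(2, 2, 1))    # merge sort:  Theta(N^1 log N)
print(master_bound(3, 1.5, 0))  # Stooge sort: Theta(N^2.7095)
```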

From the above recurrence, the values of a, b and k are:
a = 3, b = 3/2, k = 0.
Since 3 > (1.5)^0 = 1, the third case applies, and the time complexity is O(N^(log_b a)).

Substituting the values of a and b, we get O(N^(log_1.5 3)).
Evaluating the logarithm, this is approximately O(N^2.709).
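The numeric value of the exponent can be checked directly with Python's two-argument `math.log`:

```python
import math

# The exponent in Stooge sort's bound is log base 1.5 of 3
exponent = math.log(3, 1.5)
print(round(exponent, 4))  # 2.7095
```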

Time Complexity of Stooge Sort

Stooge sort's running time is independent of the input data: it only checks whether the first element of each sub-array is greater than the last, and it makes the same recursive calls on every sub-array of size 3 or more regardless of the values. Even if the array is already sorted, Stooge sort runs in full, so its time complexity is the same in every case.

  • Worst Case Scenario: O(N^2.709). It occurs when the array is already sorted, yet we spend the full running time "sorting" it again; no swaps take place, only the recursive calls happen.
  • Best Case Scenario: O(N^2.709). It occurs when we start with an unsorted array; the time taken is the same as in the worst case, but here the work actually produces a sorted array.
  • Average Case Scenario: O(N^2.709)
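To illustrate that the work done is input-independent, here is a small instrumented sketch (the helpers `stooge_count` and `calls` are ours, not part of the original algorithm) that counts recursive calls on sorted versus reverse-sorted input:

```python
def stooge_count(arr, left, right, counter):
    """Stooge sort that also counts every recursive call."""
    counter[0] += 1
    if left >= right:
        return
    if arr[left] > arr[right]:
        arr[left], arr[right] = arr[right], arr[left]
    if right - left + 1 > 2:
        t = (right - left + 1) // 3
        stooge_count(arr, left, right - t, counter)
        stooge_count(arr, left + t, right, counter)
        stooge_count(arr, left, right - t, counter)

def calls(arr):
    c = [0]
    stooge_count(arr, 0, len(arr) - 1, c)
    return c[0]

n = 30
# Same call count whether the input is sorted or reverse-sorted
print(calls(list(range(n))) == calls(list(range(n, 0, -1))))  # True
```

The call count depends only on the indices, never on the element values, which is exactly why best, worst and average case coincide.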

Space Complexity of Stooge Sort

At any given moment, the recursive calls on the stack form a single chain from the initial call down to the current sub-array, and each level shrinks the sub-array to 2/3rd of its size, so the stack holds only O(log N) calls at a time. The algorithm sorts in place, so the input array of size N dominates the memory usage; no additional arrays or variables are used to store data temporarily. Thus the total space complexity is O(N).
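As a sanity check on the stack usage, the following sketch (the `stooge_depth` and `max_depth` helpers are hypothetical instrumentation, not part of the algorithm) tracks the maximum recursion depth reached:

```python
def stooge_depth(arr, left, right, depth, state):
    """Stooge sort that records the deepest recursion level reached."""
    state["max"] = max(state["max"], depth)
    if left >= right:
        return
    if arr[left] > arr[right]:
        arr[left], arr[right] = arr[right], arr[left]
    if right - left + 1 > 2:
        t = (right - left + 1) // 3
        stooge_depth(arr, left, right - t, depth + 1, state)
        stooge_depth(arr, left + t, right, depth + 1, state)
        stooge_depth(arr, left, right - t, depth + 1, state)

def max_depth(n):
    state = {"max": 0}
    stooge_depth(list(range(n, 0, -1)), 0, n - 1, 1, state)
    return state["max"]

# Far below 100: close to log base 1.5 of 100, which is about 11.4
print(max_depth(100))
```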

Conclusion

As a summary, Time Complexity of Stooge Sort:

| Sorting | Average Case | Worst Case | Best Case |
|---|---|---|---|
| Stooge Sort | O(N^2.709) | O(N^2.709) | O(N^2.709) |

Space Complexity of Stooge Sort:

| Sorting | Space Complexity |
|---|---|
| Stooge Sort | O(N) |

Inefficient but simple to understand, Stooge sort is a recursive sorting algorithm known for its poor time complexity of O(N^2.709). It is notably slower than even Bubble sort, one of the most basic sorting algorithms, which has a time complexity of O(N^2).