In this article, we will learn about the space and time complexity of the Merge Sort algorithm on a linked list using mathematical analysis of various cases.
Table of Contents :
- Merge Sort algorithm on Linked List
- Time Complexity Analysis of Merge Sort on Linked List
- Worst Case Time Complexity Analysis
- Average Case Time Complexity Analysis
- Best Case Time Complexity Analysis
- Space Complexity Analysis
- Comparison between Merge sort on array and linked list
- FAQs on Merge Sort algorithm
Merge Sort Algorithm on Linked List
Merge sort is the most preferred algorithm for sorting a linked list. It is based on the Divide and Conquer technique. The key idea of the Merge sort algorithm is to first divide the given data structure into two halves, sort the two halves individually and then merge them together.
Algorithm for sorting Linked list using Merge Sort
Step 1 : START
Step 2 : If the head is null or the linked list contains only one element then return
Step 3 : Now divide the given linked list into two halves i.e. left and right. Let left and right denote the left and right parts of the given linked list respectively.
Step 4 : Now sort the two halves individually
mergeSort(left)
mergeSort(right)
Step 5 : Now simply merge the two halves together and update the head pointer
Java implementation of Merge Sort on Linked List
// Java program implementing merge sort on a linked list

// Class for nodes of the linked list
class LinkedNode {
    int val;
    LinkedNode next;

    LinkedNode(int key)
    {
        this.val = key;
        next = null;
    }
}

public class Opengenus_linkedList_MergeSort {

    // Function to implement the merge sort algorithm
    public static LinkedNode mergeSort(LinkedNode head)
    {
        // A null or single-node list is already sorted
        if (head == null || head.next == null)
            return head;

        // Split the list into two halves around the middle node
        LinkedNode mid = findMid(head);
        LinkedNode head2 = mid.next;
        mid.next = null;

        // Sort each half recursively, then merge the sorted halves
        LinkedNode newHead1 = mergeSort(head);
        LinkedNode newHead2 = mergeSort(head2);
        LinkedNode finalHead = merge(newHead1, newHead2);

        return finalHead;
    }

    // Merge two sorted lists into one sorted list
    public static LinkedNode merge(LinkedNode head1, LinkedNode head2)
    {
        LinkedNode merged = new LinkedNode(-1); // dummy node to simplify merging
        LinkedNode temp = merged;

        // While both head1 and head2 are not null, pick the smaller node
        while (head1 != null && head2 != null) {
            if (head1.val < head2.val) {
                temp.next = head1;
                head1 = head1.next;
            }
            else {
                temp.next = head2;
                head2 = head2.next;
            }
            temp = temp.next;
        }

        // Append the remaining nodes of head1, if any
        while (head1 != null) {
            temp.next = head1;
            head1 = head1.next;
            temp = temp.next;
        }

        // Append the remaining nodes of head2, if any
        while (head2 != null) {
            temp.next = head2;
            head2 = head2.next;
            temp = temp.next;
        }

        return merged.next;
    }

    // Finding the middle node using the slow/fast (tortoise and hare) pointer approach
    public static LinkedNode findMid(LinkedNode head)
    {
        LinkedNode slow = head, fast = head.next;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
        }
        return slow;
    }

    // Function to print the list
    static void printList(LinkedNode head)
    {
        while (head != null) {
            System.out.print(head.val + " ");
            head = head.next;
        }
    }

    // Driver Code
    public static void main(String[] args)
    {
        LinkedNode head = new LinkedNode(7);
        LinkedNode temp = head;
        temp.next = new LinkedNode(10);
        temp = temp.next;
        temp.next = new LinkedNode(20);
        temp = temp.next;
        temp.next = new LinkedNode(5);
        temp = temp.next;
        temp.next = new LinkedNode(100);
        temp = temp.next;
        temp.next = new LinkedNode(90);
        temp = temp.next;

        // Applying merge sort on the linked list created above
        head = mergeSort(head);

        System.out.print("Sorted Linked List : ");
        printList(head);
    }
}
Output
Sorted Linked List : 5 7 10 20 90 100
Time Complexity Analysis of Merge Sort on Linked List
We know that the recurrence relation for the merge sort algorithm can be interpreted as :
T(n) = 2T(n/2) + theta(n)
Theoretically speaking, the time complexity for the three cases will be :
- Worst Case : theta(n * log n)
- Average Case : theta(n * log n)
- Best Case : theta(n * log n)
The time complexity of Merge sort in the worst, average and best case is theta(n * log n) because merge sort always divides the given list into two halves regardless of the present state of the list, and merging the two halves back together takes linear time.
Let us try solving the above recurrence relation using the Master theorem :
Comparing the given recurrence relation with T(n) = aT(n/b) + f(n), we get a = 2, b = 2 and f(n) = theta(n). Here c = log of a to base b = log of 2 to base 2 = 1, and f(n) = theta(n^c) = theta(n), so the second case of the Master theorem applies : T(n) = theta(n^c * log n) = theta(n^1 * log n) = theta(n * log n).
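As a quick, informal cross-check (not part of the original analysis), the short Java sketch below evaluates the recurrence T(n) = 2T(n/2) + n directly for powers of two, taking the theta(n) term to be exactly n, and compares it against n * log(2)n; the class name RecurrenceCheck is made up for illustration.

// Informal check (assumed helper, not from the article): evaluating the
// recurrence T(n) = 2*T(n/2) + n, with T(1) = 0, for powers of two and
// comparing it against n * log2(n).
public class RecurrenceCheck {
    static long T(long n) {
        if (n <= 1) return 0;          // base case T(1) = 0
        return 2 * T(n / 2) + n;       // T(n) = 2*T(n/2) + n
    }

    public static void main(String[] args) {
        for (long n = 2; n <= (1 << 20); n <<= 2) {
            double nLogN = n * (Math.log(n) / Math.log(2));
            System.out.printf("n = %8d   T(n) = %10d   n*log2(n) = %10.0f%n", n, T(n), nLogN);
        }
    }
}

For powers of two the two columns coincide exactly, which is the theta(n * log n) behaviour predicted by the Master theorem.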
Recursion tree for the given recurrence relation will be as follows :
                    T(n)
                 /        \
           T(n/2)          T(n/2)
           /    \          /    \
      T(n/4)  T(n/4)  T(n/4)  T(n/4)
        ...     ...     ...     ...
Let's take a closer look at where this time complexity comes from :
- First, the findMid function has a complexity of O(N) because we iterate over the nodes of the linked list using the hare and tortoise (slow and fast pointer) method.
- Second, the merge function has a complexity of O(N + M) because we simply iterate over and merge two lists of sizes N and M.
- Finally, the mergeSort function has a complexity of O(N log N) because at every level of recursion we merge lists with a total cost of O(N), and the depth of the recursion tree is log N since each call splits the list into two halves, where N is the length of the given linked list.
Hence the total time complexity is O(N log N).
Worst Case Time Complexity Analysis
The worst case of merge sort can occur in any of the following cases :
- Merge sort has to do the maximum number of comparisons.
- The left and right subparts in each merge operation have alternate elements.
- The two largest values in a merge step are contained in opposite lists.
- During every merge step, exactly one value remains in the opposite list.
Applying Merge Sort using Divide and Conquer on the input linked list [5,1,7,3,6,2,8,4] :

                      [5,1,7,3,6,2,8,4]
                      /               \
              [5,1,7,3]               [6,2,8,4]
              /       \               /       \
          [5,1]     [7,3]         [6,2]     [8,4]    <- every pair of 2 is compared at least once, therefore maximum comparisons here
            |         |             |         |
          [1,5]     [3,7]         [2,6]     [4,8]    <- maximum comparisons : every pair of sets is used in a comparison
              \       /               \       /
              [1,3,5,7]               [2,4,6,8]      <- maximum comparisons again : every pair of sets is compared
                      \               /
                      [1,2,3,4,5,6,7,8]
Number of total comparisons = n * log n - (n - 1)
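To make this count concrete, here is a small hypothetical sketch (an array-based merge sort, not the article's linked-list implementation) instrumented to count element comparisons; the class name ComparisonCounter and the method count are made up for illustration. On the worst-case input [5,1,7,3,6,2,8,4] it reports 17 comparisons, which equals n * log(2)n - (n - 1) = 24 - 7 for n = 8.

// Hypothetical sketch (not the article's code): array-based merge sort that
// counts how many element comparisons are performed while sorting.
import java.util.Arrays;

public class ComparisonCounter {
    private static long comparisons;

    // Sorts a copy of the input and returns the number of element comparisons made
    public static long count(int[] a) {
        comparisons = 0;
        mergeSort(Arrays.copyOf(a, a.length));
        return comparisons;
    }

    private static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    private static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            comparisons++;                                  // one comparison per pass
            out[k++] = (left[i] < right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }

    public static void main(String[] args) {
        int[] worst = {5, 1, 7, 3, 6, 2, 8, 4};
        System.out.println("Comparisons on worst case : " + count(worst)); // prints 17
    }
}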
The recurrence for the worst case of merge sort is :
T(n) = 2T(n/2) + n - 1 ---------- (1)
Substituting the value of T(n/2) in equation (1) :
T(n) = 2[2T(n/4) + n/2 - 1] + n - 1 = 4T(n/4) + 2n - 3 ---------- (2)
Similarly, we go on substituting T(n/2^k) in the subsequent equations. The general equation after k substitutions is :
T(n) = 2^k * T(n/2^k) + k*n - (2^k - 1) ---------- (3)
The recursion stops at sub-lists of a single node, so we take
T(1) = 0 and 2^k = n, i.e. k = log(2)n ---------- (4)
Substituting (4) into (3) eliminates 'k' :
T(n) = n * log(2)n - n + 1 ---------- (5)
For example, for n = 8 this gives 8 * 3 - 8 + 1 = 17 comparisons, which matches the count in the example above. Hence the worst case time complexity of merge sort is O(n * log n).
Average Case Time Complexity Analysis
For the average case we use the same recurrence relation that we derived above in the time complexity analysis section.
T(n) = 2T(n/2) + theta(n)
A linked list of size n can be divided into halves at most log n times, and sorting and merging the sub-lists at each level of division takes O(n) time in total, hence the average case time complexity = theta(n * log n).
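As an informal experiment (not from the article), the sketch below reuses the hypothetical ComparisonCounter class from the worst case section and averages its comparison count over many random shuffles; the averaged count stays between the best-case (n/2) * log(2)n and the worst-case n * log(2)n - (n - 1) values, i.e. on the order of n * log n.

// Hypothetical experiment: average the comparison count over random inputs
import java.util.Random;

public class AverageCaseCheck {
    public static void main(String[] args) {
        int n = 1024, trials = 1000;
        Random rng = new Random(42);
        long total = 0;
        for (int t = 0; t < trials; t++) {
            // Build the array 0..n-1 and shuffle it uniformly (Fisher-Yates)
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;
            for (int i = n - 1; i > 0; i--) {
                int j = rng.nextInt(i + 1);
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            }
            total += ComparisonCounter.count(a);
        }
        double avg = (double) total / trials;
        double nLogN = n * (Math.log(n) / Math.log(2));
        System.out.printf("average comparisons = %.0f, n*log2(n) = %.0f%n", avg, nLogN);
    }
}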
Best Case Time Complexity Analysis
Merge sort's best case occurs when the largest element of one sorted sub-list is smaller than the first element of its opposing sub-list, for every merge step that occurs (for example, when the input list is already sorted). In that situation only one element of the opposing list is ever compared against, which reduces the number of comparisons in each merge step to N/2.
Using the following equation :
T(n) = 2T(n/2) + n/2 ----------------------- (1.1)
T(n) = 2[2T(n/4) + n/4] + n/2 -------------- (1.2)
T(n) = 4[2T(n/8) + n/8] + n ---------------- (1.3)
T(n) = 8T(n/8) + 3*n / 2 ------------------- (1.4)
The general equation we get after k substitutions is :
T(n) = 2^k * T(n/2^k) + k*n / 2
Taking T(1) = 0 and 2^k = n (i.e. k = log(2)n) to eliminate 'k', we get : T(n) = (n/2) * log(2)n
Hence the best case time complexity of merge sort on a linked list is O(n * log n).
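As a quick sanity check under the same assumptions as the hypothetical ComparisonCounter sketch from the worst case section, calling ComparisonCounter.count(new int[]{1, 2, 3, 4, 5, 6, 7, 8}) on this already-sorted input returns 12 comparisons, which matches (n/2) * log(2)n = 4 * 3 for n = 8.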
Space Complexity Analysis
The Space Complexity of Merge Sort is O(log N) where N is the number of nodes in the linked list.
This is because the Merge sort algorithm is recursive : for a linked list it only needs the recursion stack, which grows to a depth of O(log N).
In the case of arrays, merging also requires an additional O(N) temporary array, hence for arrays the space complexity is O(N), while for linked lists it is O(log N) because the nodes are simply relinked in place.
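To illustrate where the O(log N) figure comes from, here is a rough hypothetical sketch (not taken from the article) that mimics the splitting pattern of mergeSort, where each recursive call handles half of the remaining nodes, and records the deepest call reached.

// Hypothetical sketch: measure the recursion depth of the splitting phase
public class RecursionDepthCheck {
    static int maxDepth = 0;

    // Mimics the splitting phase of mergeSort : each call handles half the nodes
    static void split(int length, int depth) {
        maxDepth = Math.max(maxDepth, depth);
        if (length <= 1) return;
        split(length / 2, depth + 1);           // "left" half
        split(length - length / 2, depth + 1);  // "right" half
    }

    public static void main(String[] args) {
        split(1024, 1);
        // For 1024 nodes the deepest call is 11 = log2(1024) + 1 stack frames
        System.out.println("Max recursion depth : " + maxDepth);
    }
}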
Comparison between Merge sort on array and linked list
In this section we take a look at how merge sort on a linked list differs from merge sort on an array.
We know that no other sorting algorithm performs better than merge sort on a linked list. The reasons are :
- Nodes of a linked list may not be present at adjacent memory locations, and merge sort handles such non-contiguous data well.
- We can implement the merge operation of merge sort on a linked list without any extra space, simply by relinking the nodes.
- Random access of nodes is not possible in a linked list.
- Merge sort accesses the data sequentially, hence the need for random access is quite low.
- Merge sort is highly preferred when stable sorting is required. This is essential for linked lists because nodes often carry auxiliary data along with the key.
Let's take a look at the disadvantages of Merge sort on arrays :
- Comparatively slower than other sorting algorithms, especially for smaller datasets.
- Merge sort goes through the entire process even if the array is already sorted.
- It requires O(n) extra space for temporary arrays.
FAQs on Merge Sort algorithm
1. Does Merge sort algorithm require extra space?
Ans : Yes, the Merge sort algorithm requires extra space : O(n) auxiliary space when sorting arrays and O(log n) recursion stack space when sorting linked lists.
2. Which technique does Merge sort algorithm use for sorting?
Ans : Merge sort algorithm uses Divide and Conquer technique for sorting.
3. What is the average time complexity of Merge sort on linked list?
Ans : The average time complexity of merge sort algorithm on linked list is theta(n*log n).
4. Is Merge sort preferred for arrays or for linked lists?
Ans : Merge sort is preferred more for linked lists than for arrays.
5. Does Merge Sort require recursion?
Ans : The standard top-down Merge sort is implemented recursively, although an iterative (bottom-up) version is also possible.
6. What is stable sorting?
Ans : Stable sorting is sorting in which the relative order of elements with the same value remains the same after the elements have been sorted.
Thank You!