Time and Space Complexity of Sorting Algorithms
Do you know how the efficiency of sorting algorithms is measured? The performance of these algorithms is primarily evaluated using two key metrics: time complexity and space complexity. Understanding these complexities is crucial for selecting the right sorting algorithm for a particular application, balancing speed against resource usage. Let's learn more!
Sorting algorithms play a crucial role in solving complex problems. Real-world applications are bound by the physical memory and computational power of the systems they run on. Time and space complexity matter here because we don't want a function that takes too much processing time or consumes excess memory.
Table of Contents
- What is Time Complexity?
- What is Space Complexity?
- Complexities of Sorting Algorithms
- Stability of Sorting Algorithms
- Cheat Sheet
What is Time Complexity?
Time complexity is described as the amount of time an algorithm takes to run as a function of the length of its input. It is measured by counting the elementary operations (arithmetic, memory references, control flow, etc.) an algorithm executes throughout the program, since each such operation takes roughly constant time.
In general, time complexity can be expressed using three notations:
Big Oh Notation (O): It represents the worst-case time complexity of an algorithm. It gives the upper bound of an algorithm’s running time. Moreover, this notation calculates the maximum time an algorithm takes to execute completely.
Omega Notation (Ω): It represents the best-case time complexity of an algorithm. It gives the lower bound of an algorithm’s running time. Moreover, this notation calculates the minimum time an algorithm takes to execute completely.
Theta Notation (Θ): It represents a tight bound on an algorithm's running time, bounding it from both above and below. It is commonly used to express the average-case time an algorithm takes to execute completely.
What is Space Complexity?
Space complexity is defined as the amount of memory an algorithm takes to execute completely. It is also described as the total extra space required by the program to run. It is essential for solution optimization.
Complexities of Sorting Algorithms
Let's discuss the time and space complexity of 6 commonly used sorting algorithms:
Bubble Sort
Bubble sort is the simplest sorting algorithm: we compare adjacent elements and swap them if they are not in the correct order. This algorithm is suitable only for small data sets. It is a stable sorting algorithm.
Time Complexity
● Best Case: O(n)
When the array is already sorted, no swaps are performed in the first pass. Detecting that no swaps were needed lets us stop after that single pass. Hence, the best-case time complexity is linear, i.e., O(n).
● Average Case: O(n²)
For a random array, the average number of swaps is n²/4, thus the average-case complexity is O(n²).
● Worst Case: O(n²)
When the array is sorted in reverse order, (n-1) swaps are performed in the first pass, (n-2) in the second, and so on. The total number of swaps is n*(n-1)/2, and the overall complexity is O(n²).
Space Complexity
No extra memory is used by the algorithm apart from the original array, so the space complexity is O(1).
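To make this concrete, here is a minimal Python sketch of bubble sort with the early-exit check described above; the function name bubble_sort and the in-place design are illustrative choices, not taken from the article.

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After i passes, the last i elements are already in their final place.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break  # No swaps in a full pass: already sorted, O(n) best case.
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```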
Selection Sort
Selection sort uses a brute force approach for sorting arrays. This algorithm divides an array into two subarrays: sorted and unsorted. The sorted subarray is initially empty; we iterate over the unsorted subarray to find the least element and insert it at the end of the sorted subarray.
Time Complexity
● For the first iteration, the algorithm performs (n-1) comparisons in an unsorted subarray, and after every iteration, the size of the subarray reduces by one. Thus, the sum of all comparisons (n-1) + (n-2) + (n-3) + …+1 is n*(n-1)/2, resulting in quadratic time complexity.
● The algorithm performs the same number of operations regardless of the given array, so the best, average, and worst-case complexities are the same.
Best Case = Average Case = Worst Case = O(n²)
Space Complexity
No extra memory is used apart from the original array, so the space complexity is O(1).
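A minimal Python sketch of selection sort follows; the names and structure are illustrative. Note that the scan for the smallest element runs in full on every pass regardless of the input, which is why all three cases are O(n²).

```python
def selection_sort(arr):
    """Sort arr in place by growing a sorted prefix one element at a time."""
    n = len(arr)
    for i in range(n - 1):
        # Scan the unsorted suffix arr[i:] for the index of its smallest element.
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Move it to the end of the sorted prefix.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```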
Insertion Sort
Similar to selection sort, insertion sort divides an array into two subarrays: sorted and unsorted. However, in this case, the sorted subarray initially contains the first element. The algorithm iterates over the unsorted subarray and inserts each element at its correct position in the sorted part.
Time Complexity
● Best Case: O(n)
When the array is already sorted, each element picked from the unsorted subarray is compared just once and placed at the end of the sorted subarray. So, the best-case complexity is O(n).
● Average Case: O(n²)
For a random array, the average-case time complexity is O(n²).
● Worst Case: O(n²)
When the array is sorted in reverse order, each element picked from the unsorted subarray must be moved all the way to the beginning of the sorted subarray, which requires comparing it with every element already sorted. The total number of comparisons is n*(n-1)/2, so the worst-case time complexity is O(n²).
Space Complexity
Since no extra memory is used apart from the original array, the space complexity is O(1).
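Here is a minimal Python sketch of insertion sort, with names chosen for illustration. On sorted input the inner while loop never runs, which is where the O(n) best case comes from.

```python
def insertion_sort(arr):
    """Sort arr in place by inserting each element into the sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one slot to the right to make room for key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]
```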
Merge Sort
Merge sort is one of the most efficient sorting algorithms and follows the divide-and-conquer strategy. This technique divides the original array into n subarrays of size one, each of which is trivially sorted. These subarrays are then repeatedly merged to form the sorted array.
Time Complexity
● The algorithm recursively halves the array, producing O(log n) levels of division. At each level, merging the subarrays back together in sorted order takes linear time. Thus, the overall time complexity is O(n log n).
● The algorithm performs the same number of operations regardless of the given array, so the best, average, and worst-case complexities are the same.
Best Case = Average Case = Worst Case = O(n log n)
Space Complexity
Since an extra array of size at most ‘n’ is used to store the merged array, the space complexity is O(n).
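Below is a minimal Python sketch of merge sort that returns a new list, matching the O(n) auxiliary space discussed above; the names are illustrative.

```python
def merge_sort(arr):
    """Return a new sorted list built by divide and conquer."""
    if len(arr) <= 1:
        return arr  # A subarray of size one is trivially sorted.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal keys in order, i.e., stability.
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```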
Quick Sort
Quick sort is a more involved sorting technique that follows a divide-and-conquer strategy similar to merge sort. It chooses an element as a pivot and partitions the array around it: all elements less than the pivot are moved to its left, and all greater elements to its right. The subarrays on either side of the pivot are then sorted recursively until the entire array is sorted.
Time Complexity
● Best Case: O(n log n)
If we pick the median element as the pivot every time, the array splits evenly and the recursion goes O(log n) levels deep, with linear partitioning work at each level. The overall best-case time complexity is therefore O(n log n).
● Average Case: O(n log n)
For a random array, the average-case time complexity is O(n log n).
● Worst Case: O(n²)
When the array is already sorted and we select the leftmost element as the pivot, each partition peels off only a single element, giving n levels of recursion that each require linear time for partitioning. Hence, the overall worst-case complexity is O(n²).
Space Complexity
Quick sort needs extra memory for the stack frames of its recursive calls. With balanced partitions the recursion depth is O(log n), so the space complexity is O(log n); in the worst case, the recursion depth, and hence the space usage, grows to O(n).
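A minimal Python sketch of quick sort using the Lomuto partition scheme is shown below; the scheme and the last-element pivot are illustrative choices (the text's leftmost-pivot variant degrades the same way on sorted input).

```python
def quick_sort(arr, lo=0, hi=None):
    """Sort arr[lo..hi] in place by partitioning around a pivot."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        pivot = arr[hi]  # Last element as pivot: O(n²) on already-sorted input.
        i = lo
        # Move every element smaller than the pivot to the left side.
        for j in range(lo, hi):
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]  # Place the pivot in its final spot.
        quick_sort(arr, lo, i - 1)   # Recurse on elements left of the pivot.
        quick_sort(arr, i + 1, hi)   # Recurse on elements right of the pivot.
    return arr

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```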
Heap Sort
The heap sort algorithm converts the original array into either a min-heap or a max-heap. It then repeatedly extracts the minimum or maximum element from the heap and places it in its final sorted position.
Time Complexity
● In general, extracting the minimum or maximum element from the heap and restoring the heap property takes O(log n), and we need to extract n elements, so the overall time complexity is O(log n) + O(log (n-1)) + … + O(1), i.e., O(n log n).
● This algorithm takes the same time regardless of the given array, so the best, average, and worst-case complexities are the same.
Best Case = Average Case = Worst Case = O(n log n)
Space Complexity
Heap sort doesn't use extra memory; it converts the input array into a heap in place. Hence, the space complexity is O(1).
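Here is a minimal in-place Python sketch of heap sort using a max-heap, which is consistent with the O(1) space claim above; sift_down is an illustrative helper name.

```python
def heap_sort(arr):
    """Sort arr in place: build a max-heap, then repeatedly extract the max."""
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until the max-heap property holds within arr[:end].
        while True:
            child = 2 * root + 1
            if child >= end:
                break
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1  # Pick the larger of the two children.
            if arr[root] >= arr[child]:
                break
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    # Build the max-heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Swap the maximum to the end, then restore the heap on the shrunk prefix.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr

print(heap_sort([12, 11, 13, 5, 6, 7]))  # [5, 6, 7, 11, 12, 13]
```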
Stability of Sorting Algorithms
An algorithm is said to be stable if the original order of elements having the same key is preserved even after sorting. For example, if two elements of an array, arr[i] and arr[j], have the same key and i < j, then after sorting, arr[i] should still appear before arr[j]. Algorithms like bubble sort, merge sort, and insertion sort are stable sorting algorithms, whereas selection sort, quick sort, and heap sort (as commonly implemented) are unstable.
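To see stability in action, the short Python snippet below sorts records by their first field; Python's built-in sorted() uses Timsort, a stable merge-sort variant, so ties keep their original order. The sample data is made up for illustration.

```python
# Two 'a' records and two 'b' records share a sort key (the first field).
records = [("b", 1), ("a", 2), ("b", 3), ("a", 4)]

# A stable sort preserves the input order among equal keys.
print(sorted(records, key=lambda r: r[0]))
# [('a', 2), ('a', 4), ('b', 1), ('b', 3)]
```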
Cheat Sheet
The cheat sheet below summarizes the stability, time complexity, and space complexity of the sorting algorithms discussed above.
| Algorithm | Best Case (Time) | Average Case (Time) | Worst Case (Time) | Space Complexity | Stable |
| --- | --- | --- | --- | --- | --- |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |