Time and Space Complexity of Sorting Algorithms

Do you know how the efficiency of sorting algorithms is measured? The performance of these algorithms is primarily evaluated using two key metrics: time complexity and space complexity. Understanding these complexities is crucial for selecting the right sorting algorithm for a particular application, balancing between speed and resource usage. Let's understand more!

Sorting algorithms play a crucial role in solving complex problems. Real-world applications are bound by the physical memory and computational power of the systems they run on. This is where time and space complexity become important: we don't want a function that takes too much processing time or consumes excess memory.

What is Time Complexity?

Time complexity describes the amount of time an algorithm takes to run as a function of the length of its input. Here, "time" is measured not in seconds but in the number of elementary operations (arithmetic, memory references, control flow, etc.) the algorithm executes over the course of the program.

In general, time complexity can be expressed using three notations:

Big Oh Notation (O): It represents the worst-case time complexity of an algorithm. It gives an upper bound on the algorithm's running time, i.e., the maximum time the algorithm can take to run to completion.

Omega Notation (Ω): It represents the best-case time complexity of an algorithm. It gives a lower bound on the algorithm's running time, i.e., the minimum time the algorithm needs to run to completion.

Theta Notation (Θ): It gives a tight bound on an algorithm's running time, bounding it from above and below at once. It is commonly used to state the average-case time complexity of an algorithm.

What is Space Complexity?

Space complexity is defined as the amount of memory an algorithm needs to run to completion. It is usually reported as auxiliary space: the extra memory the program requires beyond the input itself. Keeping it low is essential for optimizing a solution.

Complexities of Sorting Algorithms

Let's discuss the time and space complexity of 6 commonly used sorting algorithms:

Bubble Sort

Bubble sort is the simplest sorting algorithm in which we compare the adjacent elements and swap them if they are not in the correct order. This algorithm is suitable only for small data sets. It is a stable sorting algorithm.
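
As a concrete reference, here is a minimal bubble sort sketch in Java (the class and method names are ours). The swapped flag implements the early exit behind the O(n) best case discussed below.

```java
import java.util.Arrays;

public class BubbleSortDemo {
    static void bubbleSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            // After pass i, the largest i+1 elements sit in their final places.
            for (int j = 0; j < arr.length - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j]; arr[j] = arr[j + 1]; arr[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;   // a full pass with no swaps: already sorted
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 1, 4, 2, 8};
        bubbleSort(arr);
        System.out.println(Arrays.toString(arr)); // [1, 2, 4, 5, 8]
    }
}
```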

Time Complexity

Best Case: O(n)

When the array is already sorted, no swaps are performed in the first pass. Once a full pass completes without a swap, we know the array is sorted and can stop early (the swapped flag in the sketch above does exactly this). Hence, the best-case time complexity is linear, i.e., O(n).

Average Case: O(n²)

For a random array, the expected number of swaps is n(n-1)/4, which is still quadratic, so the average-case complexity is O(n²).

Worst Case: O(n²)

When the array is sorted in reverse order, (n-1) swaps are performed in the first pass, (n-2) in the second, and so on. The total number of swaps is n(n-1)/2, so the overall complexity is O(n²).

Space Complexity

No extra memory is used by the algorithm apart from the original array, so the space complexity is O(1).

Selection Sort

Selection sort uses a brute-force approach to sorting. The algorithm divides the array into two subarrays: sorted and unsorted. The sorted subarray is initially empty; on each pass we scan the unsorted subarray for its smallest element and move it to the end of the sorted subarray, as the sketch below shows.
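
A minimal Java sketch of selection sort, assuming the in-place swap variant described above (class and method names are ours):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    static void selectionSort(int[] arr) {
        // Positions 0..i-1 form the sorted subarray.
        for (int i = 0; i < arr.length - 1; i++) {
            int minIndex = i;
            // Find the smallest element of the unsorted remainder.
            for (int j = i + 1; j < arr.length; j++)
                if (arr[j] < arr[minIndex]) minIndex = j;
            // Swap it into position i.
            int tmp = arr[i]; arr[i] = arr[minIndex]; arr[minIndex] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] arr = {64, 25, 12, 22, 11};
        selectionSort(arr);
        System.out.println(Arrays.toString(arr)); // [11, 12, 22, 25, 64]
    }
}
```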

Time Complexity

● In the first iteration, the algorithm performs (n-1) comparisons over the unsorted subarray, and after every iteration the size of that subarray shrinks by one. The total number of comparisons is therefore (n-1) + (n-2) + (n-3) + … + 1 = n(n-1)/2, resulting in quadratic time complexity.

● The algorithm performs the same number of operations regardless of the given array, so the best, average, and worst-case complexities are the same.

Best Case = Average Case = Worst Case = O(n²)

Space Complexity

No extra memory is used apart from the original array, so the space complexity is O(1).

Insertion Sort

Like selection sort, insertion sort divides the array into two subarrays: sorted and unsorted. Here, however, the sorted subarray initially contains the first element. The algorithm iterates over the unsorted subarray and inserts each element at its correct position in the sorted part, as the sketch below shows.
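
A minimal Java sketch of insertion sort (class and method names are ours); the inner while loop does the shifting that dominates the running time:

```java
import java.util.Arrays;

public class InsertionSortDemo {
    static void insertionSort(int[] arr) {
        // arr[0..i-1] is kept sorted; arr[i] is the next element to insert.
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {   // shift larger elements right
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;                  // drop the element into place
        }
    }

    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6};
        insertionSort(arr);
        System.out.println(Arrays.toString(arr)); // [5, 6, 11, 12, 13]
    }
}
```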

Time Complexity

Best Case: O(n)

When the array is already sorted, each element picked from the unsorted subarray needs only a single comparison before being appended to the end of the sorted subarray. With roughly one comparison per element, the best-case complexity is O(n).

Average Case: O(n²)

For a random array, each element is shifted past about half of the sorted subarray on average, giving roughly n²/4 comparisons, so the average-case time complexity is O(n²).

Worst Case: O(n²)

When the array is sorted in reverse order, each element picked from the unsorted subarray must be compared against every element of the sorted subarray before landing at its beginning. The total number of comparisons is n(n-1)/2, so the worst-case time complexity is O(n²).

Space Complexity

Since no extra memory is used apart from the original array, the space complexity is O(1).

Merge Sort

Merge sort is one of the most efficient sorting algorithms and follows the divide-and-conquer strategy. The technique divides the original array into n subarrays of size one, each of which is trivially sorted. These subarrays are then repeatedly merged to form the sorted array, as the sketch below shows.
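
A minimal top-down merge sort sketch in Java (class and method names are ours); the merged temp array is the O(n) auxiliary space discussed below:

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Sorts arr[lo..hi); hi is exclusive.
    static void mergeSort(int[] arr, int lo, int hi) {
        if (hi - lo <= 1) return;              // 0 or 1 elements: trivially sorted
        int mid = lo + (hi - lo) / 2;
        mergeSort(arr, lo, mid);               // sort left half
        mergeSort(arr, mid, hi);               // sort right half
        int[] merged = new int[hi - lo];       // the O(n) auxiliary space
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi)              // merge the two sorted halves
            merged[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++]; // <= keeps equal keys stable
        while (i < mid) merged[k++] = arr[i++];
        while (j < hi) merged[k++] = arr[j++];
        System.arraycopy(merged, 0, arr, lo, merged.length);
    }

    public static void main(String[] args) {
        int[] arr = {38, 27, 43, 3, 9, 82, 10};
        mergeSort(arr, 0, arr.length);
        System.out.println(Arrays.toString(arr)); // [3, 9, 10, 27, 38, 43, 82]
    }
}
```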

Time Complexity

● The recursive halving produces O(log n) levels of subarrays, and merging all the subarrays on one level takes linear time. Thus, the overall time complexity is O(n) work per level across O(log n) levels, i.e., O(n log n).

● The algorithm performs the same number of operations regardless of the given array, so the best, average, and worst-case complexities are the same.

Best Case = Average Case = Worst Case = O(n log n)

Space Complexity

Since an extra array of size at most ‘n’ is used to store the merged array, the space complexity is O(n).

Quick Sort

Quick sort is a somewhat more intricate sorting technique that follows a divide-and-conquer strategy similar to merge sort. It chooses an element as a pivot and partitions the array around it: all elements smaller than the pivot move to its left, and all greater elements to its right. The two subarrays are then sorted recursively until the entire array is sorted, as the sketch below shows.
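
A minimal quick sort sketch in Java using Lomuto partitioning with the last element as the pivot (the pivot choice here is our assumption for illustration; library implementations pick pivots more carefully, and a fixed pivot like this is exactly what triggers the worst case on sorted input):

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Sorts arr[lo..hi]; hi is inclusive.
    static void quickSort(int[] arr, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = arr[hi], p = lo;
        for (int i = lo; i < hi; i++)
            if (arr[i] < pivot) swap(arr, i, p++);   // grow the "smaller" region
        swap(arr, p, hi);                            // pivot lands in its final spot
        quickSort(arr, lo, p - 1);                   // sort elements left of the pivot
        quickSort(arr, p + 1, hi);                   // sort elements right of the pivot
    }

    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
    }

    public static void main(String[] args) {
        int[] arr = {10, 7, 8, 9, 1, 5};
        quickSort(arr, 0, arr.length - 1);
        System.out.println(Arrays.toString(arr)); // [1, 5, 7, 8, 9, 10]
    }
}
```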

Time Complexity

Best Case: O(n log n)

If we pick the median element as the pivot every time, the array splits evenly, producing O(log n) levels of recursion with O(n) partitioning work per level. In this case, the overall best-case time complexity is O(n log n).

Average Case: O(n log n)

For a random array (or a randomly chosen pivot), the expected time complexity is O(n log n).

Worst Case: O(n²)

When the array is already sorted and we always select the leftmost element as the pivot, every partition splits off just one element, creating n levels of recursion. Each level requires linear time for partitioning, so the overall worst-case complexity is O(n²).

Space Complexity

Quick sort needs extra memory for the stack frames of its recursive calls. With balanced partitions, the recursion is O(log n) deep, so the space complexity is O(log n) on average; in the worst case, the recursion depth reaches n, giving O(n).

Heap Sort

The heap sort algorithm converts the original array into either a min-heap or a max-heap. It then repeatedly extracts the minimum or maximum element from the heap and places it in its final position, as the sketch below shows.
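
A minimal in-place heap sort sketch in Java (class and method names are ours); siftDown restores the max-heap property after each extraction:

```java
import java.util.Arrays;

public class HeapSortDemo {
    static void heapSort(int[] arr) {
        int n = arr.length;
        for (int i = n / 2 - 1; i >= 0; i--)   // heapify: build the max-heap
            siftDown(arr, i, n);
        for (int end = n - 1; end > 0; end--) {
            swap(arr, 0, end);                 // move current maximum to the end
            siftDown(arr, 0, end);             // restore the heap over the rest
        }
    }

    static void siftDown(int[] arr, int root, int size) {
        while (2 * root + 1 < size) {
            int child = 2 * root + 1;                                      // left child
            if (child + 1 < size && arr[child + 1] > arr[child]) child++;  // pick larger child
            if (arr[root] >= arr[child]) return;                           // heap property holds
            swap(arr, root, child);
            root = child;
        }
    }

    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
    }

    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6, 7};
        heapSort(arr);
        System.out.println(Arrays.toString(arr)); // [5, 6, 7, 11, 12, 13]
    }
}
```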

Time Complexity

● In general, extracting the minimum or maximum element from a heap and restoring the heap property takes O(log n), and we extract n elements in total, so the overall time complexity is O(log n) + O(log(n-1)) + … + O(1), i.e., O(n log n).

● This algorithm takes the same time regardless of the given array, so the best, average, and worst-case complexities are the same.

Best Case = Average Case = Worst Case = O(n log n)

Space Complexity

Heap sort doesn’t use extra memory; it simply converts the input array into a heap. Hence, the space complexity is O(1).

Stability of Sorting Algorithms

An algorithm is said to be stable if the original order of elements with equal keys is preserved after sorting. For example, if two elements of an array, arr[i] and arr[j], have the same key and i < j, then after sorting, arr[i] should still come before arr[j]. Bubble sort, insertion sort, and merge sort are stable sorting algorithms, whereas quick sort and heap sort, as typically implemented, are unstable.
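
A small Java check of stability (the Item record and sample data are ours for illustration; requires Java 16+ for records). Arrays.sort on object arrays uses a stable merge sort variant (TimSort), so equal keys keep their original relative order:

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    record Item(int key, String label) {}

    public static void main(String[] args) {
        Item[] items = {
            new Item(2, "a"), new Item(1, "b"), new Item(2, "c"), new Item(1, "d")
        };
        // Sort by key only; labels record the original order of equal keys.
        Arrays.sort(items, Comparator.comparingInt(Item::key));
        System.out.println(Arrays.toString(items));
        // [Item[key=1, label=b], Item[key=1, label=d],
        //  Item[key=2, label=a], Item[key=2, label=c]]
        // "b" stays before "d", and "a" before "c": the sort is stable.
    }
}
```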


Cheat Sheet

The cheat sheet below summarizes the stability, time complexity, and space complexity of the sorting algorithms discussed above.

| Algorithm      | Best Case  | Average Case | Worst Case | Space Complexity | Stable |
|----------------|------------|--------------|------------|------------------|--------|
| Bubble Sort    | O(n)       | O(n²)        | O(n²)      | O(1)             | Yes    |
| Insertion Sort | O(n)       | O(n²)        | O(n²)      | O(1)             | Yes    |
| Selection Sort | O(n²)      | O(n²)        | O(n²)      | O(1)             | No     |
| Quick Sort     | O(n log n) | O(n log n)   | O(n²)      | O(log n)         | No     |
| Merge Sort     | O(n log n) | O(n log n)   | O(n log n) | O(n)             | Yes    |
| Heap Sort      | O(n log n) | O(n log n)   | O(n log n) | O(1)             | No     |
