Fastest Sorting Algorithm in C++
This article explains which sorting algorithm performs best under which conditions. The conditions include the type of data structure, the size of the data being sorted, the arrangement of the data, and the range of the data elements.
Let’s first explain what a sorting algorithm is; then we will discuss how various algorithms perform on different data structures.
Fastest Sorting Algorithm in C++
A sorting algorithm is a method of arranging the elements stored in a data structure.
The suitability of any sorting algorithm depends on the input data size, type of data structure, arrangement of data, time & space complexities, and range of the data.
Some sorting algorithms perform better on array data structures, while some perform better on heaps. In addition, some algorithms are very fast when the largest integer key of the records is much smaller than the number of records (e.g., counting sort).
So, which one is the fastest algorithm? The answer seems simple: look at the complexity table and choose the one with the lowest time complexity (i.e., the slowest asymptotic growth).
However, in reality, we cannot say that algorithm A will perform better than algorithm B (or vice versa) on given data without examining the underlying data’s structural properties.
Therefore, we will approach the answer by discussing sorting algorithms and their particular suitability in different scenarios for a given underlying data structure.
Data Structure - Linked List
A linked list is a linear data structure that stores data in nodes. A node is the fundamental building block of a linked list; it contains data and a pointer to the next node.
The best sorting algorithm for a linked list is merge sort. Merge sort performs better on linked lists than on arrays since there is no need for an auxiliary array to store the output of the merge operation.
Unlike arrays, linked list nodes need not be contiguous in memory; the nodes can be scattered across different memory locations.
Also, in contrast to an array, given a pointer to the insertion point, we can splice a value into the middle of a linked list in O(1) time and O(1) additional space.
As a result, no asymptotically growing auxiliary space is required to implement the merge operation of merge sort while sorting a linked list. Therefore, merge sort sorts a linked list in O(n log n) time.
You can find the implementation of the merge sort here.
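For reference, here is a minimal sketch of merge sort on a singly linked list (the Node struct and the function names are illustrative, not part of any standard library). Note how the merge step only re-links existing nodes, so no auxiliary array is needed:

```cpp
#include <iostream>

// A minimal singly linked list node (illustrative).
struct Node {
    int data;
    Node* next;
};

// Split the list into two halves using the slow/fast pointer technique.
Node* splitMiddle(Node* head) {
    Node* slow = head;
    Node* fast = head->next;
    while (fast != nullptr && fast->next != nullptr) {
        slow = slow->next;
        fast = fast->next->next;
    }
    Node* second = slow->next;
    slow->next = nullptr;   // terminate the first half
    return second;
}

// Merge two sorted lists by re-linking nodes; no auxiliary array is required.
Node* mergeLists(Node* a, Node* b) {
    Node dummy{0, nullptr};
    Node* tail = &dummy;
    while (a != nullptr && b != nullptr) {
        if (a->data <= b->data) { tail->next = a; a = a->next; }
        else                    { tail->next = b; b = b->next; }
        tail = tail->next;
    }
    tail->next = (a != nullptr) ? a : b;
    return dummy.next;
}

// Recursive merge sort: O(n log n) time, no auxiliary array.
Node* mergeSort(Node* head) {
    if (head == nullptr || head->next == nullptr) return head;
    Node* second = splitMiddle(head);
    return mergeLists(mergeSort(head), mergeSort(second));
}

int main() {
    // Build the list 4 -> 1 -> 3 -> 2 and sort it.
    Node d{2, nullptr}, c{3, &d}, b{1, &c}, a{4, &b};
    for (Node* p = mergeSort(&a); p != nullptr; p = p->next)
        std::cout << p->data << ' ';   // 1 2 3 4
}
```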
Data Structure - Array
An array is a linear data structure that stores data in consecutive memory locations. All elements of an array must be of the same type.
Here is a list of the various sorting algorithms and their appropriateness.
- Insertion sort could be chosen when the array is nearly sorted because it rarely moves items when adding a new item to the sorted region of the array. The time complexity of insertion sort on an already sorted array is O(n), but the time complexity of quick sort on the same array is O(n^2). You can find more information here.
- Since quick sort sorts in place, it is well suited to arrays; the algorithm does not require additional space during the sorting procedure.
- In the case of merge sort, additional space must be acquired and released, which increases the execution time of the algorithm. On average, both merge sort and quick sort have O(n log n) time complexity, but merge sort takes extra O(n) space, where n is the size of the array to be sorted. You can find more about it here.
- When we want to sort data (such as integers, words, and fixed-length strings) stored in an array in lexicographic order, we use radix sort. Radix sort also performs well on parallel machines.
- Counting sort is used when the largest integer key of the records is much smaller than the number of records (see the sketch after this list).
- Bucket sort is most efficient when the data stored in an array is distributed fairly uniformly within a range.
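To illustrate the counting sort case, here is a minimal sketch in C++ (the function name and the assumption of non-negative integer keys are illustrative choices, not taken from the article):

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// A minimal counting sort sketch: assumes non-negative integer keys whose
// maximum value k is small compared to the number of records n.
std::vector<int> countingSort(const std::vector<int>& input) {
    if (input.empty()) return {};
    int k = *std::max_element(input.begin(), input.end());

    std::vector<int> count(k + 1, 0);
    for (int key : input) ++count[key];           // count occurrences of each key

    // Prefix sums turn counts into final positions (this keeps the sort stable).
    for (int i = 1; i <= k; ++i) count[i] += count[i - 1];

    std::vector<int> output(input.size());
    for (int i = static_cast<int>(input.size()) - 1; i >= 0; --i)
        output[--count[input[i]]] = input[i];
    return output;                                // O(n + k) time, O(n + k) space
}

int main() {
    std::vector<int> data{4, 1, 3, 4, 0, 2, 1};
    for (int x : countingSort(data)) std::cout << x << ' ';   // 0 1 1 2 3 4 4
}
```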
Data Structure - Tree
The tree is a non-linear data structure that stores data in nodes. The topmost node is called the root node. The root node is further attached to child nodes.
There are many types of trees, but here we only discuss the Binary Search Tree (BST).
Tree sort is considered the best sorting algorithm for a BST; an in-order traversal of a BST gives us the elements in sorted order. Similarly, heap sort is best for sorting the elements stored in a heap.
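As an illustration, here is a minimal tree sort sketch over a BST in C++ (the TreeNode struct and the function names are assumed for this example): build the tree by repeated insertion, then read the keys back with an in-order traversal:

```cpp
#include <iostream>
#include <vector>

// A minimal BST node (illustrative, not from any library).
struct TreeNode {
    int key;
    TreeNode* left;
    TreeNode* right;
};

// Insert a key into the BST; tree sort builds the tree by repeated insertion.
TreeNode* insert(TreeNode* root, int key) {
    if (root == nullptr) return new TreeNode{key, nullptr, nullptr};
    if (key < root->key) root->left = insert(root->left, key);
    else                 root->right = insert(root->right, key);
    return root;
}

// An in-order traversal visits the keys in sorted order.
void inorder(const TreeNode* root, std::vector<int>& out) {
    if (root == nullptr) return;
    inorder(root->left, out);
    out.push_back(root->key);
    inorder(root->right, out);
}

int main() {
    TreeNode* root = nullptr;
    for (int key : {5, 2, 8, 1, 3}) root = insert(root, key);

    std::vector<int> sorted;
    inorder(root, sorted);
    for (int key : sorted) std::cout << key << ' ';   // 1 2 3 5 8
}
```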
Remember, we can also analyze all the sorting algorithms on other types of data structures, such as hash tables, red-black trees, and many more.
The following section presents a cheat sheet depicting various sorting algorithms’ complexities and stability.
Algorithm Complexity Cheat Sheet
Before looking into the table, let’s discuss common terminologies associated with sorting algorithms.
Stable Sort
An algorithm is stable if, in the case of a tie between keys, it preserves the original relative order of the records. For example, assume the sequence S
has the following pairs:
S = <(1,"Alex"), (3,"Bill"),(2,"Ananth"), (1, "Jack")>
Now, assume we want to sort the above sequence by integral keys. Then, a stable sorting algorithm will sort the above sequence as:
Stably Sorted S = <(1,"Alex"),(1, "Jack"),(2,"Ananth"),(3,"Bill")>
See, the stable sort preserved the original order of the pairs (1, "Alex") and (1, "Jack"). However, an unstable sort does not guarantee that.
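In C++, std::stable_sort from <algorithm> provides exactly this guarantee. A small sketch using the sequence above (assuming a C++17 compiler):

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

int main() {
    // The sequence S from the example above.
    std::vector<std::pair<int, std::string>> s = {
        {1, "Alex"}, {3, "Bill"}, {2, "Ananth"}, {1, "Jack"}};

    // std::stable_sort keeps records with equal keys in their original
    // relative order, so (1, "Alex") stays before (1, "Jack").
    std::stable_sort(s.begin(), s.end(),
                     [](const auto& a, const auto& b) { return a.first < b.first; });

    for (const auto& [key, name] : s)
        std::cout << '(' << key << ", " << name << ") ";
    // Output: (1, Alex) (1, Jack) (2, Ananth) (3, Bill)
}
```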
In-Place Sort
A sort is in-place if it takes a constant amount of auxiliary memory, meaning the additional memory requirement does not grow with the size of the problem.
For example, all the sorts in the cheat sheet below with O(1) space complexity are in-place.
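For instance, insertion sort (mentioned earlier for nearly sorted arrays) is in-place. A minimal sketch, with the function name insertionSort chosen for illustration:

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// A minimal in-place insertion sort sketch: only a few local variables are
// used, so the auxiliary space stays O(1) regardless of the input size.
void insertionSort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        // Shift larger elements one slot to the right within the same array.
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];
            --j;
        }
        a[j] = key;
    }
}

int main() {
    std::vector<int> data{5, 2, 9, 1, 5, 6};
    insertionSort(data);
    for (int x : data) std::cout << x << ' ';   // 1 2 5 5 6 9
}
```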
After having a sufficient background on the basic sort terminologies, let’s look at the complexity table for the sorting algorithms. This table can aid our decision to choose the most suitable sorting algorithm in a particular problem context.
Name | Time Complexity (Best) | Time Complexity (Average) | Time Complexity (Worst) | Space Complexity | Stability |
---|---|---|---|---|---|
Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) | Yes |
Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) | O(1) | No |
Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) | Yes |
Merge Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) | Yes |
Quick Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(log(n)) | No |
Heap Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) | No |
Counting Sort | Ω(n + k) | Θ(n + k) | O(n + k) | O(k) | Yes |
Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n + k) | Yes |