Sorting algorithms are a fundamental concept in computer science used to arrange elements of an array in a particular order. Sorting data is crucial in various applications, from search engines and data mining to scientific analysis and databases.
When choosing a sorting algorithm, there are some questions to ask: How big is the data being sorted? How much memory is available? Will the data grow? The answers to these questions may determine which sorting algorithm best fits each situation, so before choosing one, it is crucial to identify your requirements and consider any system constraints. In this guide, you will learn what a sorting algorithm is, the different sorting algorithms, and how to use them. Let’s dive right in!
What Is a Sorting Algorithm?
Sorting refers to arranging data in a specific format. Sorting algorithms are a set of instructions used to arrange elements of an array or list in a particular order. Sorts are commonly in lexicographical or numerical form and can be in ascending or descending order.
For example, sorting the list [5, 1, 4, 2] in ascending order produces [1, 2, 4, 5].
Various sorting algorithms can perform this operation, and which one to use depends on the requirements at hand.
Common Sorting Algorithms
- Selection sort
- Bubble sort
- Insertion sort
- Merge sort
- Quick sort
- Heap sort
- Counting sort
- Bucket sort
- Radix sort
- Shell sort
However, before we get into each of the sorting algorithms, let us look at their complexity, including both time and space complexity.
The Complexity of Sorting Algorithms
The efficiency of any sorting algorithm is usually determined by the space and time complexity of the algorithm.
Time Complexity
Time complexity refers to the time an algorithm takes to complete its execution relative to the input size. It can be represented in three different forms:
- Big-O notation (O)
- Omega notation (Ω)
- Theta notation (θ)
Space Complexity
This is the total amount of memory an algorithm requires for complete execution. It includes both the input and the auxiliary memory.
Auxiliary memory is the extra space occupied by the algorithm apart from the input data. Typically, auxiliary memory is taken into account when determining an algorithm’s space complexity.
Here is a complexity analysis of the different sorting algorithms.
Sorting Algorithm | Time Complexity – Best Case | Time Complexity – Average Case | Time Complexity – Worst Case | Space Complexity |
---|---|---|---|---|
Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) |
QuickSort | O(n log n) | O(n log n) | O(n²) | O(log n) |
Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(max) |
Radix Sort | O(n+k) | O(n+k) | O(n+k) | O(max) |
Bucket Sort | O(n+k) | O(n) | O(n²) | O(n+k) |
Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) |
Shell Sort | O(n log n) | O(n log n) | O(n²) | O(1) |
Here, n is the number of elements, k is the range of input values, and max is the largest value in the input.
The Stability of Sorting Algorithms
When two or more items with the same value maintain their relative positions after sorting, the sorting algorithm is said to be stable.
For example, consider a list that contains two items with the same value, 4. An unstable sorting algorithm allows two possible outputs, because the relative order of the two 4s may not be maintained. After a stable sort, however, there is only one possible outcome: the two 4s keep their original relative order.
Below is a table summarizing the stability of different sorting algorithms.
Sorting Algorithm | Stability |
---|---|
Bubble Sort | Yes |
Selection Sort | No |
Insertion Sort | Yes |
Merge Sort | Yes |
QuickSort | No |
Counting Sort | Yes |
Radix Sort | Yes |
Bucket Sort | Yes |
Heap Sort | No |
Shell Sort | No |
How Are Sorting Algorithms Classified?
You can classify sorting algorithms based on these parameters:
- The number of inversions or swaps required: The number of times the algorithm swaps elements to sort the input data. Selection sort requires the fewest swaps.
- The number of comparisons: Refers to how many times the algorithm compares the elements to sort input data.
- Whether they use recursion: Some algorithms sort recursively, for example, quicksort; others can be implemented with both recursive and non-recursive methods.
- Whether they are stable or unstable: Stable sorting algorithms keep the relative order of elements with equal values after sorting, while unstable sorting algorithms do not.
- Total extra memory space required: Some algorithms can sort a list without creating a new one. These are called in-place sorts and require only a constant O(1) extra space. In contrast, out-of-place sorts create new lists while sorting, for example, merge sort.
Now let’s explore the 10 different sorting algorithms, and how to use them.
#1. Selection Sort Algorithm
Selection sort is one of the simplest sorting algorithms. It works by finding the minimum value in an unsorted list and swapping it with the first element, and the process continues until the entire list is sorted.
How Does Selection Sort Algorithm Work?
Here is how it works:
- Select the least element from the unsorted array and swap it with the 1st element.
- Select the second least element and swap it with the 2nd element in the list.
- Select the third least element from the list and swap it with the 3rd element in the list.
- Iterate this process of finding the next least element and placing it into the correct position until the array gets sorted.
Selection Sort Implementation
Selection Sort algorithm can be implemented in various programming languages, such as Java, Python, and C/C++.
Here is how the algorithm is implemented in Python:

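The listing below is a minimal sketch of the steps above, assuming a plain Python list of comparable items; the function name is illustrative:

```python
def selection_sort(arr):
    """Sort `arr` in place by repeatedly selecting the minimum element."""
    n = len(arr)
    for i in range(n - 1):
        # Find the smallest element in the unsorted portion arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it into position i, growing the sorted prefix by one.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```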
Properties
- Space Complexity – O(1)
- Time Complexity – O(n²)
- Sorting in Place – Yes
- Stable – No
Selection Sort Applications
Selection sort is used when:
- Swapping cost doesn’t matter
- A small list is to be sorted
- Checking all elements is compulsory
- Cost of writing to a memory space matters
#2. Bubble Sort Algorithm
Bubble sort is a simple sorting algorithm that traverses a list, comparing two adjacent values and swapping them until they are in the correct order.
Similar to how bubbles in water rise from the bottom to the surface, the largest unsorted element bubbles up to its final position in each iteration, hence the name Bubble Sort.
It has a worst-case complexity of O(n^2), making it a slower sorting algorithm than other sorting algorithms, like quick sort. However, bubble sort has the advantage of being one of the simplest sorting algorithms to comprehend and code from scratch.
Technically, bubble sort is well suited to sorting small arrays or when executing sorting algorithms on computers with extremely limited memory space.
How Does Bubble Sort Work?
Let’s sort a list in ascending order.
Example:
The first time through the list:
- If we start with the list, [3, 1, 5, 2, 8], the algorithm compares the first two elements in the list, 3 and 1. Since 3 > 1, it swaps them: [1, 3, 5, 2, 8]
- Now it compares the next two values, 3 and 5, and no swapping is required as 3 < 5, and they are in order, the algorithm continues: [1, 3, 5, 2, 8]
- Moving to the next two values, 5 and 2, they get swapped since 2 < 5: [1, 3, 2, 5, 8]
- For the next two values, 5 and 8, no swapping is required, as they are already in order: [1, 3, 2, 5, 8]
The second time through the list:
- 1 < 3, no need to swap positions: [1, 3, 2, 5, 8]
- The algorithm will swap 3 and 2 since 2 < 3: [1, 2, 3, 5, 8]
- No swap since 3 < 5: [1, 2, 3, 5, 8]
- Again, no swapping positions since the two values, 5 and 8, are in order: [1, 2, 3, 5, 8]
The list is now sorted, but the algorithm does not know it yet. It must make one more full pass without performing any swaps before it can conclude that the list is in order.
The third time through the list, no swaps occur, confirming that [1, 2, 3, 5, 8] is sorted.
Though bubble sort is an easy algorithm to implement, it is not as efficient as other sorting algorithms.
How Is the Bubble Sort Algorithm Implemented?
Bubble sort can be implemented in various programming languages, such as Python, Java, and C/C++
Here’s how it’s implemented in Python:

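A short Python sketch of the passes described above, including the early exit once a full pass makes no swaps (names are illustrative):

```python
def bubble_sort(arr):
    """Sort `arr` in place by swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i+1 values are already in place at the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            # A full pass with no swaps means the list is sorted.
            break
    return arr

print(bubble_sort([3, 1, 5, 2, 8]))  # [1, 2, 3, 5, 8]
```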
Bubble Sort Properties
- Space complexity – O(1)
- Best case performance – O(n)
- Average case performance – O(n²)
- Worst case performance – O(n²)
- Stable – Yes
Bubble Sort Algorithm Application
The Bubble sort algorithm is used if:
- Complexity doesn’t matter
- Short and simple code is preferred
#3. Insertion Sort Algorithm
Insertion sort is a simple sorting algorithm that involves inserting each element in an array into its correct position. It’s used when dealing with a small number of elements.
How Insertion Sort Works
In this algorithm, we compare the key element with the elements before it; if a previous value is greater than the key, we shift that value one position forward and insert the key in its correct place.
Example:
Starting from index 1, iterate to the end of the input list.
[9, 4, 6, 2, 5, 3]
Step 1:
Key = 4 // starting from index 1
Here we compare ‘key’ with the previous elements.
‘Key’ is compared with 9. Since 9 > 4, move the 9 to the subsequent position and insert ‘key’ in the prior position.
Results: [4, 9, 6, 2, 5, 3]
Step 2:

Key = 6 // second index
Since 9 > 6, move 9 to the 2nd index and insert 6 to the 1st index.
Result : [4, 6, 9, 2, 5, 3]
Step 3:

Key = 2 // 3rd index
9 > 2 → [4, 6, 2, 9, 5, 3]
6 > 2 → [4, 2, 6, 9, 5, 3]
4 > 2 → [2, 4, 6, 9, 5, 3]
Result : [2, 4, 6, 9, 5, 3]
Step 4:

Key = 5 // 4th index
9 > 5 → [2, 4, 6, 5, 9, 3]
6 > 5 → [2, 4, 5, 6, 9, 3]
4 < 5 → Stop
Result : [2, 4, 5, 6, 9, 3]
Step 5:

Key = 3 // 5th index
9 > 3 → [2, 4, 5, 6, 3, 9]
6 > 3 → [2, 4, 5, 3, 6, 9]
5 > 3 → [2, 4, 3, 5, 6, 9]
4 > 3 → [2, 3, 4, 5, 6, 9]
2 < 3 → Stop
Result : [2, 3, 4, 5, 6, 9]
To avoid swapping the key in every iteration, below is an optimized insertion sort, where the key element is only written into its final position at the last step:
InsertionSort(arr[])
    for j = 1 to arr.length - 1
        key = arr[j]
        i = j - 1
        while i >= 0 and arr[i] > key
            arr[i+1] = arr[i]
            i = i - 1
        arr[i+1] = key
Insertion Sort Implementation in Python:

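The optimized pseudocode above translates almost line for line into Python:

```python
def insertion_sort(arr):
    """Sort `arr` in place by inserting each element into the sorted prefix."""
    for j in range(1, len(arr)):
        key = arr[j]
        i = j - 1
        # Shift elements greater than the key one position to the right.
        while i >= 0 and arr[i] > key:
            arr[i + 1] = arr[i]
            i -= 1
        # Drop the key into the gap that opened up.
        arr[i + 1] = key
    return arr

print(insertion_sort([9, 4, 6, 2, 5, 3]))  # [2, 3, 4, 5, 6, 9]
```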
Properties of Insertion Sort
- Space Complexity – O(1)
- Time Complexity – O(n), O(n²), and O(n²) for the Best, Average, and Worst cases, respectively.
- Best Case – the array is already sorted
- Average Case – the array is randomly ordered
- Worst Case – the array is sorted in reverse order
- Sorting In-Place: Yes
- Stable: Yes
Applications of Insertion Sort Algorithm
Insertion sort is used when:
- The array has a small number of elements
- There are a few elements left for sorting
#4. Radix Sorting Algorithm
Radix sort is a sorting algorithm that works by grouping elements by digits of the same place value, then sorting them in either ascending or descending order. It extends counting sort to achieve better time complexity on multi-digit keys.
How Radix Sort Works
For each digit position n, from the least to the most significant digit, use counting sort to sort the input array according to the nth digit. We use counting sort because it is stable.
Example:
Given the input array [10, 21, 17, 34, 44, 11, 654, 123], we first sort it according to the least significant (ones) digit:
0: 10
1: 21, 11
2:
3: 123
4: 34, 44, 654
5:
6:
7: 17
8:
9:
So, the array becomes [10, 21, 11, 123, 34, 44, 654, 17].
Let’s sort based on the tens digit:
0:
1: 10, 11, 17
2: 21, 123
3: 34
4: 44
5: 654
6:
7:
8:
9:
Now, the array is: [10, 11, 17, 21, 123, 34, 44, 654]
Finally, let’s sort according to the most significant (hundreds) digit:
0: 010 011 017 021 034 044
1: 123
2:
3:
4:
5:
6: 654
7:
8:
9:
Result: [10, 11, 17, 21, 34, 44, 123, 654]. The array is now sorted. This is how the algorithm works.
Radix Sort Implementation in Python:

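As a sketch of the digit-by-digit passes above, here is a Python version that applies a stable counting sort per digit; it assumes non-negative integers, and the helper names are illustrative:

```python
def counting_sort_by_digit(arr, exp):
    """Stable counting sort of `arr` on the digit at place value `exp`."""
    output = [0] * len(arr)
    count = [0] * 10
    for num in arr:
        count[(num // exp) % 10] += 1
    for d in range(1, 10):
        count[d] += count[d - 1]  # prefix sums give each digit's final slot
    for num in reversed(arr):  # traversing in reverse keeps the sort stable
        digit = (num // exp) % 10
        count[digit] -= 1
        output[count[digit]] = num
    return output

def radix_sort(arr):
    """Sort non-negative integers digit by digit, least significant first."""
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:
        arr = counting_sort_by_digit(arr, exp)
        exp *= 10
    return arr

print(radix_sort([10, 21, 17, 34, 44, 11, 654, 123]))
# [10, 11, 17, 21, 34, 44, 123, 654]
```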
#5. Merge Sort Algorithm
Merge sort is a sorting algorithm based on the Divide and Conquer principle. It divides the array into subproblems and, when the solutions to the subproblems are ready, combines their results to solve the main problem.
How Does Merge Sort Work?
This process has only three steps involved:
- Divide – splitting the array into subproblems
- Conquer – sorting the subproblems using the same algorithm
- Combine – merge the results
Example:
Using the unsorted array [6, 5, 12, 10, 9, 1]

Merge Sort Implementation in Python

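A compact recursive Python sketch of the divide, conquer, and combine steps (names are illustrative):

```python
def merge_sort(arr):
    """Return a new sorted list using divide and conquer."""
    if len(arr) <= 1:
        return arr  # a list of 0 or 1 elements is already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # divide: recurse on each half
    right = merge_sort(arr[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:    # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([6, 5, 12, 10, 9, 1]))  # [1, 5, 6, 9, 10, 12]
```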
Properties
- Space Complexity – O(n)
- Time Complexity – O(n log n). The log n factor arises because the array is halved at each level of recursion, and each level requires O(n) merging work.
- In-Place Sorting – No, in a typical implementation
- Stable – Yes
- Parallelizable – Yes
Applications
- E-commerce
- External sorting
- Inversion count problem
#6. Quicksort Algorithm
Quicksort is a sorting algorithm based on the Divide and Conquer principle.
How Quicksort Algorithm Works
It involves the following steps:
- Select an element as a pivot; in our case, the pivot is the last element.
- Partitioning – rearrange the elements so that all values less than the pivot are on its left, and all values greater than the pivot are on its right.
- Recursively apply quicksort to the subarrays on the left and right of the pivot.
Quicksort Implementation in Python3

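A Python sketch of the steps above, using the last element as the pivot (a Lomuto-style partition; names are illustrative):

```python
def quicksort(arr, low=0, high=None):
    """Sort `arr` in place between indices `low` and `high` inclusive."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        # Partition around the last element as the pivot.
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        p = i + 1  # the pivot is now in its final position
        quicksort(arr, low, p - 1)   # recurse on values left of the pivot
        quicksort(arr, p + 1, high)  # recurse on values right of the pivot
    return arr

print(quicksort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```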
Quicksort Applications
This algorithm is used when:
- Space complexity matters
- Time complexity matters
- The programming language works well with recursion
#7. Counting Sort Algorithm
Counting sort is an algorithm that sorts the elements of an input array by counting the number of occurrences of each value. Each value is used as an index into an auxiliary count array, and the sorted output is rebuilt from those counts.
How Counting Sort Works
Example:
Given the input = [2, 5, 3, 1, 4, 2]
Create a list of occurrences of each unique value
Count = [0, 0, 0, 0, 0, 0]
Val: 0 1 2 3 4 5
Go through the input list and, for each value, increment the count at that index by one.
For example, the 1st value in the list is 2, so add one to the count at index 2:
Count = [0, 0, 1, 0, 0, 0]
Val: 0 1 2 3 4 5
Continue to achieve the total count for each value in the list:
Count = [0, 1, 2, 1, 1, 1]
Val: 0 1 2 3 4 5
You can easily create a sorted output since you are aware of the occurrence of each value in the list. Loop through the occurrence list and add the corresponding value to the output array.
For example, there were no 0s in the input list, but there was one occurrence of the value 1, so append 1 to the output array once:
Output = [ 1 ]
The value 2 has two occurrences, so append it twice:
Output = [ 1, 2, 2 ]
Continue until you have a sorted output list:
Output = [ 1, 2, 2, 3, 4, 5 ]
Counting Sort Implementation in Python

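A minimal Python sketch of the counting procedure above, assuming non-negative integers:

```python
def counting_sort(arr):
    """Sort non-negative integers by counting occurrences of each value."""
    if not arr:
        return arr
    # Count how many times each value from 0 to max(arr) occurs.
    count = [0] * (max(arr) + 1)
    for value in arr:
        count[value] += 1
    # Rebuild the output by emitting each value `occurrences` times.
    output = []
    for value, occurrences in enumerate(count):
        output.extend([value] * occurrences)
    return output

print(counting_sort([2, 5, 3, 1, 4, 2]))  # [1, 2, 2, 3, 4, 5]
```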
Properties
- Space complexity – O(k)
- Best case performance – O(n+k)
- Average case performance – O(n+k)
- Worst case performance – O(n+k)
- Stable – Yes
#8. Heap Sort Algorithm
Heap sort is an efficient sorting algorithm that works by using a max- or min-heap. A heap is a tree-based data structure that satisfies the heap property: in a max-heap, every child node is less than or equal to its parent node.
Heap sort is unstable, so the initial relative order of equal elements is not preserved. It has a best- and worst-case time complexity of O(n log n), and since it is comparison-based, it can also be used on non-numerical data sets.
Advantages of Heapsort
- Minimal memory usage
- Simple to understand and implement
- Efficient
Disadvantages
- Expensive
- Unstable
Heap Sort Implementation in Python

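A Python sketch of heap sort using a max-heap; the helper names are illustrative:

```python
def heapify(arr, n, root):
    """Sift the value at `root` down until the max-heap property holds."""
    largest = root
    left, right = 2 * root + 1, 2 * root + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != root:
        arr[root], arr[largest] = arr[largest], arr[root]
        heapify(arr, n, largest)

def heap_sort(arr):
    """Sort `arr` in place: build a max-heap, then repeatedly extract the max."""
    n = len(arr)
    # Build a max-heap from the bottom up.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Move the current max to the end, then restore the heap on the rest.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        heapify(arr, end, 0)
    return arr

print(heap_sort([12, 11, 13, 5, 6, 7]))  # [5, 6, 7, 11, 12, 13]
```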
#9. Bucket Sort Algorithm
Bucket sort is a sorting algorithm that sub-divides input array elements into smaller groups known as buckets. The buckets are then sorted using an algorithm of your choice. After sorting, the buckets are merged to form the final sorted array.
When the input is uniformly distributed across the buckets, bucket sort can run in O(n) average time.
Bucket Sort Implementation

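A small Python sketch: it assumes the inputs are floats uniformly distributed in [0, 1), which makes `int(value * num_buckets)` a reasonable bucket index, and it uses Python's built-in `sorted` as the per-bucket sort (any sorting algorithm would do):

```python
def bucket_sort(arr, num_buckets=10):
    """Sort floats in [0, 1): scatter into buckets, sort each, then merge."""
    buckets = [[] for _ in range(num_buckets)]
    for value in arr:
        # Map each value to a bucket; uniform inputs spread out evenly.
        buckets[int(value * num_buckets)].append(value)
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))  # sort each bucket, then concatenate
    return result

print(bucket_sort([0.42, 0.32, 0.73, 0.12]))  # [0.12, 0.32, 0.42, 0.73]
```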
#10. Shell Sort Algorithm
Shell sort is a generalization of insertion sort. It sorts elements by comparing items a fixed gap apart, then progressively reducing the gap according to the gap sequence used.
How Shell Sort Works
Starting with a gap of size n, we keep reducing the gap until it becomes 1. An array is said to be n-sorted if every sublist of elements n apart is sorted.
Algorithm
- Step 1 – Start
- Step 2 – Initialize the gap size, n.
- Step 3 – Divide the list into sublists of elements n apart.
- Step 4 – Sort the sublists using insertion sort.
- Step 5 – Reduce the gap and repeat steps 3–4 until the gap becomes 1.
- Step 6 – Print the sorted list.
- Step 7 – Stop
How Shell Sort Is Implemented

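A Python sketch of the steps above, using the common halving gap sequence n/2, n/4, …, 1 (the choice of gap sequence is an assumption; many others exist):

```python
def shell_sort(arr):
    """Sort `arr` in place with gapped insertion sort and a halving gap."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: after this pass, every sublist of
        # elements `gap` apart is sorted.
        for j in range(gap, n):
            key = arr[j]
            i = j
            while i >= gap and arr[i - gap] > key:
                arr[i] = arr[i - gap]
                i -= gap
            arr[i] = key
        gap //= 2
    return arr

print(shell_sort([9, 8, 3, 7, 5, 6, 4, 1]))  # [1, 3, 4, 5, 6, 7, 8, 9]
```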
Conclusion
Sorting algorithms are a fundamental concept of computer science, since they efficiently organize and manage large amounts of data. Each algorithm has its pros and cons in terms of simplicity, efficiency, and space complexity. The best algorithm for you depends on the specific problem you are trying to solve, as well as the size and complexity of your data.