Sorting algorithms usually fall into one of two camps: easy to implement or fast to run. Quick sort mostly falls into the latter category. Read on to find out how to implement this algorithm, and when it's the best choice.
What is Quick Sort?
Quick sort is a sorting algorithm used for organizing arrays of data. It essentially relies on the principle known as divide-and-conquer. This is the method by which we break a bigger, more complex problem into simpler sub-problems. These sub-problems are then solved and the solutions are combined to find the solution to the original problem.
The Algorithm Behind Quick Sort
The pseudocode below isn't exactly how you'd implement quick sort, but it gives an idea of how it works.
// A -> the array, i -> starting index, j -> ending index
Quicksort(A, i, j) {
    if (i < j) {
        pIndex = Partition(A, i, j)
        Quicksort(A, i, pIndex - 1)
        Quicksort(A, pIndex + 1, j)
    }
}
First, we define quick sort as a function of an array together with a starting index and an ending index. The “if” statement checks that the current portion of the array has more than one element in it.
In this case, we call the “partition” function, which gives us the index of the “pivot” element. This separates the array into two subarrays, with elements smaller and greater than the pivot respectively.
The function is recursively called on each subarray until every subarray contains only one element. At that point the whole array is sorted and the process is complete.

An Example of Quick Sort
As with most algorithms, quick sort is best explained with an example.
Let’s take the following array – [56, 47, 98, 3, 6, 7, 11]
We’ve got indices from 0 to 6 (remember the first element is index 0, not 1).
Taking the last element, 11, as the pivot, the array is rearranged so that elements smaller than the pivot end up on its left and larger elements on its right. To do this, an index i starts just before the subarray and an index j scans it from left to right. Whenever arr[j], the current element, is less than or equal to the pivot, we increment i and swap arr[i] with arr[j]. Once the scan is finished, the pivot is swapped with arr[i + 1] so that it lands in its sorted position.
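To make this concrete, here's a small illustrative sketch of a single partition pass over the example array (it mirrors the partition function from the full implementation further down):

def partition(arr, start, end):
    pivot = arr[end]          # last element is the pivot (11 here)
    i = start - 1             # boundary of the "smaller or equal" region
    for j in range(start, end):
        if arr[j] <= pivot:   # element belongs on the left of the pivot
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[end] = arr[end], arr[i + 1]  # drop the pivot into place
    return i + 1

arr = [56, 47, 98, 3, 6, 7, 11]
print(partition(arr, 0, len(arr) - 1))  # 3 -- the pivot's new index
print(arr)                              # [3, 6, 7, 11, 47, 98, 56]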
Partitioning the whole array this way gives the subarrays [3, 6, 7] and [47, 98, 56], with the pivot, 11, now sitting at index 3 instead of 6.
Quick sort is then called on the left subarray, [3, 6, 7]. Its elements happen to be in order already, but the algorithm still partitions it, first around the pivot 7 and then around 6, confirming that each element is in its sorted position.
We then call quick sort recursively on the right subarray, [47, 98, 56], where partitioning around the pivot 56 sorts it into [47, 56, 98].
Finally, the subarrays are combined to give the sorted array – [3, 6, 7, 11, 47, 56, 98].
Implementation
Now that we've covered the basics behind quick sort, let's implement it using Python. Here's the code we'll use:
def partition(arr, start, end):
    # Take the last element as the pivot
    pivot = arr[end]
    i = start - 1
    for j in range(start, end):
        if arr[j] <= pivot:
            # Grow the "smaller than or equal to pivot" region and swap the element into it
            i = i + 1
            (arr[i], arr[j]) = (arr[j], arr[i])
    # Move the pivot into its final sorted position
    (arr[i + 1], arr[end]) = (arr[end], arr[i + 1])
    return i + 1

def quicksort(arr, start, end):
    if start < end:
        # Partition, then sort the left and right subarrays recursively
        pi = partition(arr, start, end)
        quicksort(arr, start, pi - 1)
        quicksort(arr, pi + 1, end)

array = [56, 47, 98, 3, 6, 7, 11]
quicksort(array, 0, len(array) - 1)
print(f'Sorted: {array}')
First, we’re defining a partition function as a function of an array, with a starting and ending index.
The pivot value is then set to the last element of the array, and i is initialized to the start index, minus 1.
The “for” loop iterates over the array, from the starting index to the end index minus 1.
The “if” statement checks whether the current element, arr[j], is less than or equal to the pivot. If it is, the variable i is incremented and arr[i] is swapped with arr[j].
After this, the pivot is swapped with the element at index i+1. This means that all elements to the left of the pivot are less than or equal to it, and elements to the right are greater than it.
The index of the pivot value is then returned.
“Quicksort” is then defined as a function of the array along with its start and end indices, and the “if” statement checks that the current subarray has more than one element.
The “partition” function is then called, and the pivot index it returns is stored in “pi”. Quick sort is called recursively on the left and right subarrays until each subarray contains at most one element.
Finally, we create the example array, sort it in place by calling quicksort, and print the result with the “print” function.
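Running the script should print the array in ascending order:

Sorted: [3, 6, 7, 11, 47, 56, 98]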

Best and Worst Use Cases of Quick Sort
While the theory behind quick sort may seem complicated at first, the algorithm has many advantages and is generally quite speedy. Let’s take a look at the time and space complexity of quick sort.
Time Complexity of Quick Sort
The table summarizes the time complexity of quick sort.
Case | Time Complexity
---|---
Best | O(n log n)
Average | O(n log n)
Worst | O(n²)
The best case occurs when the partition is balanced, meaning the pivot is close to or equal to the median value. Both subarrays are then of similar size, so there are about log n levels of recursion and roughly n operations are performed at each level. This leads to O(n log n) time complexity (formally, the recurrence T(n) = 2T(n/2) + cn solves to O(n log n)).
The average case, where the pivot lands reasonably close to the middle of the subarray, behaves much the same: the subarrays stay roughly equal in size, so the time complexity matches the best case.
However, the worst case pushes the time complexity into quadratic territory. This happens when the partition is very unbalanced, i.e. the pivot is always close to the minimum or maximum element, so one subarray gets nearly all of the elements while the other is empty or has a single element. There are then n levels of recursion with up to n operations each, leading to a quadratic dependency on input size. With the implementation above, which always picks the last element as the pivot, an already-sorted array triggers exactly this behavior.
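To see the difference in practice, here's a rough, illustrative sketch (the quicksort_depth helper is ours, not part of the implementation above) that measures the maximum recursion depth for a shuffled input versus an already-sorted one:

import random

def partition(arr, start, end):
    # Lomuto partition, same scheme as the implementation above
    pivot = arr[end]
    i = start - 1
    for j in range(start, end):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[end] = arr[end], arr[i + 1]
    return i + 1

def quicksort_depth(arr, start, end, depth=1):
    # Sorts arr in place and returns the deepest recursion level reached
    if start >= end:
        return depth
    pi = partition(arr, start, end)
    left = quicksort_depth(arr, start, pi - 1, depth + 1)
    right = quicksort_depth(arr, pi + 1, end, depth + 1)
    return max(left, right)

n = 500
shuffled = random.sample(range(n), n)
already_sorted = list(range(n))

print(quicksort_depth(shuffled, 0, n - 1))        # roughly proportional to log n, usually well under 50
print(quicksort_depth(already_sorted, 0, n - 1))  # 500 -- one recursion level per element

On the sorted input the recursion depth grows linearly with the input size, which is the quadratic-time scenario described above; it also hints at the worst-case space cost covered next.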
Space Complexity of Quick Sort
Another factor to consider is the space complexity of quick sort, which can be summarized as follows:
Case | Space Complexity
---|---
Best | O(log n)
Average | O(log n)
Worst | O(n)
The space complexity for quick sort is the same for the best and average cases. This is because the algorithm has log n recursive levels, and each recursive call uses a constant amount of memory space. As such, the total memory space is proportional to the depth of the recursion tree.
In the worst case, however, the space complexity grows to O(n), because the recursion tree is significantly unbalanced and the call stack can end up n levels deep.
Wrapping Up
Overall, quick sort is, as the name suggests, a very efficient way of sorting arrays, particularly large ones. Once the process is understood, it's relatively easy to implement and modify. It's useful in a wide range of scenarios and works as a good foundation for more complex sorting algorithms.
Up Next…
- Selection Sort Algorithm Explained, With Examples
- What Is Insertion Sort, and How Does It Work? (With Examples)
- Kadane’s Algorithm: An Efficient Approach to the Maximum Subarray Problem, Explained with Photos