What Is Quick Sort and How Does It Work? (With Examples)


Sorting algorithms usually fall into two camps — easy to implement or fast to run. Quick sort mostly falls into the latter category. Read on to find out how to implement this algorithm, and the situations where it works best.

What is Quick Sort?

Quick sort is a sorting algorithm used for organizing arrays of data. It essentially relies on the principle known as divide-and-conquer. This is the method by which we break a bigger, more complex problem into simpler sub-problems. These sub-problems are then solved and the solutions are combined to find the solution to the original problem.

The Algorithm Behind Quick Sort

This isn’t exactly how to implement quick sort, but it gives an idea of how it works.

// start --> starting index, end --> ending index
Quicksort(A, start, end)
    if (start < end)
        pIndex = Partition(A, start, end)
        Quicksort(A, start, pIndex - 1)
        Quicksort(A, pIndex + 1, end)

First, we define quick sort as a function of an array, a starting index, and an ending index. The “if” statement checks that the subarray has more than one element in it.

In this case, we call the “partition” function, which gives us the index of the “pivot” element. This separates the array into two subarrays, with elements smaller and greater than the pivot respectively.

The function is then recursively called on each subarray, until each subarray has at most one element. At that point the whole array is sorted and the process is complete.

In this example, the boxed element is the pivot element, blue elements are equal or smaller, and red elements are greater.

©Dcoetzee/Public Domain – License

An Example of Quick Sort

Like with most things, quick sort is best explained by using an example to illustrate.

Let’s take the following array – [56, 47, 98, 3, 6, 7, 11]

We’ve got indices from 0 to 6 (remember the first element is index 0, not 1).

Taking the last element, 11, as the pivot, the array is rearranged so that elements smaller than the pivot end up on the left, and larger elements on the right. To do this, an index i starts just before the subarray, and an index j scans the subarray from left to right. Whenever arr[j], the current element, is smaller than or equal to the pivot, i is incremented and arr[i] is swapped with arr[j]. Finally, the pivot is swapped with arr[i + 1] so that it lands in its sorted position.

This gives the array [3, 6, 7, 11, 47, 98, 56], with the subarrays [3, 6, 7] and [47, 98, 56] on either side of the pivot. The pivot element, 11, is now at index 3 instead of 6.

Quick sort is then called recursively on the left subarray, [3, 6, 7]. Its last element, 7, becomes the new pivot, and since these elements are already in order, the recursive calls finish without changing anything.

We then call quick sort recursively on the right subarray, [47, 98, 56], so that it’s sorted into [47, 56, 98].

Because the subarrays are sorted in place around each pivot, the whole array is now sorted – [3, 6, 7, 11, 47, 56, 98].
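The first partition step above can be checked with a minimal sketch of a last-element (Lomuto-style) partition — the same scheme this example walks through; the function name here is just illustrative:

```python
def partition(arr, start, end):
    """Partition arr[start..end] around the last element and return the pivot's index."""
    pivot = arr[end]
    i = start - 1
    for j in range(start, end):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Move the pivot into its sorted position
    arr[i + 1], arr[end] = arr[end], arr[i + 1]
    return i + 1

array = [56, 47, 98, 3, 6, 7, 11]
pi = partition(array, 0, len(array) - 1)
print(array, pi)  # [3, 6, 7, 11, 47, 98, 56] 3 -- pivot 11 ends up at index 3
```

Everything left of index 3 is at most 11, everything right of it is greater, exactly as described above.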


Now we’ve covered the basics behind quick sort, let’s implement it using Python. The code we use can be described as follows:

def partition(arr, start, end):
    # Take the last element as the pivot
    pivot = arr[end]
    i = start - 1
    for j in range(start, end):
        if arr[j] <= pivot:
            i = i + 1
            (arr[i], arr[j]) = (arr[j], arr[i])
    # Move the pivot into its sorted position
    (arr[i + 1], arr[end]) = (arr[end], arr[i + 1])
    return i + 1

def quicksort(arr, start, end):
    if start < end:
        pi = partition(arr, start, end)
        quicksort(arr, start, pi - 1)
        quicksort(arr, pi + 1, end)

array = [56, 47, 98, 3, 6, 7, 11]
quicksort(array, 0, len(array) - 1)
print(f'Sorted: {array}')

First, we’re defining a partition function as a function of an array, with a starting and ending index.

The pivot value is then set to the last element of the array, and i is initialized to the start index, minus 1.

The “for” loop iterates over the array, from the starting index to the end index minus 1.

The “if” statement checks whether the current element, arr[j], is less than or equal to the pivot. If it is, i is incremented first, and then the elements at indices i and j are swapped.

After this, the pivot is swapped with the element at index i+1. This means that all elements to the left of the pivot are less than or equal to it, and elements to the right are greater than it.

The index of the pivot value is then returned.

“Quicksort” is then defined as a function of the array with starting and ending indices, and the “if” statement checks that the subarray has more than one element.

The “partition” function is then called, with the index value set to “pi”. Quick sort is called recursively on the left and right subarrays, until each subarray only contains one element.

Finally, the array, now sorted in place, is printed out using the “print” function.



Best and Worst Use Cases of Quick Sort

While the theory behind quick sort may seem complicated at first, the algorithm has many advantages and is generally quite speedy. Let’s take a look at the time and space complexity of quick sort.

Time Complexity of Quick Sort

The table summarizes the time complexity of quick sort.

Case    | Time Complexity
Best    | O(n log n)
Average | O(n log n)
Worst   | O(n²)

The best case is when the partition is balanced, where the pivot is close to or equal to the median value. As such, both subarrays are of similar size, so there are about log n levels of recursion, and roughly n operations are performed at each level. This leads to O(n log n) time complexity.

When the pivot element is relatively close to the median, this is the average case. The time complexity is the same as the best case, since the subarrays are still roughly equal in size.

However, the worst case turns the time complexity into quadratic time. This happens when the partition is very unbalanced, where the pivot is close to the minimum or maximum element. The subarrays are then very uneven in size, with one side nearly empty at every step. As such, there are up to n levels of recursion with up to n operations each, leading to a quadratic dependency on input size.
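One way to see this blow-up is to count comparisons on an already-sorted input, which makes the last-element pivot the worst possible choice. The counting wrapper below is an illustrative sketch, not part of the article’s implementation:

```python
import random

def count_comparisons(arr):
    """Quick sort (last-element pivot) that counts element comparisons."""
    count = 0
    def sort(start, end):
        nonlocal count
        if start < end:
            pivot = arr[end]
            i = start - 1
            for j in range(start, end):
                count += 1                       # one comparison per loop step
                if arr[j] <= pivot:
                    i += 1
                    arr[i], arr[j] = arr[j], arr[i]
            arr[i + 1], arr[end] = arr[end], arr[i + 1]
            sort(start, i)                       # pivot index is i + 1
            sort(i + 2, end)
    sort(0, len(arr) - 1)
    return count

print(count_comparisons(list(range(100))))       # sorted input: 4950 = 100*99/2 comparisons

shuffled = list(range(100))
random.shuffle(shuffled)
print(count_comparisons(shuffled))               # typically only a few hundred
```

On sorted input every partition strips off one element, giving (n-1) + (n-2) + … + 1 = n(n-1)/2 comparisons, while a shuffled input stays close to n log n.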

Space Complexity of Quick Sort

Another factor to consider is the space complexity of quick sort. This can be summarized as such:

Case    | Space Complexity
Best    | O(log n)
Average | O(log n)
Worst   | O(n)

The space complexity for quick sort is the same for the best and average cases. This is because the algorithm has log n recursive levels, and each recursive call uses a constant amount of memory space. As such, the total memory space is proportional to the depth of the recursion tree.

In the worst case, however, the space complexity grows to O(n), because the recursion tree is significantly unbalanced and there can be n nested recursive calls.
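The recursion depth can be observed directly. The depth-tracking wrapper below is a hedged sketch for illustration; with a sorted input and a last-element pivot, the depth grows linearly with the array size:

```python
def quicksort_depth(arr):
    """Sort arr in place and return the maximum recursion depth reached."""
    max_depth = 0
    def sort(start, end, depth):
        nonlocal max_depth
        max_depth = max(max_depth, depth)
        if start < end:
            pivot = arr[end]
            i = start - 1
            for j in range(start, end):
                if arr[j] <= pivot:
                    i += 1
                    arr[i], arr[j] = arr[j], arr[i]
            arr[i + 1], arr[end] = arr[end], arr[i + 1]
            sort(start, i, depth + 1)
            sort(i + 2, end, depth + 1)
    sort(0, len(arr) - 1, 1)
    return max_depth

print(quicksort_depth(list(range(64))))   # 64 -- one stack frame per element on sorted input
```

Since each stack frame uses a constant amount of memory, a depth of n means O(n) space, whereas a balanced run stays near log n frames.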

Wrapping Up

Overall, quick sort is, as the name suggests, a very efficient way of sorting an array, particularly large ones. Once the process is understood, it’s relatively easy to implement and modify. It’s useful in a wide range of scenarios and works as a good foundation for more complex sorting algorithms.


Frequently Asked Questions

What is quick sort?

Quick sort is a sorting algorithm for sorting arrays of data. It works by selecting a pivot element, and partitioning the array into two subarrays, one with smaller elements than the pivot and one with larger elements. This process is recursively repeated until each subarray is sorted and contains only one element. The arrays are then combined to give a sorted array.

Is quick sort a stable algorithm?

Quick sort is usually an unstable algorithm. This means that the relative order of equal elements may not be preserved in the final output.
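To illustrate, here is a hedged sketch that runs the same last-element partition scheme on (key, label) pairs, comparing only the keys — the pairs and the helper name are made up for this example:

```python
def quicksort_by_key(arr, start, end):
    """Quick sort that compares only the first item of each (key, label) pair."""
    if start < end:
        pivot = arr[end][0]
        i = start - 1
        for j in range(start, end):
            if arr[j][0] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[end] = arr[end], arr[i + 1]
        quicksort_by_key(arr, start, i)
        quicksort_by_key(arr, i + 2, end)

pairs = [(3, 'a'), (3, 'b'), (1, 'x')]
quicksort_by_key(pairs, 0, len(pairs) - 1)
print(pairs)  # [(1, 'x'), (3, 'b'), (3, 'a')] -- the two equal keys changed order
```

The long-range swap in the partition step moves (3, 'a') past (3, 'b'), so equal keys come out in a different order than they went in — which is exactly what instability means.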

How do you choose the pivot element with quick sort?

You can choose either the first or last element, or make a randomized choice. With especially large datasets, randomizing the choice generally leads to good performance, since it makes a consistently unbalanced partition very unlikely.
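A randomized pivot can be grafted onto the last-element scheme by swapping a randomly chosen element into the last position before partitioning. This is a sketch of one common way to do it, not the only one:

```python
import random

def randomized_quicksort(arr, start, end):
    """Quick sort with a randomly chosen pivot, swapped to the end before partitioning."""
    if start < end:
        # Pick a random pivot and move it into the last position
        k = random.randint(start, end)
        arr[k], arr[end] = arr[end], arr[k]
        pivot = arr[end]
        i = start - 1
        for j in range(start, end):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[end] = arr[end], arr[i + 1]
        randomized_quicksort(arr, start, i)
        randomized_quicksort(arr, i + 2, end)

array = [56, 47, 98, 3, 6, 7, 11]
randomized_quicksort(array, 0, len(array) - 1)
print(array)  # [3, 6, 7, 11, 47, 56, 98]
```

With this change, an already-sorted input is no longer a guaranteed worst case, because the pivot choice no longer depends on the input order.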

What is the time complexity of quick sort?

The best and average cases are O(n log n), while the worst case is O(n²).

What is the space complexity of quick sort?

The best and average cases are O(log n), whereas the worst case is O(n).

What are the best cases for using quick sort?

Quick sort works well for many kinds of arrays, but alternatives like heap sort or merge sort may work better under certain constraints. This is typically where you need a stable algorithm, like merge sort, or a guaranteed worst-case running time: heap sort’s worst-case time complexity matches quick sort’s average case of O(n log n). Simpler algorithms like selection or insertion sort may also be quicker for small datasets.
