Sorting Algorithms
Comparison sorts, linear sorts, stability, lower bounds, and GATE-level analysis.
Last updated: April 2026 | GATE CS 2024–2026 syllabus
Key Takeaways
- Lower bound for comparison-based sorting: Ω(n log n). No comparison sort beats this.
- Merge sort: always Θ(n log n), stable, not in-place. Best for linked lists.
- Quick sort: O(n log n) avg, O(n²) worst, in-place, not stable — fastest in practice.
- Heap sort: always O(n log n), in-place, not stable.
- Counting/Radix/Bucket sort break the lower bound — they are NOT comparison-based.
- Stable sorts preserve relative order of equal elements.
- Insertion sort is best for nearly-sorted or small arrays: O(n) best, O(n²) worst.
1. Comparison Sort Summary
| Algorithm | Best | Average | Worst | Space | Stable | In-place |
|---|---|---|---|---|---|---|
| Bubble sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes |
| Insertion sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Merge sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No |
| Quick sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes |
| Heap sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes |
| Shell sort | O(n log n) | O(n log²n) | O(n²) | O(1) | No | Yes |
2. Bubble Sort
Repeatedly swap adjacent out-of-order elements. After pass k, the k largest are in place.
Swaps (worst): n(n−1)/2 = Θ(n²)
Comparisons (best, with early-exit flag): n−1 = O(n)
Stable: Yes (swap only if strictly greater — equal elements not swapped)
Optimisation: add a swapped flag; if no swap in a pass, array is sorted → O(n) best case.
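A minimal sketch of the flagged version described above (function name `bubble_sort` is illustrative; sorts a copy rather than in place for easy testing):

```python
def bubble_sort(a):
    """Bubble sort with an early-exit flag: O(n) best case on sorted input."""
    a = list(a)
    n = len(a)
    for k in range(n - 1):
        swapped = False
        for j in range(n - 1 - k):      # after pass k, the k largest are in place
            if a[j] > a[j + 1]:         # strictly greater only -> stable
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                 # a full pass with no swaps: already sorted
            break
    return a
```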
3. Selection Sort
Find minimum in unsorted region; swap to front. Always Θ(n²) comparisons regardless of input.
Swaps: exactly n−1 (minimum swaps among O(n²) sorts)
(With a guard that skips the swap when the minimum is already in place, the count drops below n−1; the unguarded version performs exactly n−1.)
Stable: No — swap can skip over equal elements
Use when: swaps are expensive (write cost >> compare cost)
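A short sketch (names are illustrative). The `if m != i` guard makes the swap count at most n−1, matching the write-minimising use case above:

```python
def selection_sort(a):
    """Selection sort: Theta(n^2) comparisons, at most n-1 swaps."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):       # scan the unsorted region for the minimum
            if a[j] < a[m]:
                m = j
        if m != i:                      # swap only when needed: <= n-1 swaps total
            a[i], a[m] = a[m], a[i]
    return a
```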
4. Insertion Sort
Build sorted prefix by inserting each element into its correct position via shifting.
Comparisons (worst — reverse sorted): n(n−1)/2
Inversions = number of shifts needed. If I inversions, runtime = Θ(n+I).
Stable: Yes. Online: Yes (processes elements one by one).
Best for: small n (≤20), nearly sorted arrays, online data
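A minimal sketch of the shifting-based insertion described above (function name is illustrative):

```python
def insertion_sort(a):
    """Insertion sort: Theta(n + I) for I inversions; stable and online."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:    # shift larger elements right;
            a[j + 1] = a[j]             # > (not >=) preserves stability
            j -= 1
        a[j + 1] = key
    return a
```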
5. Merge Sort
Divide array in half, recursively sort each half, merge. Recurrence: T(n)=2T(n/2)+Θ(n).
Stable: Yes
Passes: log n levels of recursion
Merge step: O(n) — compare front elements of two sorted halves
Total comparisons: between n/2·log n and n·log n
Merge sort on linked list: O(n log n) time, O(log n) space (stack) — no random access needed
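A top-down array version as a sketch (the `<=` in the merge is what makes it stable, as noted in the Common Mistakes section):

```python
def merge_sort(a):
    """Top-down merge sort: Theta(n log n), stable, O(n) auxiliary space."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= keeps equal elements in original order
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                # at most one of these is non-empty
    out.extend(right[j:])
    return out
```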
6. Quick Sort
Choose a pivot, partition array (elements < pivot | pivot | elements > pivot), recurse on both sides.
Worst (sorted input, first-element pivot): T(n)=T(n−1)+n ⇒ Θ(n²)
Space: O(log n) stack avg, O(n) worst
Stable: No (partition swaps elements across long distances, which can reorder equal elements)
Partition (Lomuto): i=lo−1; for j=lo..hi: if a[j]≤pivot: swap a[++i],a[j]
Partition (Hoare): two pointers from ends — fewer swaps, preferred
Randomised quicksort: expected O(n log n) for any input
3-way partition (Dutch flag): handles duplicates in O(n) for all-equal input
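A sketch combining the randomised pivot with the Lomuto partition shown above (interface is illustrative; the top-level call sorts a copy):

```python
import random

def quick_sort(a, lo=0, hi=None):
    """Randomised quicksort with Lomuto partition: expected O(n log n)."""
    if hi is None:
        a = list(a)
        hi = len(a) - 1
    if lo < hi:
        p = random.randint(lo, hi)      # random pivot defeats sorted-input worst case
        a[p], a[hi] = a[hi], a[p]
        pivot, i = a[hi], lo - 1
        for j in range(lo, hi):         # Lomuto: grow the <= pivot region
            if a[j] <= pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[i + 1], a[hi] = a[hi], a[i + 1]   # place pivot at index i+1
        quick_sort(a, lo, i)
        quick_sort(a, i + 2, hi)
    return a
```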
7. Heap Sort
Build max-heap O(n), then repeatedly extract max into sorted suffix O(n log n).
Stable: No (extract-max changes relative order)
Build-heap: O(n) — not O(n log n)
Total comparisons: at most 2n log n
NOT cache-friendly (heap jumps around memory) — slower than quicksort in practice
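A sketch of the two phases above: O(n) bottom-up build, then n−1 extract-max steps (helper names are illustrative):

```python
def heap_sort(a):
    """Heap sort: O(n) build-heap + O(n log n) extraction, in-place style."""
    a = list(a)
    n = len(a)

    def sift_down(i, end):
        # Restore the max-heap property for subtree rooted at i within a[:end]
        while True:
            left, right, big = 2 * i + 1, 2 * i + 2, i
            if left < end and a[left] > a[big]:
                big = left
            if right < end and a[right] > a[big]:
                big = right
            if big == i:
                return
            a[i], a[big] = a[big], a[i]
            i = big

    for i in range(n // 2 - 1, -1, -1):     # bottom-up build-heap: O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):         # move max into the sorted suffix
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```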
8. Linear Sorts (Non-Comparison)
| Algorithm | Time | Space | Stable | Constraint |
|---|---|---|---|---|
| Counting sort | O(n+k) | O(k) | Yes | Integer keys in [0,k] |
| Radix sort | O(d(n+k)) | O(n+k) | Yes | d digits, each in [0,k] |
| Bucket sort | O(n) avg | O(n+k) | Yes | Uniform distribution in [0,1) |
Radix sort: Sort by least significant digit first using stable counting sort. d passes of O(n+k) each.
For 32-bit integers: d = 4 passes (base 2⁸ = 256), k = 256. Total: O(4(n+256)) = O(n).
Bucket sort: Divide [0,1) into n equal buckets, insertion-sort each, concatenate. E[T]=O(n) if uniform.
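A sketch of LSD radix sort for non-negative 32-bit keys, using a stable per-byte counting sort as described above (function names are illustrative):

```python
def counting_sort_by_byte(a, shift):
    """Stable counting sort on one base-256 digit: O(n + 256)."""
    count = [0] * 256
    for x in a:
        count[(x >> shift) & 0xFF] += 1
    for d in range(1, 256):                 # prefix sums give final positions
        count[d] += count[d - 1]
    out = [0] * len(a)
    for x in reversed(a):                   # reverse scan preserves stability
        count[(x >> shift) & 0xFF] -= 1
        out[count[(x >> shift) & 0xFF]] = x
    return out

def radix_sort(a):
    """LSD radix sort: 4 stable passes for 32-bit non-negative integers."""
    for shift in (0, 8, 16, 24):
        a = counting_sort_by_byte(a, shift)
    return a
```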
9. Lower Bound Proof
n elements ⇒ n! possible permutations ⇒ at least n! leaves.
Binary tree height ≥ log2(n!) = log2(1·2·…·n)
By Stirling: log2(n!) ≈ n log2n − n log2e = Θ(n log n)
∴ Any comparison sort requires Ω(n log n) comparisons in the worst case.
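As an illustrative sanity check of the Stirling estimate, log2(n!) can be evaluated without computing n! itself via `math.lgamma` and compared against n·log2 n:

```python
import math

def log2_factorial(n):
    """log2(n!) = lgamma(n+1) / ln 2, avoiding overflow for large n."""
    return math.lgamma(n + 1) / math.log(2)

# Decision-tree lower bound log2(n!) vs. the n*log2(n) envelope
for n in (16, 1024, 10**6):
    print(n, round(log2_factorial(n)), round(n * math.log2(n)))
```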
10. GATE Examples
Example (randomised quicksort, average comparisons): expected comparisons = 2n ln n ≈ 1.386 n log2 n. Answer: Θ(n log n) with constant ≈ 1.39.
Example (minimising writes): selection sort makes at most n−1 swaps = 2(n−1) array writes, the fewest of the standard sorts. Answer: Selection sort.
Example (merge sort on a linked list, extra space): no auxiliary array is needed, since merging relinks pointers; only the recursion stack remains. Answer: O(log n).
11. Common Mistakes
- Claiming quicksort is always O(n log n): It is O(n log n) average, but O(n²) worst. Without randomisation, sorted input kills it.
- Build-heap complexity: build-heap is O(n), not O(n log n). The total O(n) build + O(n log n) extraction = O(n log n) is correct, but writing O(n log n) for the build phase alone is a frequent error.
- Radix sort on floats: Radix sort requires integers (or fixed binary representation). Cannot directly sort arbitrary floats.
- Stability of merge sort depends on implementation: Must use ≤ (not <) when choosing from left subarray to ensure stability.
- Best case of bubble sort without the flag: Without a swapped flag, standard bubble sort is always O(n²). Only the optimised version achieves O(n) best.
12. FAQ
- Why is the lower bound for comparison-based sorting Ω(n log n)?
- Any comparison-based sort corresponds to a decision tree with n! leaves. A binary tree with n! leaves has height at least log2(n!) = Θ(n log n) by Stirling’s approximation. This height equals the number of comparisons in the worst case.
- Which sorting algorithms are stable?
- Stable: Bubble sort (with correct implementation), Insertion sort, Merge sort, Counting sort, Radix sort, Bucket sort. Not stable: Selection sort, Quick sort (standard), Heap sort.
- When is counting sort faster than merge sort?
- When the key range k = O(n), counting sort is O(n), breaking the Ω(n log n) barrier because it exploits the structure of keys rather than comparing them. If k ≫ n (e.g., k=n²), counting sort is slower.
- What is the best pivot strategy for quick sort?
- Randomised pivot or median-of-three gives O(n log n) expected time. The worst case occurs with sorted/reverse-sorted input using first or last element as pivot, giving O(n²) because every partition is maximally unbalanced (1 : n−1).