Divide & Conquer
Binary Search, Merge Sort, Quick Sort, Strassen, Karatsuba, and more — with recurrence analysis for GATE CS.
Last updated: April 2026 | GATE CS 2024–2026 syllabus
Key Takeaways
- D&C: Divide into subproblems → Conquer recursively → Combine results.
- Recurrence T(n)=aT(n/b)+f(n) → solve with Master theorem.
- Binary search: T(n)=T(n/2)+1 → Θ(log n).
- Merge sort: T(n)=2T(n/2)+n → Θ(n log n).
- Strassen: T(n)=7T(n/2)+n² → Θ(n^log₂7) ≈ O(n^2.81).
- Karatsuba: T(n)=3T(n/2)+n → Θ(n^log₂3) ≈ O(n^1.585).
- D&C is optimal when subproblems are independent and combine is efficient.
1. The D&C Paradigm
1. Divide: Split problem of size n into a subproblems each of size n/b.
2. Conquer: Solve recursively. If size ≤ threshold, solve directly.
3. Combine: Merge solutions. Cost = f(n).
Recurrence: T(n) = aT(n/b) + f(n)
Apply Master theorem to find Θ.
| Algorithm | a | b | f(n) | T(n) |
|---|---|---|---|---|
| Binary search | 1 | 2 | O(1) | Θ(log n) |
| Merge sort | 2 | 2 | O(n) | Θ(n log n) |
| Quick sort (avg) | 2 | 2 | O(n) | Θ(n log n) |
| Strassen | 7 | 2 | O(n²) | Θ(n^2.81) |
| Karatsuba | 3 | 2 | O(n) | Θ(n^1.585) |
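The table above can be checked mechanically. Here is a minimal sketch of a Master-theorem solver, under the assumption that f(n) is a plain polynomial Θ(n^d) (the simplified three-case form; the function name and output format are ours):

```python
import math

def master_theorem(a, b, d):
    """Solve T(n) = a*T(n/b) + Theta(n^d), assuming a >= 1, b > 1, d >= 0.

    Returns a string describing Theta(T(n)) via the simplified Master theorem.
    """
    p = math.log(a, b)                 # critical exponent p = log_b(a)
    if math.isclose(p, d):             # Case 2: f(n) matches n^p exactly
        return f"Theta(n^{d:g} log n)" if d > 0 else "Theta(log n)"
    if d < p:                          # Case 1: leaves dominate
        return f"Theta(n^{p:.3f})"
    return f"Theta(n^{d:g})"           # Case 3: root dominates

print(master_theorem(2, 2, 1))  # merge sort    -> Theta(n^1 log n)
print(master_theorem(7, 2, 2))  # Strassen      -> Theta(n^2.807)
print(master_theorem(1, 2, 0))  # binary search -> Theta(log n)
```

Regularity and the log-factor refinements of Case 2 are deliberately omitted; for GATE-style recurrences with polynomial f(n), this simplified form suffices.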
2. Binary Search
Search sorted array by comparing with the middle element and eliminating half.
Master theorem: a=1, b=2, p=log₂1=0, f(n)=O(1)=Θ(n⁰) → Case 2 → T(n)=Θ(log n)
Iterations: at most ⌊log₂n⌋+1
Comparisons: at most 2(⌊log₂n⌋+1) = 2⌊log₂n⌋+2 (two comparisons per iteration: = and <)
Variations:
First occurrence of x: continue search left after finding x → O(log n)
Lower bound (first ≥x): standard binary search variant → O(log n)
Unbounded/exponential search: find a containing range [2^(k−1), 2^k] by repeated doubling, then binary search inside it → O(log p), where p is the target's position
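The first two variations can be sketched as follows; this is one common half-open-interval formulation (the function names are ours), not the only correct one:

```python
def lower_bound(a, x):
    """First index i with a[i] >= x in sorted list a (returns len(a) if none)."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < x:
            lo = mid + 1      # answer lies strictly right of mid
        else:
            hi = mid          # a[mid] >= x: answer is mid or left of it
    return lo

def first_occurrence(a, x):
    """First index of x in sorted a, or -1 if absent: O(log n)."""
    i = lower_bound(a, x)
    return i if i < len(a) and a[i] == x else -1

a = [1, 3, 3, 3, 7, 9]
print(lower_bound(a, 3))       # 1 (first index with a[i] >= 3)
print(first_occurrence(a, 5))  # -1 (5 not present)
```

The invariant is that the answer always lies in [lo, hi); the loop halves that interval each iteration, giving the O(log n) bound.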
3. Merge Sort
Merge step: O(n) using two pointers on sorted halves
Total comparisons: Θ(n log n); at most n⌈log₂n⌉ − n + 1 in the worst case
Counting inversions: an inversion is a pair (i, j) with i < j but A[i] > A[j]. Merge sort counts all inversions in O(n log n): during merge, whenever an element from the right half is picked before an element of the left half, add the number of elements remaining in the left half to the inversion count.
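The merge-with-counting idea above can be sketched like this (a non-in-place version for clarity; the function name is ours):

```python
def sort_count(a):
    """Return (sorted copy of a, number of inversions) in O(n log n)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, inv_l = sort_count(a[:mid])
    right, inv_r = sort_count(a[mid:])
    merged, inv = [], inv_l + inv_r
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
            inv += len(left) - i   # right element jumps over all remaining left elements
    merged += left[i:] + right[j:]
    return merged, inv

print(sort_count([2, 4, 1, 3, 5]))  # ([1, 2, 3, 4, 5], 3): (2,1), (4,1), (4,3)
```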
4. Quick Sort — Recurrence Analysis
Worst (always 1:n−1 split): T(n) = T(n−1) + n → Θ(n²)
Expected (random pivot): E[comparisons] ≈ 2n ln n ≈ 1.386 n log₂n → Θ(n log n)
k:(1−k) split (0<k<1): T(n) = T(kn) + T((1−k)n) + n
By recursion tree: depth = O(log_{1/c} n) with c = max(k, 1−k), and each level costs at most n → Θ(n log n) for any fixed k.
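A minimal randomized quicksort sketch, using list comprehensions rather than in-place partitioning for readability (an assumption for clarity, not the usual in-place formulation):

```python
import random

def quicksort(a):
    """Randomized quicksort: expected O(n log n), worst case O(n^2)."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                    # random pivot avoids the sorted-input trap
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]
```

Picking the pivot uniformly at random makes every split equally likely, which is exactly what drives the 2n ln n expected comparison count.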
5. Strassen Matrix Multiplication
Multiplying two n×n matrices. Naive block multiplication uses 8 multiplications of (n/2)×(n/2) blocks: T(n) = 8T(n/2) + Θ(n²) = Θ(n³).
Strassen: T(n) = 7T(n/2) + O(n²) → a=7, p=log₂7≈2.807, f=O(n²) → Case 1
T(n) = Θ(n^log₂7) ≈ Θ(n^2.81)
Strassen computes 7 products M1–M7 using additions/subtractions to eliminate 1 multiplication.
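One level of Strassen's seven products M1–M7, shown on plain 2×2 matrices (scalars stand in for the (n/2)×(n/2) blocks that would appear recursively; the function name is ours):

```python
def strassen_2x2(A, B):
    """One Strassen level: 7 multiplications instead of 8 for a 2x2 product."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # combine with additions/subtractions only (no further multiplications)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(strassen_2x2(A, B))  # [[19, 22], [43, 50]], same as the naive product
```

Applied recursively to blocks, the single saved multiplication per level is what drops the exponent from 3 to log₂7.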
Current best: Williams et al., ~O(n^2.37); not examinable in GATE.
6. Karatsuba Integer Multiplication
Multiply two n-digit integers x and y. Split each into two n/2-digit halves: x = a·10^(n/2) + b, y = c·10^(n/2) + d, so xy = ac·10^n + (ad+bc)·10^(n/2) + bd.
Karatsuba trick: compute ac, bd, and (a+b)(c+d) − ac − bd = ad + bc. Only 3 multiplications instead of 4!
T(n) = 3T(n/2) + O(n) → a=3, p=log₂3≈1.585, f=O(n) → Case 1
T(n) = Θ(n^log₂3) ≈ Θ(n^1.585)
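A base-10 sketch of the trick for non-negative integers (Python's big integers make this illustrative rather than necessary; the function name is ours):

```python
def karatsuba(x, y):
    """Multiply non-negative integers with 3 recursive multiplications per level."""
    if x < 10 or y < 10:               # base case: a single-digit operand
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    p = 10 ** half
    a, b = divmod(x, p)                # x = a*10^half + b
    c, d = divmod(y, p)                # y = c*10^half + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    ad_bc = karatsuba(a + b, c + d) - ac - bd   # one multiply yields ad + bc
    return ac * 10 ** (2 * half) + ad_bc * p + bd

print(karatsuba(1234, 5678))  # 7006652
```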
7. Closest Pair of Points
Algorithm:
1. Sort by x-coordinate: O(n log n)
2. Divide at median x. Let δ = min(dist in left, dist in right).
3. Consider strip of width 2δ around dividing line.
4. For each point in strip, check at most 7 others (bounded by δ×2δ box argument).
5. Combine: O(n)
Recurrence: T(n) = 2T(n/2) + O(n) (if y-sorted) → Θ(n log n)
Or T(n) = 2T(n/2) + O(n log n) (if re-sort by y each time) → Θ(n log²n)
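A compact sketch of the simpler O(n log²n) variant, which re-sorts the strip by y at every level (the name and the tuple-based point representation are assumptions):

```python
from math import dist, inf

def closest_pair(points):
    """Closest-pair distance by divide and conquer; O(n log^2 n) with strip re-sorting."""
    pts = sorted(points)                      # sort once by x-coordinate

    def solve(p):
        if len(p) <= 3:                       # brute force on tiny inputs
            return min((dist(a, b) for i, a in enumerate(p) for b in p[i + 1:]),
                       default=inf)
        mid = len(p) // 2
        x_mid = p[mid][0]
        d = solve(p[:mid])
        d = min(d, solve(p[mid:]))
        strip = sorted((q for q in p if abs(q[0] - x_mid) < d),
                       key=lambda q: q[1])    # strip of width 2d, sorted by y
        for i, a in enumerate(strip):
            for b in strip[i + 1:i + 8]:      # at most 7 neighbors need checking
                d = min(d, dist(a, b))
        return d

    return solve(pts)

print(closest_pair([(0, 0), (3, 4), (1, 1), (7, 7)]))  # sqrt(2) ≈ 1.414
```

Maintaining a y-sorted list alongside the recursion (merging it on the way back up) removes the per-level sort and gives the Θ(n log n) bound.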
8. Maximum Subarray (Kadane vs D&C)
Crossing subarray: max suffix of left + max prefix of right, computed in O(n).
T(n) = 2T(n/2) + O(n) → Θ(n log n)
Kadane’s algorithm (DP): O(n) time, O(1) space — strictly better.
D&C version illustrates the paradigm but is not optimal here.
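Both versions side by side, as a sketch (function names are ours); the D&C crossing computation follows the max-suffix-plus-max-prefix idea above:

```python
def kadane(a):
    """O(n) maximum subarray sum (assumes a is non-empty)."""
    best = cur = a[0]
    for x in a[1:]:
        cur = max(x, cur + x)       # extend the current run or restart at x
        best = max(best, cur)
    return best

def max_sub_dc(a, lo=0, hi=None):
    """O(n log n) divide-and-conquer version, for comparison only."""
    if hi is None:
        hi = len(a)
    if hi - lo == 1:
        return a[lo]
    mid = (lo + hi) // 2
    # best sum crossing the midpoint: max suffix of left + max prefix of right
    total, best_left = 0, -float("inf")
    for i in range(mid - 1, lo - 1, -1):
        total += a[i]
        best_left = max(best_left, total)
    total, best_right = 0, -float("inf")
    for i in range(mid, hi):
        total += a[i]
        best_right = max(best_right, total)
    return max(max_sub_dc(a, lo, mid), max_sub_dc(a, mid, hi),
               best_left + best_right)

a = [-2, 1, -3, 4, -1, 2, 1, -5, 4]
print(kadane(a), max_sub_dc(a))  # 6 6 (subarray [4, -1, 2, 1])
```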
9. GATE Examples
Example 1: Solve T(n) = 7T(n/2) + 18n². p = log₂7 ≈ 2.81; f(n) = 18n² = O(n^(2.81−ε)). Case 1: T(n) = Θ(n^log₂7).
Example 2: Maximum iterations of binary search on n elements? ⌊log₂n⌋ + 1 (including the equality check at each step). For n = 8: 4.
Example 3: Solve T(n) = 3T(n/2) + n. p = log₂3 ≈ 1.585; f(n) = n = O(n^(1.585−ε)). Case 1: T(n) = Θ(n^log₂3).
10. Common Mistakes
- Forgetting the combine step in recurrence: T(n)=2T(n/2) is wrong for merge sort; it must be T(n)=2T(n/2)+n.
- Strassen saves 1 multiplication per level: 7 multiplications vs 8 — not 7 multiplications total; the saving is at each recursive level.
- Binary search on unsorted array: Binary search requires sorted input. Applying it to unsorted arrays gives incorrect results.
- Quick sort worst case on sorted input: Using the first element as pivot on an already-sorted array always creates 1:n−1 partitions, giving Θ(n²).
- Closest pair checking only 1 point per strip point: Must check up to 7 points (not 1 or unlimited); the geometric argument limits the count.
11. FAQ
- What are the three steps of divide and conquer?
- Divide: split problem into a subproblems each of size n/b. Conquer: solve each subproblem recursively (base case solved directly). Combine: merge the solutions. The recurrence T(n)=aT(n/b)+f(n) captures all three, where f(n) is the divide+combine cost.
- Why is Strassen’s algorithm faster than naive matrix multiplication?
- Naive multiplication of block matrices requires 8 recursive multiplications, giving T(n)=8T(n/2)+O(n²)=Θ(n³). Strassen reformulates to need only 7 multiplications (with more additions), giving T(n)=7T(n/2)+O(n²)=Θ(n^2.81). Each reduction of 1 multiplication per level compounds exponentially across recursion levels.
- What is the time complexity of binary search?
- T(n)=T(n/2)+O(1). By Master theorem: a=1, b=2, p=log₂1=0. f(n)=O(1)=Θ(n⁰). Case 2 (k=0): T(n)=Θ(n⁰ log n)=Θ(log n).
- How does the closest pair of points algorithm achieve O(n log n)?
- Divide by median x. Solve each half. Combine: find all points within δ of dividing line (the strip). For each strip point, only the 7 points in a δ×2δ box need checking (geometry proves at most 8 points fit). With pre-sorting by y, combine is O(n), giving T(n)=2T(n/2)+O(n)=Θ(n log n).