Algorithms — GATE CS Formula Sheet
All key complexities, recurrences, and results consolidated for last-minute GATE revision.
Last updated: April 2026 | GATE CS 2024–2026 syllabus
How to Use This Sheet
- Revise 24–48 hours before the exam for rapid recall.
- Cover each section and attempt to recall values before looking.
- Pay special attention to edge cases: negative edges, NP reductions, amortised costs.
1. Asymptotic Notations
Growth order: O(1) < O(log n) < O(√n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
Limit test: lim f/g = 0 ⇒ f = o(g); lim f/g = c > 0 ⇒ f = Θ(g); lim f/g = ∞ ⇒ f = ω(g)
log(n!) = Θ(n log n); n! = o(nⁿ); log^k n = o(n^ε) for all constants k, ε > 0
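The log(n!) = Θ(n log n) fact above (Stirling's approximation) can be sanity-checked numerically; a small sketch using `math.lgamma`, which computes ln(n!) as lgamma(n+1):

```python
import math

# Empirical check that log(n!) = Theta(n log n): the ratio
# log(n!) / (n log n) slowly approaches 1 as n grows (Stirling).
for n in [10, 100, 10_000, 1_000_000]:
    ratio = math.lgamma(n + 1) / (n * math.log(n))
    print(n, round(ratio, 4))
```

The ratio climbs toward 1 (about 0.93 at n = 10⁶), consistent with log(n!) = n log n − Θ(n).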
2. Master Theorem
T(n) = aT(n/b) + f(n), with a ≥ 1, b > 1; let p = log_b a (the critical exponent)
Case 1: f(n) = O(n^(p−ε)) for some ε > 0 ⇒ T(n) = Θ(n^p)
Case 2: f(n) = Θ(n^p log^k n), k ≥ 0 ⇒ T(n) = Θ(n^p log^(k+1) n)
Case 3: f(n) = Ω(n^(p+ε)) for some ε > 0, AND a·f(n/b) ≤ c·f(n) for some c < 1 (regularity) ⇒ T(n) = Θ(f(n))
| Recurrence | p = log_b a | Solution | Example |
|---|---|---|---|
| T(n)=T(n/2)+1 | 0 | Θ(log n) | Binary search |
| T(n)=2T(n/2)+n | 1 | Θ(n log n) | Merge sort |
| T(n)=2T(n/2)+1 | 1 | Θ(n) | Tree traversal |
| T(n)=4T(n/2)+n | 2 | Θ(n²) | — |
| T(n)=8T(n/2)+n² | 3 | Θ(n³) | Matrix multiply (naive) |
| T(n)=7T(n/2)+n² | log₂7 ≈ 2.81 | Θ(n^2.81) | Strassen |
| T(n)=3T(n/2)+n | log₂3 ≈ 1.585 | Θ(n^1.585) | Karatsuba |
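The three cases can be mechanised for the common exam pattern f(n) = n^k; a minimal sketch (the function name and output format are illustrative, not a standard API):

```python
import math

def master_polynomial(a, b, k):
    """Solve T(n) = a*T(n/b) + Theta(n^k) via the Master theorem.
    Handles only pure-polynomial driving functions f(n) = n^k,
    for which Case 3's regularity condition always holds."""
    p = math.log(a, b)                 # critical exponent p = log_b a
    if math.isclose(p, k):             # Case 2 (k = p): gain one log factor
        return f"Theta(n^{k} log n)" if k else "Theta(log n)"
    if k < p:                          # Case 1: recursion tree dominates
        return f"Theta(n^{p:.3g})"
    return f"Theta(n^{k})"             # Case 3: f(n) dominates

print(master_polynomial(2, 2, 1))  # merge sort    -> Theta(n^1 log n)
print(master_polynomial(1, 2, 0))  # binary search -> Theta(log n)
print(master_polynomial(7, 2, 2))  # Strassen      -> Theta(n^2.81)
```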
3. Special Recurrences
T(n)=T(n−1)+1 ⇒ Θ(n)
T(n)=T(n−1)+n ⇒ Θ(n²) [Insertion sort]
T(n)=2T(n−1)+1 ⇒ Θ(2ⁿ) [Tower of Hanoi]
T(n)=T(n−1)+T(n−2) ⇒ Θ(φⁿ) [Naive Fibonacci, φ=(1+√5)/2]
T(n)=T(√n)+1 ⇒ Θ(log log n)
T(n)=2T(n/2)+n/log n ⇒ Θ(n log log n) [Master fails; use Akra-Bazzi]
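The Θ(φⁿ) vs Θ(n) gap for Fibonacci is worth seeing concretely; a sketch counting calls in the naive recursion against a memoised version:

```python
from functools import lru_cache

calls = 0

def fib_naive(n):
    """Call count obeys C(n) = C(n-1) + C(n-2) + 1, i.e. Theta(phi^n)."""
    global calls
    calls += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Memoisation collapses the tree to Theta(n) distinct subproblems."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

fib_naive(20)
print(calls)          # 21891 calls for n = 20
print(fib_memo(20))   # 6765, using only 21 distinct subproblems
```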
4. Sorting Algorithms
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Counting sort | O(n+k) | O(n+k) | O(n+k) | O(k) | Yes |
| Radix sort | O(d(n+k)) | O(d(n+k)) | O(d(n+k)) | O(n+k) | Yes |
Lower bound for comparison sorts: Ω(n log n) (decision tree, n! leaves).
Selection sort makes the minimum number of swaps (at most n−1). Insertion sort runs in O(n + I) time, where I is the number of inversions (each inversion costs exactly one shift).
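Inversion counting itself is a classic GATE question: it can be done in O(n log n) by piggybacking on merge sort. A minimal sketch:

```python
def count_inversions(a):
    """Return (sorted copy, inversion count) in O(n log n) via merge sort.
    The count equals the number of shifts insertion sort would perform."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, li = count_inversions(a[:mid])
    right, ri = count_inversions(a[mid:])
    merged, inv = [], li + ri
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
            inv += len(left) - i    # all remaining left elements invert with right[j]
    merged += left[i:] + right[j:]
    return merged, inv

print(count_inversions([3, 1, 2])[1])        # 2
print(count_inversions([5, 4, 3, 2, 1])[1])  # 10 = n(n-1)/2, the worst case
```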
5. Graph Algorithm Complexities
| Algorithm | Time | Space | Notes |
|---|---|---|---|
| BFS / DFS | O(V+E) | O(V) | Adj. list |
| Dijkstra (binary heap) | O((V+E) log V) | O(V) | Non-neg weights |
| Dijkstra (Fib heap) | O(E + V log V) | O(V) | — |
| Bellman-Ford | O(VE) | O(V) | Neg edges OK |
| Floyd-Warshall | O(V³) | O(V²) | All-pairs |
| Kruskal | O(E log E) | O(V) | MST |
| Prim (heap) | O(E log V) | O(V) | MST |
| Topological sort | O(V+E) | O(V) | DAG only |
| Kosaraju / Tarjan | O(V+E) | O(V) | SCC |
| Edmonds-Karp | O(VE²) | O(V²) | Max flow |
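The O((V+E) log V) binary-heap Dijkstra from the table can be sketched with `heapq` and lazy deletion of stale heap entries (the adjacency-dict input shape here is just one convenient choice):

```python
import heapq

def dijkstra(adj, src):
    """Single-source shortest paths, non-negative weights.
    adj: {u: [(v, w), ...]} adjacency list. O((V+E) log V) with a binary heap."""
    dist = {u: float("inf") for u in adj}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:              # stale entry: lazy deletion
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))   # {'a': 0, 'b': 1, 'c': 3}
```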
6. Dynamic Programming Reference
| Problem | Time | Space |
|---|---|---|
| LCS (length m, n) | O(mn) | O(mn) or O(min(m,n)) |
| LIS (DP) | O(n²) | O(n) |
| LIS (patience sort) | O(n log n) | O(n) |
| 0-1 Knapsack | O(nW) | O(W) |
| Matrix Chain Mult. | O(n³) | O(n²) |
| Edit distance | O(mn) | O(mn) |
| Coin change (min) | O(n·W) | O(W) |
| Subset sum | O(n·S) | O(S) |
| Optimal BST | O(n³) | O(n²) |
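The O(n log n) LIS entry (patience sorting) is the one most often asked against the O(n²) DP; a sketch with `bisect`:

```python
from bisect import bisect_left

def lis_length(seq):
    """Longest strictly increasing subsequence length in O(n log n).
    tails[k] = smallest possible tail of an increasing subsequence of length k+1;
    tails stays sorted, so each element binary-searches its slot."""
    tails = []
    for x in seq:
        i = bisect_left(tails, x)    # bisect_left => strictly increasing
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

print(lis_length([10, 9, 2, 5, 3, 7, 101, 18]))  # 4  (e.g. 2, 3, 7, 18)
```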
7. Greedy Algorithm Complexities
Activity selection: O(n log n) Fractional knapsack: O(n log n)
Huffman coding: O(n log n) Job scheduling (EDF): O(n log n)
Dijkstra: O((V+E) log V) Kruskal: O(E log V) Prim: O(E log V)
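Huffman's O(n log n) bound comes from n−1 heap merges; a minimal sketch that tracks only code lengths (the dict-of-depths representation is an illustrative simplification, not the usual tree build):

```python
import heapq

def huffman_code_lengths(freqs):
    """Huffman code lengths via a min-heap: n-1 merges, O(n log n).
    freqs: {symbol: frequency}. Returns {symbol: code length in bits}."""
    # heap entries: (weight, tiebreak, {symbol: current depth})
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# CLRS running example: expected lengths a=1, b=c=d=3, e=f=4.
lengths = huffman_code_lengths({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5})
print(lengths['a'], lengths['b'], lengths['f'])  # 1 3 4
```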
8. NP-Completeness Quick Reference
P ⊆ NP. P = NP is open. NP-complete = NP ∩ NP-hard.
If ANY NP-complete problem ∈ P ⇒ P = NP.
Reduction chain: SAT → 3-SAT → Clique → Independent Set ↔ Vertex Cover → Set Cover
3-SAT → Hamiltonian Cycle → TSP
3-SAT → Subset Sum → Knapsack
2-SAT: in P (O(V+E)). 3-SAT: NP-complete.
TSP decision: NP-complete. TSP optimisation: NP-hard.
0-1 Knapsack: NP-complete (decision), NP-hard (optimisation), O(nW) pseudo-poly DP.
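"Pseudo-polynomial" means polynomial in the numeric value of the input, not its bit length; the O(n·S) Subset Sum DP from the reduction chain above makes this concrete:

```python
def subset_sum(nums, target):
    """Pseudo-polynomial DP for Subset Sum: O(n*S) time, O(S) space.
    NP-complete in general, yet fast when the target value S is small."""
    reachable = [True] + [False] * target   # reachable[s]: some subset sums to s
    for x in nums:
        for s in range(target, x - 1, -1):  # downward => each item used at most once
            reachable[s] = reachable[s] or reachable[s - x]
    return reachable[target]

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False
```

The same table, with weights indexing the array and values maximised, is the O(nW) 0-1 Knapsack DP.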
9. Amortised Complexities
Dynamic array insert: O(1) amortised
Stack multi-pop: O(1) amortised per op
Binary counter increment: O(1) amortised
Splay tree op: O(log n) amortised
Union-Find with path compression + union by rank: O(α(n)) amortised
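The Union-Find bound above depends on both optimisations being present; a compact sketch (using path halving, a common variant of path compression with the same amortised O(α(n)) guarantee, and a closure-based interface chosen just for brevity):

```python
def make_dsu(n):
    """Disjoint-set union with path halving + union by rank:
    amortised O(alpha(n)) per operation (alpha = inverse Ackermann)."""
    parent = list(range(n))
    rank = [0] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving: skip a level
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra == rb:
            return False
        if rank[ra] < rank[rb]:             # attach shorter tree under taller
            ra, rb = rb, ra
        parent[rb] = ra
        if rank[ra] == rank[rb]:
            rank[ra] += 1
        return True

    return find, union

find, union = make_dsu(5)
union(0, 1); union(1, 2)
print(find(0) == find(2), find(0) == find(4))  # True False
```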