
Algorithms — GATE CS Formula Sheet

All key complexities, recurrences, and results consolidated for last-minute GATE revision.

Last updated: April 2026  |  GATE CS 2024–2026 syllabus

How to Use This Sheet

  • Revise 24–48 hours before the exam for rapid recall.
  • Cover each section and attempt to recall values before looking.
  • Pay special attention to edge cases: negative edges, NP reductions, amortised costs.

1. Asymptotic Notations

Growth order: O(1) < O(log n) < O(√n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
Limit: lim f/g = 0 ⇒ f=o(g)    lim f/g = c>0 ⇒ f=Θ(g)    lim f/g = ∞ ⇒ f=ω(g)
log(n!) = Θ(n log n)    n! = o(nⁿ)    log^k n = o(n^ε) for any k > 0, ε > 0
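The log(n!) = Θ(n log n) fact can be checked numerically: a minimal sketch (the helper name `ratio` is illustrative) using `math.lgamma`, since lgamma(n+1) = ln(n!):

```python
import math

# log(n!) grows like n log n: the ratio ln(n!) / (n ln n) tends to 1
# as n grows (Stirling's approximation in action).
def ratio(n: int) -> float:
    return math.lgamma(n + 1) / (n * math.log(n))

print(ratio(10))      # still noticeably below 1 for small n
print(ratio(10**6))   # approaches 1 for large n
```

The convergence is slow (the error term is about 1/ln n), which is why small-n experiments can mislead during revision.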

2. Master Theorem

T(n) = aT(n/b) + f(n),  a ≥ 1, b > 1,  p = log_b a
Case 1: f(n) = O(n^(p−ε)) ⇒ T(n) = Θ(nᵖ)
Case 2: f(n) = Θ(nᵖ log^k n) ⇒ T(n) = Θ(nᵖ log^(k+1) n)
Case 3: f(n) = Ω(n^(p+ε)) AND a·f(n/b) ≤ c·f(n) for some constant c < 1 ⇒ T(n) = Θ(f(n))
Recurrence             p = log_b a      Solution      Example
T(n) = T(n/2) + 1      0                Θ(log n)      Binary search
T(n) = 2T(n/2) + n     1                Θ(n log n)    Merge sort
T(n) = 2T(n/2) + 1     1                Θ(n)          Tree traversal
T(n) = 4T(n/2) + n     2                Θ(n²)
T(n) = 8T(n/2) + n²    3                Θ(n³)         Matrix multiply (naive)
T(n) = 7T(n/2) + n²    log₂7 ≈ 2.81     Θ(n^2.81)     Strassen
T(n) = 3T(n/2) + n     log₂3 ≈ 1.585    Θ(n^1.585)    Karatsuba
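The case analysis for f(n) = n^d can be mechanised, which is useful for checking table entries quickly. A sketch of the simplified Master Theorem (the function name `master` is illustrative):

```python
import math

# Simplified Master Theorem for T(n) = a*T(n/b) + n^d, with p = log_b(a):
#   d < p  ->  Theta(n^p)        (leaves dominate)
#   d = p  ->  Theta(n^d log n)  (balanced)
#   d > p  ->  Theta(n^d)        (root dominates)
def master(a: int, b: int, d: float) -> str:
    p = math.log(a, b)
    if math.isclose(d, p):
        return f"Theta(n^{d:g} log n)"
    if d < p:
        return f"Theta(n^{p:g})"
    return f"Theta(n^{d:g})"

print(master(2, 2, 1))   # merge sort
print(master(7, 2, 2))   # Strassen: p = log2(7), about 2.81
```

Note this only covers polynomial f(n); Case 2 with extra log factors and the Case 3 regularity condition need the full statement above.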

3. Special Recurrences

T(n)=T(n−1)+1 ⇒ Θ(n)
T(n)=T(n−1)+n ⇒ Θ(n²)    [Insertion sort]
T(n)=2T(n−1)+1 ⇒ Θ(2ⁿ)    [Tower of Hanoi]
T(n)=T(n−1)+T(n−2) ⇒ Θ(φⁿ)    [Naive Fibonacci, φ=(1+√5)/2]
T(n)=T(√n)+1 ⇒ Θ(log log n)
T(n)=2T(n/2)+n/log n ⇒ Θ(n log log n)    [Master fails; use Akra-Bazzi]
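The Tower of Hanoi recurrence above resolves exactly to 2ⁿ − 1, which is easy to verify by computing the recurrence directly (a minimal sketch; the function name is illustrative):

```python
# Tower of Hanoi: T(n) = 2T(n-1) + 1 with T(0) = 0 gives exactly 2^n - 1
# moves, matching the Theta(2^n) entry above.
def hanoi_moves(n: int) -> int:
    if n == 0:
        return 0
    return 2 * hanoi_moves(n - 1) + 1

assert hanoi_moves(5) == 2**5 - 1   # 31 moves for 5 discs
```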

4. Sorting Algorithms

Algorithm        Best         Average      Worst        Space      Stable
Bubble sort      O(n)         O(n²)        O(n²)        O(1)       Yes
Selection sort   O(n²)        O(n²)        O(n²)        O(1)       No
Insertion sort   O(n)         O(n²)        O(n²)        O(1)       Yes
Merge sort       O(n log n)   O(n log n)   O(n log n)   O(n)       Yes
Quick sort       O(n log n)   O(n log n)   O(n²)        O(log n)   No
Heap sort        O(n log n)   O(n log n)   O(n log n)   O(1)       No
Counting sort    O(n+k)       O(n+k)       O(n+k)       O(k)       Yes
Radix sort       O(d(n+k))    O(d(n+k))    O(d(n+k))    O(n+k)     Yes
Lower bound for comparison sorts: Ω(n log n) (decision tree, n! leaves).
Selection sort makes the minimum number of swaps (at most n−1). Insertion sort performs exactly I element shifts, where I is the number of inversions, so it runs in O(n + I).
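The shifts-equal-inversions fact is worth internalising; a sketch that counts both independently and confirms they match (helper names are illustrative):

```python
# Insertion sort's shift count equals the number of inversions I,
# giving the O(n + I) running time quoted above.
def insertion_sort_shifts(a):
    a = list(a)
    shifts = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift one element right
            j -= 1
            shifts += 1
        a[j + 1] = key
    return a, shifts

def inversions(a):
    # brute-force O(n^2) count of pairs (i, j) with i < j and a[i] > a[j]
    return sum(a[i] > a[j] for i in range(len(a)) for j in range(i + 1, len(a)))

data = [5, 2, 4, 6, 1, 3]
sorted_a, shifts = insertion_sort_shifts(data)
assert sorted_a == sorted(data) and shifts == inversions(data)
```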

5. Graph Algorithm Complexities

Algorithm                Time              Space    Notes
BFS / DFS                O(V+E)            O(V)     Adjacency list
Dijkstra (binary heap)   O((V+E) log V)    O(V)     Non-negative weights
Dijkstra (Fib heap)      O(E + V log V)    O(V)     Non-negative weights
Bellman-Ford             O(VE)             O(V)     Negative edges OK
Floyd-Warshall           O(V³)             O(V²)    All-pairs
Kruskal                  O(E log E)        O(V)     MST
Prim (heap)              O(E log V)        O(V)     MST
Topological sort         O(V+E)            O(V)     DAG only
Kosaraju / Tarjan        O(V+E)            O(V)     SCC
Edmonds-Karp             O(VE²)            O(V²)    Max flow
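Dijkstra with a binary heap is the variant GATE asks about most often; a minimal sketch using Python's `heapq` (graph representation and names are illustrative):

```python
import heapq

# Dijkstra with a binary heap: O((V + E) log V). Requires non-negative
# weights; stale heap entries are skipped rather than decrease-keyed.
def dijkstra(adj, src):
    dist = {src: 0}
    pq = [(0, src)]                 # (distance, vertex) min-heap
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                # stale entry, already improved
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
assert dijkstra(g, "a") == {"a": 0, "b": 1, "c": 3}
```

Skipping stale entries (lazy deletion) keeps the heap at O(E) entries, which is why the bound is O((V+E) log V) rather than the O(E + V log V) of a Fibonacci heap with true decrease-key.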

6. Dynamic Programming Reference

Problem                 Time          Space
LCS (lengths m, n)      O(mn)         O(mn) or O(min(m,n))
LIS (DP)                O(n²)         O(n)
LIS (patience sort)     O(n log n)    O(n)
0-1 Knapsack            O(nW)         O(W)
Matrix Chain Mult.      O(n³)         O(n²)
Edit distance           O(mn)         O(mn)
Coin change (min)       O(n·W)        O(W)
Subset sum              O(n·S)        O(S)
Optimal BST             O(n³)         O(n²)
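The O(W)-space knapsack entry relies on a trick GATE likes to test: iterating capacities downward so each item is counted at most once. A minimal sketch (function name is illustrative):

```python
# 0-1 knapsack in O(nW) time and O(W) space. Capacities are scanned
# downwards so dp[c - w] still refers to the previous item's row,
# ensuring each item is taken at most once.
def knapsack(weights, values, W):
    dp = [0] * (W + 1)
    for w, v in zip(weights, values):
        for c in range(W, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[W]

assert knapsack([1, 3, 4], [15, 20, 30], 4) == 35   # take items of weight 1 and 3
```

Scanning upwards instead would allow repetition of items, silently turning this into the unbounded knapsack.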

7. Greedy Algorithm Complexities

Activity selection: O(n log n)    Fractional knapsack: O(n log n)
Huffman coding: O(n log n)    Job scheduling (EDF): O(n log n)
Dijkstra: O((V+E) log V)    Kruskal: O(E log V)    Prim: O(E log V)
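For all four O(n log n) entries above, the sort dominates; activity selection makes the pattern clear. A minimal sketch (function name and interval format are illustrative):

```python
# Activity selection: sort by finish time, then greedily take each
# activity that starts at or after the last chosen finish time.
# O(n log n), entirely due to the sort.
def max_activities(intervals):
    count, last_finish = 0, float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:
            count += 1
            last_finish = finish
    return count

assert max_activities([(1, 2), (3, 4), (0, 6), (5, 7), (8, 9)]) == 4
```

The exchange argument behind the greedy choice (earliest finish leaves the most room) is the standard proof pattern GATE expects for greedy correctness questions.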

8. NP-Completeness Quick Reference

P ⊆ NP. P = NP is open. NP-complete = NP ∩ NP-hard.
If ANY NP-complete problem ∈ P ⇒ P = NP.

Reduction chain: SAT → 3-SAT → Clique → Independent Set ↔ Vertex Cover → Set Cover
                 3-SAT → Hamiltonian Cycle → TSP
                 3-SAT → Subset Sum → Knapsack

2-SAT: in P (O(V+E)). 3-SAT: NP-complete.
TSP decision: NP-complete. TSP optimisation: NP-hard.
0-1 Knapsack: NP-complete (decision), NP-hard (optimisation), O(nW) pseudo-poly DP.
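The pseudo-polynomial point deserves emphasis: O(n·S) looks polynomial, but S is exponential in the input's bit length. A sketch of the subset-sum DP using a reachable-sums set (names are illustrative):

```python
# Subset sum in pseudo-polynomial O(n*S) time. "Pseudo" because the
# target S can be exponential in the number of input bits, so this
# does NOT place subset sum in P.
def subset_sum(nums, S):
    reachable = {0}
    for x in nums:
        # add x to every previously reachable sum, capped at S
        reachable |= {r + x for r in reachable if r + x <= S}
    return S in reachable

assert subset_sum([3, 34, 4, 12, 5, 2], 9) is True    # 4 + 5
assert subset_sum([3, 34, 4, 12, 5, 2], 30) is False
```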

9. Amortised Complexities

Dynamic array insert: O(1) amortised
Stack multi-pop: O(1) amortised per op
Binary counter increment: O(1) amortised
Splay tree op: O(log n) amortised
Union-Find with path compression + union by rank: O(α(n)) amortised
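The binary counter is the classic aggregate-analysis example: n increments cause at most 2n total bit flips. A sketch that counts flips explicitly (names are illustrative):

```python
# Binary counter increment: bit i flips every 2^i increments, so n
# increments cost at most n * (1 + 1/2 + 1/4 + ...) < 2n flips total,
# i.e. O(1) amortised per increment.
def count_flips(n_increments, bits=32):
    counter = [0] * bits   # counter[0] is the least significant bit
    flips = 0
    for _ in range(n_increments):
        i = 0
        while i < bits and counter[i] == 1:
            counter[i] = 0   # carry: flip a trailing 1 to 0
            flips += 1
            i += 1
        if i < bits:
            counter[i] = 1   # flip the final 0 to 1
            flips += 1
    return flips

n = 1000
assert count_flips(n) <= 2 * n
```

The same aggregate argument pattern gives the O(1) amortised bounds for dynamic array doubling and multi-pop above.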