How to Implement a Binary Search Tree — Insert, Delete and Traversal Explained
A complete, step-by-step guide with code in C, Python and Java — covering every operation that GATE CS and university exams test you on.
Last updated: April 12, 2026 | Reading time: ~25 minutes
What You Will Learn in This Article
- Exactly what a Binary Search Tree (BST) is and why its ordering rule makes it powerful.
- How to implement a BST node and the tree class from scratch in C, Python, and Java.
- Insert operation — algorithm, trace, code and complexity.
- Delete operation — all three cases (leaf / one child / two children) with clear diagrams and code.
- Four traversals — Inorder, Preorder, Postorder, and Level-Order — with example outputs.
- Three GATE CS level worked problems with detailed solutions.
- Five common BST mistakes that cost students marks in exams.
1. What Is a Binary Search Tree?
A Binary Search Tree (BST) is a special kind of binary tree where every node follows one simple but powerful rule:
• Every value in the left subtree of N is strictly less than N’s value.
• Every value in the right subtree of N is strictly greater than N’s value.
• Both the left and right subtrees are themselves valid BSTs.
This rule makes search fast: in a balanced tree, each comparison lets you discard roughly half of the remaining nodes, just like binary search on a sorted array. That is where the name comes from.
1.1 The BST Property Illustrated
Consider this BST with root 50:
         50
        /  \
      30    70
     /  \   / \
   20   40 60  80
Check the rule at root 50: all nodes on the left (20, 30, 40) are less than 50 ✓; all nodes on the right (60, 70, 80) are greater than 50 ✓. The same rule holds at every subtree root — 30, 70, 20, 40, 60, 80.
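This check can be done mechanically. Here is a minimal Python sketch (the Node class mirrors Section 2.2; the helper name `is_bst` is ours) that walks the tree carrying min/max bounds inherited from the ancestors:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def is_bst(node, lo=float("-inf"), hi=float("inf")):
    """Every node's data must lie strictly between the bounds
    inherited from its ancestors."""
    if node is None:
        return True
    if not (lo < node.data < hi):
        return False
    return is_bst(node.left, lo, node.data) and is_bst(node.right, node.data, hi)

# Build the tree from Section 1.1 by hand
root = Node(50)
root.left, root.right = Node(30), Node(70)
root.left.left, root.left.right = Node(20), Node(40)
root.right.left, root.right.right = Node(60), Node(80)
print(is_bst(root))  # True

# Break the property: put 35 inside the right subtree of 50
root.right.left.data = 35
print(is_bst(root))  # False
```

Note that checking only parent vs. child is not enough: 35 under 70 is a valid left child of 70, but it violates the rule at the root, which is exactly what the inherited bounds catch.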
1.2 BST vs. Plain Binary Tree
| Feature | Binary Tree | Binary Search Tree |
|---|---|---|
| Node placement rule | None — any arrangement | Left < Root < Right at every node |
| Search | O(n) — must check all nodes | O(log n) average |
| Sorted output | Needs extra work | Free — just do inorder traversal |
| Insert / Delete | No fixed position | Position determined by BST rule |
| Use cases | Expression trees, Huffman coding | Dictionaries, sets, ordered maps, database indexes |
Real language runtimes (C++ std::map, Java TreeMap) are built on a balanced BST variant — the red-black tree — and database indexes (like MySQL's) use the closely related B+ tree. Understanding the plain BST is your first step to understanding all of them.
2. Node Structure and Tree Skeleton
Before we can do any operation, we need to define what a BST node looks like. Every node holds three things:
- data — the integer (or any comparable key) stored at this node.
- left — a pointer/reference to the left child (or NULL/None).
- right — a pointer/reference to the right child (or NULL/None).
2.1 Node in C
#include <stdlib.h>   // for malloc

// Node structure
struct Node {
    int data;
    struct Node* left;
    struct Node* right;
};

// Helper: create a new node with given data
struct Node* newNode(int data) {
    struct Node* node = (struct Node*)malloc(sizeof(struct Node));
    node->data = data;
    node->left = NULL;
    node->right = NULL;
    return node;
}
2.2 Node in Python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

class BST:
    def __init__(self):
        self.root = None
2.3 Node in Java
class Node {
    int data;
    Node left, right;

    Node(int data) {
        this.data = data;
        this.left = null;
        this.right = null;
    }
}

class BST {
    Node root;

    BST() {
        root = null;
    }
}
The tree itself only needs to store one thing: a reference to the root node. Everything else hangs from it.
3. Insert Operation
3.1 Algorithm
Inserting a value is like searching for where it would be — and then placing it there. Here is the algorithm written as plain English:
1. If the tree is empty, the new node becomes the root. Done.
2. Compare the value to insert with the current node’s data.
3. If value < current.data → move to the left child. If left child is NULL, insert here.
4. If value > current.data → move to the right child. If right child is NULL, insert here.
5. If value == current.data → duplicate. Most BSTs either reject it or allow it on one side (convention varies). We reject duplicates in this guide.
6. Repeat from step 2 at the new current node.
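These steps translate directly into a loop. A minimal iterative Python sketch (Node as in Section 2.2; the name `insert_iterative` is ours):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def insert_iterative(root, data):
    """Walk down until an empty slot is found; reject duplicates."""
    if root is None:                  # Step 1: empty tree — new node is the root
        return Node(data)
    current = root
    while True:
        if data < current.data:       # Step 3: go left
            if current.left is None:
                current.left = Node(data)
                return root
            current = current.left
        elif data > current.data:     # Step 4: go right
            if current.right is None:
                current.right = Node(data)
                return root
            current = current.right
        else:                         # Step 5: duplicate — reject
            return root

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = insert_iterative(root, v)
print(root.data, root.left.data, root.right.data)  # 50 30 70
```

The recursive versions in Section 3.3 do the same walk; the loop simply makes the "move to the child" step explicit and avoids the call stack.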
3.2 Step-by-Step Trace
Let us insert the values 50, 30, 70, 20, 40, 60, 80 one by one and watch the tree grow.
| Step | Value Inserted | Comparison Path | Tree After Insert |
|---|---|---|---|
| 1 | 50 | Tree empty → root | 50 |
| 2 | 30 | 30 < 50 → go left → NULL → insert | 50(30, –) |
| 3 | 70 | 70 > 50 → go right → NULL → insert | 50(30, 70) |
| 4 | 20 | 20 < 50 → left (30); 20 < 30 → left → NULL → insert | 50(30(20, –), 70) |
| 5 | 40 | 40 < 50 → left (30); 40 > 30 → right → NULL → insert | 50(30(20, 40), 70) |
| 6 | 60 | 60 > 50 → right (70); 60 < 70 → left → NULL → insert | 50(30(20, 40), 70(60, –)) |
| 7 | 80 | 80 > 50 → right (70); 80 > 70 → right → NULL → insert | 50(30(20, 40), 70(60, 80)) |

(Notation: node(left, right); – marks an absent child.)

Final tree after step 7:

         50
        /  \
      30    70
     /  \   / \
   20   40 60  80
3.3 Code — C, Python and Java
C (recursive):
struct Node* insert(struct Node* root, int data) {
    // Base case: empty spot found — place the new node here
    if (root == NULL)
        return newNode(data);
    if (data < root->data)
        root->left = insert(root->left, data);
    else if (data > root->data)
        root->right = insert(root->right, data);
    // If data == root->data: duplicate, do nothing
    return root; // Return the (possibly unchanged) subtree root
}

// Usage
int main() {
    struct Node* root = NULL;
    root = insert(root, 50);
    root = insert(root, 30);
    root = insert(root, 70);
    root = insert(root, 20);
    root = insert(root, 40);
    root = insert(root, 60);
    root = insert(root, 80);
    return 0;
}
Python (recursive):
class BST:
    def __init__(self):
        self.root = None

    def _insert(self, node, data):
        if node is None:
            return Node(data)  # Empty slot — create node here
        if data < node.data:
            node.left = self._insert(node.left, data)
        elif data > node.data:
            node.right = self._insert(node.right, data)
        return node  # Return unchanged node if duplicate

    def insert(self, data):
        self.root = self._insert(self.root, data)

# Usage
bst = BST()
for val in [50, 30, 70, 20, 40, 60, 80]:
    bst.insert(val)
Java (recursive):
class BST {
    Node root;

    Node insert(Node root, int data) {
        if (root == null)
            return new Node(data); // Empty slot
        if (data < root.data)
            root.left = insert(root.left, data);
        else if (data > root.data)
            root.right = insert(root.right, data);
        return root;
    }

    void insert(int data) {
        root = insert(root, data);
    }
}
3.4 Complexity of Insert
• Best case: O(1) — inserting into an empty tree.
• Average case: O(log n) — the tree is roughly balanced, so each comparison discards about half of the remaining nodes.
• Worst case: O(n) — tree is completely skewed (like inserting 1, 2, 3, 4… in order); every insert traverses the entire chain.
Space Complexity: O(h) for the recursion call stack, where h is the tree height. Average O(log n), worst O(n).
4. Search Operation
Searching in a BST is the simplest operation to understand because you always know which direction to go. You never need to look at both sides at once.
1. If root is NULL → value not found. Return false.
2. If root.data == target → found! Return true.
3. If target < root.data → search in left subtree.
4. If target > root.data → search in right subtree.
C:
struct Node* search(struct Node* root, int key) {
    if (root == NULL || root->data == key)
        return root; // NULL means not found; root means found
    if (key < root->data)
        return search(root->left, key);
    return search(root->right, key);
}
Python:
def search(node, key):
    if node is None or node.data == key:
        return node  # None → not found; node → found
    if key < node.data:
        return search(node.left, key)
    return search(node.right, key)
Trace — searching for 40 in the tree {50,30,70,20,40,60,80}:
At 50: 40 < 50 → go left
At 30: 40 > 30 → go right
At 40: 40 == 40 → FOUND! (3 comparisons, not 7)
This is the power of BST: on a balanced tree with 1,000,000 nodes, you need at most 20 comparisons to find any value — because log₂(1,000,000) ≈ 20.
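You can verify the comparison counts from the trace yourself. A small sketch (Node and insert as in earlier sections; `count_search` is our own helper that also returns the number of nodes visited):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def count_search(node, key):
    """Return (found, number_of_nodes_visited)."""
    visits = 0
    while node is not None:
        visits += 1
        if key == node.data:
            return True, visits
        node = node.left if key < node.data else node.right
    return False, visits

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = insert(root, v)
print(count_search(root, 40))   # (True, 3)
print(count_search(root, 99))   # (False, 3)
```

Even an unsuccessful search for 99 visits only the path 50 → 70 → 80 before hitting NULL, never the whole tree.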
5. Delete Operation — All Three Cases
Deletion is the trickiest BST operation because removing a node can break the BST ordering property. The fix depends on how many children the node-to-delete has.
5.1 Case 1 — Deleting a Leaf Node (No Children)
A leaf node has no children. Removing it cannot affect any other part of the tree.
Example: Delete 20 from {50,30,70,20,40,60,80}.
Before:

         50
        /  \
      30    70
     /  \   / \
   20   40 60  80

After:

         50
        /  \
      30    70
        \   / \
        40 60  80
Action: simply remove the node. Set parent's left pointer to NULL.
5.2 Case 2 — Node with Exactly One Child
The node to delete has only one child (either left or right). You "bypass" the node by linking its parent directly to its child.
Example: Delete 30 when the tree has {50,30,70,40,60,80} (30 has only right child 40).
Before:

         50
        /  \
      30    70
        \   / \
        40 60  80

After:

         50
        /  \
      40    70
            / \
          60   80
Action: 50's left pointer (which pointed to 30) now points to 40.
Node 30 is freed from memory.
5.3 Case 3 — Node with Two Children (The Tricky Case)
When the node has both a left and right child, you cannot simply remove it — the tree would split in two. The standard approach is:
1. Find the node's inorder successor — the smallest value in its right subtree (go right once, then keep going left until you hit NULL).
2. Copy the inorder successor's data into the node you want to delete.
3. Delete the inorder successor from the right subtree (it will have at most one child — a right child — so this is Case 1 or Case 2, which we already know how to handle).
Why inorder successor? Because the inorder successor is the smallest value larger than the node being deleted. Replacing the deleted node's value with it preserves the BST property: it is still greater than everything in the left subtree and still smaller than everything that remains in the right subtree.
Example: Delete 50 (the root) from {50,30,70,20,40,60,80}.
Step 1 — Find inorder successor of 50:
Go right → 70. Then go left → 60. 60 has no left child.
Inorder successor = 60.
Step 2 — Replace 50's data with 60:
         60            ← was 50, now replaced with 60
        /  \
      30    70
     /  \   /  \
   20   40 [60] 80     ← 60 is now duplicated here (temporarily)
Step 3 — Delete the original 60 from the right subtree:
60 (at old position) is a leaf → Case 1, just remove it.
After:

         60
        /  \
      30    70
     /  \     \
   20   40    80
Alternative: Some implementations use the inorder predecessor (largest value in the left subtree) instead. Both are correct; they produce different but valid BSTs.
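As a sketch of that alternative, the two-children case with the inorder predecessor might look like this in Python (Node and insert as before; `_max_node` and `delete_pred` are our own names — the main code in Section 5.4 uses the successor):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def _max_node(node):
    """Largest value in a subtree: keep going right."""
    while node.right is not None:
        node = node.right
    return node

def delete_pred(node, key):
    if node is None:
        return None
    if key < node.data:
        node.left = delete_pred(node.left, key)
    elif key > node.data:
        node.right = delete_pred(node.right, key)
    else:
        if node.left is None:
            return node.right
        if node.right is None:
            return node.left
        pred = _max_node(node.left)        # largest value in the LEFT subtree
        node.data = pred.data
        node.left = delete_pred(node.left, pred.data)
    return node

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = insert(root, v)
root = delete_pred(root, 50)
print(root.data)  # 40 — the predecessor, not 60
```

Deleting 50 this way puts 40 at the root instead of 60; the inorder output is still sorted, just over a differently shaped tree.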
5.4 Full Delete Code
C (all three cases):
// Helper: find the node with the minimum value in a subtree
struct Node* minValueNode(struct Node* node) {
    struct Node* current = node;
    while (current->left != NULL)
        current = current->left;
    return current;
}

struct Node* deleteNode(struct Node* root, int key) {
    if (root == NULL)
        return root; // Key not found
    // Navigate to the correct position
    if (key < root->data)
        root->left = deleteNode(root->left, key);
    else if (key > root->data)
        root->right = deleteNode(root->right, key);
    else {
        // --- root is the node to delete ---
        // Case 1 & 2: zero or one child
        if (root->left == NULL) {
            struct Node* temp = root->right;
            free(root);
            return temp; // Replace node with its right child (or NULL)
        }
        if (root->right == NULL) {
            struct Node* temp = root->left;
            free(root);
            return temp; // Replace node with its left child
        }
        // Case 3: two children — find inorder successor
        struct Node* temp = minValueNode(root->right);
        // Copy successor's data into this node
        root->data = temp->data;
        // Delete the successor from the right subtree
        root->right = deleteNode(root->right, temp->data);
    }
    return root;
}
Python (all three cases):
def _min_node(node):
    """Returns the node with the smallest data in a subtree."""
    current = node
    while current.left is not None:
        current = current.left
    return current

def delete(node, key):
    if node is None:
        return node  # Key not found
    if key < node.data:
        node.left = delete(node.left, key)
    elif key > node.data:
        node.right = delete(node.right, key)
    else:
        # This node must be deleted
        if node.left is None:   # Case 1 or 2 (no left child)
            return node.right
        if node.right is None:  # Case 2 (no right child)
            return node.left
        # Case 3: two children
        successor = _min_node(node.right)
        node.data = successor.data  # Copy successor's value
        node.right = delete(node.right, successor.data)  # Delete successor
    return node

# In BST class:
# self.root = delete(self.root, key)
Java (all three cases):
Node minNode(Node root) {
    Node current = root;
    while (current.left != null)
        current = current.left;
    return current;
}

Node delete(Node root, int key) {
    if (root == null) return null;
    if (key < root.data)
        root.left = delete(root.left, key);
    else if (key > root.data)
        root.right = delete(root.right, key);
    else {
        if (root.left == null) return root.right; // Case 1 or 2
        if (root.right == null) return root.left; // Case 2
        // Case 3: two children
        Node successor = minNode(root.right);
        root.data = successor.data;
        root.right = delete(root.right, successor.data);
    }
    return root;
}
6. BST Traversals
A traversal visits every node in the tree in a specific order. We will use the tree {50,30,70,20,40,60,80} for all examples:
         50
        /  \
      30    70
     /  \   / \
   20   40 60  80
6.1 Inorder Traversal — Left → Root → Right
Inorder is the most important BST traversal because it visits nodes in ascending sorted order. This is true for any BST — it follows directly from the BST property.
Use cases: Printing all values sorted; finding the k-th smallest element; verifying a tree is a valid BST.
C:
void inorder(struct Node* root) {
    if (root == NULL) return;
    inorder(root->left);        // 1. Visit left subtree
    printf("%d ", root->data);  // 2. Process this node
    inorder(root->right);       // 3. Visit right subtree
}
Python:
def inorder(node):
    if node is None:
        return
    inorder(node.left)
    print(node.data, end=" ")
    inorder(node.right)
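As a concrete use of the sorted-order property, here is a sketch of finding the k-th smallest key with an explicit-stack inorder walk (`kth_smallest` is our own name; Node and insert as in earlier sections):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def kth_smallest(root, k):
    """Inorder walk with an explicit stack; stop at the k-th visit."""
    stack, node = [], root
    while stack or node:
        while node:                # drill down to the leftmost unvisited node
            stack.append(node)
            node = node.left
        node = stack.pop()
        k -= 1
        if k == 0:
            return node.data
        node = node.right
    return None                    # fewer than k nodes in the tree

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = insert(root, v)
print(kth_smallest(root, 3))  # 40 — third element of 20 30 40 50 60 70 80
```

Because the walk stops as soon as the k-th node is visited, it does not have to traverse the whole tree.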
6.2 Preorder Traversal — Root → Left → Right
Preorder visits the current node before its children. The root is always printed first.
Use cases: Copying or serializing a tree (the preorder sequence can be used to reconstruct the BST exactly). Also used in expression trees.
void preorder(struct Node* root) {
    if (root == NULL) return;
    printf("%d ", root->data);  // 1. Process this node first
    preorder(root->left);       // 2. Visit left subtree
    preorder(root->right);      // 3. Visit right subtree
}
6.3 Postorder Traversal — Left → Right → Root
Postorder visits both children before the current node. The root is always last.
Use cases: Deleting a tree from memory (must delete children before parent); evaluating expression trees; computing directory sizes (must process subdirectories before parent).
void postorder(struct Node* root) {
    if (root == NULL) return;
    postorder(root->left);      // 1. Visit left subtree
    postorder(root->right);     // 2. Visit right subtree
    printf("%d ", root->data);  // 3. Process this node last
}
6.4 Level-Order Traversal (BFS)
Level-order visits nodes level by level, left to right. It uses a queue instead of recursion.
Use cases: Printing the tree level by level; computing the height or width iteratively; any task that must process nodes in breadth-first order.
Python (using collections.deque):
from collections import deque

def level_order(root):
    if root is None:
        return
    queue = deque([root])
    while queue:
        node = queue.popleft()
        print(node.data, end=" ")  # Process current node
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
C (using an array-based queue simulation):
#define MAX 100
void levelOrder(struct Node* root) {
    if (root == NULL) return;
    struct Node* queue[MAX];
    int front = 0, rear = 0;
    queue[rear++] = root;
    while (front < rear) {
        struct Node* node = queue[front++];
        printf("%d ", node->data);
        if (node->left != NULL) queue[rear++] = node->left;
        if (node->right != NULL) queue[rear++] = node->right;
    }
}
Traversal Summary Table
| Traversal | Order | Output (for our tree) | Key Use Case |
|---|---|---|---|
| Inorder | Left → Root → Right | 20 30 40 50 60 70 80 | Sorted output, BST validation |
| Preorder | Root → Left → Right | 50 30 20 40 70 60 80 | Serialization, copying tree |
| Postorder | Left → Right → Root | 20 40 30 60 80 70 50 | Deletion, expression evaluation |
| Level-Order | Level by level (BFS) | 50 30 70 20 40 60 80 | Level-by-level processing, breadth-first tasks |
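The four output columns can be reproduced with one short script (Node and insert as in earlier sections; the traversals here return lists so the outputs are easy to compare):

```python
from collections import deque

class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def inorder(n):   return inorder(n.left) + [n.data] + inorder(n.right) if n else []
def preorder(n):  return [n.data] + preorder(n.left) + preorder(n.right) if n else []
def postorder(n): return postorder(n.left) + postorder(n.right) + [n.data] if n else []

def level_order(root):
    out, q = [], deque([root])
    while q:
        node = q.popleft()
        out.append(node.data)
        if node.left:  q.append(node.left)
        if node.right: q.append(node.right)
    return out

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = insert(root, v)
print(inorder(root))      # [20, 30, 40, 50, 60, 70, 80]
print(preorder(root))     # [50, 30, 20, 40, 70, 60, 80]
print(postorder(root))    # [20, 40, 30, 60, 80, 70, 50]
print(level_order(root))  # [50, 30, 70, 20, 40, 60, 80]
```

Note that the level-order output equals the insertion order here only because we inserted the keys level by level; in general the two differ.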
7. Complete Complexity Table
| Operation | Best Case | Average Case | Worst Case | Space (Call Stack) |
|---|---|---|---|---|
| Insert | O(1) | O(log n) | O(n) | O(h) |
| Search | O(1) | O(log n) | O(n) | O(h) |
| Delete | O(1) | O(log n) | O(n) | O(h) |
| Inorder | O(n) | O(n) | O(n) | O(h) |
| Preorder | O(n) | O(n) | O(n) | O(h) |
| Postorder | O(n) | O(n) | O(n) | O(h) |
| Level-Order | O(n) | O(n) | O(n) | O(w) — max width |
• h = height of tree. For a balanced BST: h = O(log n). For a skewed BST: h = O(n).
• w = maximum width of tree. In a complete BST the last level can hold roughly n/2 nodes, so w = O(n).
• Worst case O(n) for Insert/Search/Delete happens when inserting already-sorted or reverse-sorted data into a plain BST.
• To guarantee O(log n) always: use AVL Trees or Red-Black Trees.
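The skew effect is easy to measure. A sketch comparing the height (in edges) after sorted vs shuffled insertion of the same 127 keys (the fixed random seed keeps the run reproducible; the exact shuffled height depends on the seed):

```python
import random

class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def build(values):
    root = None
    for v in values:
        root = insert(root, v)
    return root

def height(node):
    """Height in edges; an empty tree has height -1."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

n = 127  # a perfectly balanced tree of 127 nodes would have height 6

chain = build(range(1, n + 1))  # sorted input → right-skewed chain
print(height(chain))            # 126 — the maximum possible

random.seed(42)
vals = list(range(1, n + 1))
random.shuffle(vals)
shuffled = build(vals)
print(height(shuffled))         # typically around 2·log2(n), far below 126
```

The "check height after building" advice from Section 9 is exactly this: if the measured height is close to n, the tree has degenerated.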
8. GATE CS Level Worked Examples
Example 1 — GATE 2019 (DS) Type
Question: The keys 10, 5, 20, 3, 7, 15, 25 are inserted, in that order, into an initially empty BST. Write the inorder traversal of the resulting tree and state its height.
Solution:
Step 1 — Build the tree by inserting each value:
Insert 10 → root
Insert 5 → 5 < 10 → left of 10
Insert 20 → 20 > 10 → right of 10
Insert 3 → 3 < 10 → left; 3 < 5 → left of 5
Insert 7 → 7 < 10 → left; 7 > 5 → right of 5
Insert 15 → 15 > 10 → right; 15 < 20 → left of 20
Insert 25 → 25 > 10 → right; 25 > 20 → right of 20
Final tree:

         10
        /  \
       5    20
      / \   / \
     3   7 15  25
Step 2 — Inorder (Left → Root → Right): 3, 5, 7, 10, 15, 20, 25
Step 3 — Height: longest path from root to leaf = 10 → 5 → 3 (or 10→5→7 or 10→20→15 or 10→20→25) = 2 edges = height 2. (If height is counted as number of nodes: 3.)
Inorder: 3 5 7 10 15 20 25 (sorted — always true for BST)
Height (edges): 2 | Height (nodes): 3
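Both answers can be confirmed mechanically (Node, insert, inorder and height helpers as defined earlier in this guide):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = self.right = None

def insert(node, data):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)
    return node

def inorder(n):
    return inorder(n.left) + [n.data] + inorder(n.right) if n else []

def height(n):
    """Height in edges; empty tree = -1."""
    return -1 if n is None else 1 + max(height(n.left), height(n.right))

root = None
for v in [10, 5, 20, 3, 7, 15, 25]:
    root = insert(root, v)
print(inorder(root))  # [3, 5, 7, 10, 15, 20, 25]
print(height(root))   # 2 (edges)
```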
Example 2 — Delete and Verify BST Property
Question: Delete the key 10 from the BST below, then verify that the BST property still holds.

         15
        /  \
      10    20
     /  \
    5    12
Solution:
Node 10 has two children (5 and 12) → we are in Case 3.
Step 1: Find inorder successor of 10.
Go to right child of 10 = 12.
12 has no left child → inorder successor = 12.
Step 2: Copy 12 into node 10's position. Tree temporarily:

         15
        /  \
      12    20
     /  \
    5    12   ← duplicate, will be deleted next
Step 3: Delete the original 12 from the right subtree of (now-12) node.
12 is a leaf → Case 1, just remove it.
Final tree:

         15
        /  \
      12    20
     /
    5
Verify BST property:
- At root 15: left subtree (12, 5) all < 15 ✓; right subtree (20) all > 15 ✓
- At node 12: left child (5) < 12 ✓; no right child ✓
Example 3 — Worst Case BST and Skewness
Question: The keys 1, 2, 3, 4, 5 are inserted, in that order, into an initially empty BST. Draw the resulting tree, state its height, and give the cost of searching for 5.
Solution:
Insert 1 → root
Insert 2 → 2 > 1 → right of 1
Insert 3 → 3 > 1 → right; 3 > 2 → right of 2
Insert 4 → ... right of 3
Insert 5 → ... right of 4
Tree:

1
 \
  2
   \
    3
     \
      4
       \
        5
This is a right-skewed BST — the absolute worst case. It looks like a linked list.
Height: 4 edges (or 5 nodes) — maximum possible for 5 elements.
Time to search for 5: must visit 1→2→3→4→5 = 5 comparisons = O(n).
Compare with a balanced BST: the height would be 2 (for 5 nodes), so any search takes at most 3 comparisons — O(log n).
This is why data that arrives in sorted order kills BST performance — and why balanced BSTs (AVL, Red-Black) exist. See the Advanced Trees page for how AVL trees fix this.
9. Five Common Mistakes in BST Implementation
| # | Mistake | What Goes Wrong | The Fix |
|---|---|---|---|
| 1 | Forgetting to return the root pointer from insert | In C/Java, if the insert function does not return the modified subtree root, the parent's pointer is never updated. Insertions appear to silently succeed but the tree is never modified. | Always return the root node from the recursive insert function and assign it back: root->left = insert(root->left, data); |
| 2 | Confusing inorder successor with the right child | During two-child deletion, beginners often just replace the deleted node with its right child. This works only if the right child has no left subtree — it breaks the BST otherwise. | Always find the minimum of the right subtree by going left until you hit NULL. That is the true inorder successor. |
| 3 | Not handling the NULL root case in traversal | Calling inorder(root->left) when root is already NULL causes a null-pointer crash or segmentation fault. | Always add the base case first: if (root == NULL) return; at the start of every recursive traversal function. |
| 4 | Inserting sorted data without realising it creates a skewed tree | If your input data is already sorted (common in real programs — reading IDs from a database), your BST becomes O(n) for all operations. Students are surprised when their "fast" BST is slow. | Shuffle data before insertion, or use a self-balancing BST. Check height after building: if height ≈ n, the tree is skewed. |
| 5 | Memory leak after deletion in C | In Case 1 and Case 2 deletion, the removed node is bypassed and its pointer is returned to the parent — but free() is never called on it. Over many deletions, this leaks memory. | Store the pointer to the node being removed in a temp variable, return its child to the parent, and then call free(temp) before returning. |
10. Frequently Asked Questions
Q1. What is the time complexity of insert, delete and search in a BST?
Average case is O(log n) for all three operations when the tree is balanced. This is because the BST property lets you eliminate half the remaining nodes at each level — exactly like binary search on a sorted array. However, worst case degrades to O(n) when the tree becomes a skewed chain (happens when you insert already-sorted data into a plain BST). Self-balancing variants like AVL or Red-Black trees guarantee O(log n) in all cases.
Q2. Why does deletion in a BST have three cases?
Because the node to be deleted can be in three structural situations: (1) It is a leaf — simply remove it, no pointers need to be redirected. (2) It has exactly one child — splice the child in its place; the BST property is still satisfied because the child's subtree was already correctly ordered relative to the deleted node. (3) It has two children — you cannot just remove it without splitting the tree. The solution is to copy the inorder successor's value into the node and delete the successor instead. The successor is always in the simpler Case 1 or Case 2 situation, so this avoids infinite recursion.
Q3. Which traversal of a BST gives elements in sorted order?
Inorder traversal (Left → Root → Right) always visits BST nodes in non-decreasing sorted order. This follows directly from the BST property: by the time you visit a node, you have already visited everything in its left subtree (all smaller), and you will visit everything in its right subtree next (all larger). This property is the reason BSTs are so useful — sorted access is free, with no extra sorting step needed.
Q4. What happens to a BST if you insert already sorted data?
Inserting elements in sorted order (e.g., 10, 20, 30, 40, ...) causes every new node to always go to the right child because each new value is larger than all previous ones. The tree degenerates into a right-skewed linked list with height n−1. All operations then take O(n) time. The reverse also happens with reverse-sorted data — you get a left-skewed tree. The fix is either to shuffle the input before inserting, or to use a self-balancing BST like an AVL tree or Red-Black tree, which automatically re-balances after every insert and delete to keep height at O(log n).