Data Structures · Advanced · 20 min read · Updated March 2025

AVL Trees

AVL trees are self-balancing binary search trees that maintain a height balance factor of at most 1 for every node. They guarantee O(log n) search, insert, and delete operations by performing rotations after each modification.

The Problem with Unbalanced BSTs

A regular BST can degenerate into a linked list when elements are inserted in sorted order. For example, inserting 1, 2, 3, 4, 5 creates a right-skewed tree with height n, making all operations O(n).
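A quick sketch makes the degeneration concrete (the `BSTNode`/`bst_insert` names here are illustrative, not part of the AVL implementation below):

```python
# Minimal plain BST: no rebalancing, so sorted input builds a chain.
class BSTNode:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

def bst_insert(root, val):
    if root is None:
        return BSTNode(val)
    if val < root.val:
        root.left = bst_insert(root.left, val)
    else:
        root.right = bst_insert(root.right, val)
    return root

def height(node):
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

root = None
for val in [1, 2, 3, 4, 5]:
    root = bst_insert(root, val)

print(height(root))  # 5 -- one node per level, effectively a linked list
```

Every insert lands in the right subtree of the previous node, so height equals n and every lookup walks the whole chain.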

AVL trees (named after Adelson-Velsky and Landis, 1962) solve this by maintaining a balance factor for every node:

balance_factor = height(left_subtree) - height(right_subtree)

An AVL tree requires that the balance factor of every node is -1, 0, or +1. After any insertion or deletion that violates this property, the tree performs rotations to restore balance.

The Four Rotation Cases

When the balance factor becomes +2 or -2, one of four rotation cases applies:

  • Left-Left (LL) Case — Node is left-heavy, left child is left-heavy or balanced. Fix: Single right rotation.
  • Right-Right (RR) Case — Node is right-heavy, right child is right-heavy or balanced. Fix: Single left rotation.
  • Left-Right (LR) Case — Node is left-heavy, left child is right-heavy. Fix: Left rotation on left child, then right rotation on node.
  • Right-Left (RL) Case — Node is right-heavy, right child is left-heavy. Fix: Right rotation on right child, then left rotation on node.
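The double-rotation cases are the least intuitive, so here is a minimal trace of the Left-Right case (illustrative `Node`/`rotate_*` names; heights are not tracked here, the full implementation below maintains them):

```python
# Sketch of the LR case on three nodes. Inserting 3, 1, 2 makes node 3
# left-heavy while its left child (1) is right-heavy.
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def rotate_left(z):
    y = z.right
    z.right, y.left = y.left, z
    return y

def rotate_right(z):
    y = z.left
    z.left, y.right = y.right, z
    return y

#   3             3            2
#  /    left     /    right   / \
# 1     --->    2     --->   1   3
#  \           /
#   2         1
root = Node(3, left=Node(1, right=Node(2)))
root.left = rotate_left(root.left)   # step 1: left rotation on left child
root = rotate_right(root)            # step 2: right rotation on node
print(root.val, root.left.val, root.right.val)  # 2 1 3
```

After the two rotations the middle key (2) becomes the subtree root, which is exactly what a single rotation cannot achieve when the heavy grandchild sits on the "inside" of the subtree.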

AVL Tree Implementation

A complete AVL tree with insert and rotations:

python
class AVLNode:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None
        self.height = 1

class AVLTree:
    def _height(self, node):
        return node.height if node else 0

    def _balance_factor(self, node):
        return self._height(node.left) - self._height(node.right) if node else 0

    def _update_height(self, node):
        node.height = 1 + max(self._height(node.left), self._height(node.right))

    def _rotate_right(self, z):
        """Right rotation around z"""
        y = z.left
        T3 = y.right
        y.right = z
        z.left = T3
        self._update_height(z)
        self._update_height(y)
        return y   # New root of this subtree

    def _rotate_left(self, z):
        """Left rotation around z"""
        y = z.right
        T2 = y.left
        y.left = z
        z.right = T2
        self._update_height(z)
        self._update_height(y)
        return y   # New root of this subtree

    def _rebalance(self, node):
        self._update_height(node)
        bf = self._balance_factor(node)

        # Left-Left Case
        if bf > 1 and self._balance_factor(node.left) >= 0:
            return self._rotate_right(node)

        # Left-Right Case
        if bf > 1 and self._balance_factor(node.left) < 0:
            node.left = self._rotate_left(node.left)
            return self._rotate_right(node)

        # Right-Right Case
        if bf < -1 and self._balance_factor(node.right) <= 0:
            return self._rotate_left(node)

        # Right-Left Case
        if bf < -1 and self._balance_factor(node.right) > 0:
            node.right = self._rotate_right(node.right)
            return self._rotate_left(node)

        return node   # Already balanced

    def insert(self, root, val):
        if not root:
            return AVLNode(val)
        if val < root.val:
            root.left = self.insert(root.left, val)
        elif val > root.val:
            root.right = self.insert(root.right, val)
        else:
            return root   # Duplicate
        return self._rebalance(root)

# Test: Insert sorted data (would degenerate in plain BST)
avl = AVLTree()
root = None
for val in [1, 2, 3, 4, 5, 6, 7]:
    root = avl.insert(root, val)

print(f"Root: {root.val}")          # 4 (balanced!)
print(f"Height: {root.height}")     # 3 (log2(7) ~ 2.8)
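To sanity-check the result, one can verify both AVL invariants — BST ordering and per-node balance — directly. The checker below is a standalone sketch (its `Node`/`check` names are illustrative), run against the tree shape the insert loop above produces:

```python
# Illustrative AVL validity check: confirms in-order keys are sorted and
# that subtree heights differ by at most 1 at every node.
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def check(node):
    """Return (is_avl, height, in_order_values) for the subtree."""
    if node is None:
        return True, 0, []
    ok_l, h_l, vals_l = check(node.left)
    ok_r, h_r, vals_r = check(node.right)
    vals = vals_l + [node.val] + vals_r
    balanced = abs(h_l - h_r) <= 1
    ordered = vals == sorted(vals)
    return ok_l and ok_r and balanced and ordered, 1 + max(h_l, h_r), vals

# Shape produced by inserting 1..7 above: 4 at the root, 2 and 6 below it
root = Node(4, Node(2, Node(1), Node(3)), Node(6, Node(5), Node(7)))
ok, h, vals = check(root)
print(ok, h, vals)  # True 3 [1, 2, 3, 4, 5, 6, 7]
```

A checker like this is also useful as a unit-test helper when implementing the rotations yourself, since a single sign error in a balance-factor comparison silently produces an unbalanced (but still ordered) tree.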

AVL vs Red-Black Trees

Both AVL and Red-Black trees are self-balancing BSTs, but they make different tradeoffs:

  • AVL trees are more strictly balanced: subtree heights differ by at most 1 at every node, so the tree is shorter and lookups are faster.
  • Red-Black trees allow slightly more imbalance (height <= 2*log2(n+1)) but need fewer rotations on insert/delete — faster writes.
  • AVL trees are preferred for read-heavy workloads (databases with frequent queries).
  • Red-Black trees are preferred for write-heavy workloads. Used in Java TreeMap, C++ std::map, Linux kernel.
  • Both guarantee O(log n) for all operations in the worst case.
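The height gap can be quantified from the standard worst-case bounds (the constants below come from the usual analyses: roughly 1.44*log2(n) for AVL, 2*log2(n+1) for Red-Black):

```python
import math

# Worst-case height bounds:
#   AVL:       h < 1.4405 * log2(n + 2) - 0.3277
#   Red-Black: h <= 2 * log2(n + 1)
for n in [10**3, 10**6, 10**9]:
    avl_bound = 1.4405 * math.log2(n + 2) - 0.3277
    rb_bound = 2 * math.log2(n + 1)
    print(f"n={n:>10}: AVL < {avl_bound:.1f}, Red-Black <= {rb_bound:.1f}")
```

For a billion keys, the AVL worst case is roughly 43 levels versus about 60 for a Red-Black tree — both comfortably O(log n), but the AVL tree caps lookup path length noticeably lower.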

Key Takeaways

  • AVL trees maintain balance factor (-1, 0, +1) at every node, guaranteeing O(log n) height.
  • Four rotation cases (LL, RR, LR, RL) restore balance after insertions and deletions.
  • AVL trees are more strictly balanced than Red-Black trees — better for read-heavy workloads.
  • Each rotation is O(1) — only a constant number of pointer changes are needed.
  • In practice, use battle-tested ordered containers (Java TreeMap, C++ std::map, or Python's third-party sortedcontainers library) rather than implementing a balanced BST from scratch.
