Trees are fundamental data structures used to organise data for efficient search, insertion, and deletion. Real-world data sets change continuously; simple binary search trees can become skewed as elements are inserted or deleted, causing poor worst-case performance. A family of trees called height-balanced trees keeps tree height small by performing local adjustments during updates. An important example of a height-balanced binary search tree is the AVL tree.
Threaded binary trees are a different technique where NULL child links are replaced by special links (threads) to support traversals without recursion or an explicit stack. The AVL tree, by contrast, maintains balance by monitoring subtree heights and performing rotations when necessary. This chapter explains AVL trees: definition, balance factor, rotations, insertion and deletion algorithms, representation, complexity, and applications.
An AVL tree is a binary search tree (BST) in which, for every node, the heights of its left and right subtrees differ by at most one. The technique was proposed by G. M. Adelson-Velskii and E. M. Landis in 1962, and the tree is named AVL after their initials.
Formally, let T be a non-empty binary tree with left subtree TL and right subtree TR. Tree T is height-balanced (an AVL tree) if:
- TL and TR are themselves height-balanced, and
- |height(TL) − height(TR)| ≤ 1.
The balance factor of a node is defined as:
balanceFactor(node) = height(node→left) - height(node→right)
The balance factor may take values -1, 0, or +1 for a balanced node. When the balance factor becomes less than -1 or greater than +1 for some node after an update, the subtree rooted at that node is unbalanced and must be rebalanced using rotations.
Consider inserting keys 1, 2, 3, 4, 5, 6, 7 into an ordinary BST in ascending order: the BST degenerates into a single long chain (effectively a linked list), so searches take O(n) time. An AVL tree built from the same insertion sequence performs rotations as it grows and ends up perfectly balanced, with 4 at the root and height 3, so searches take O(log n) time.

Each node in an AVL tree typically stores the key and pointers to left and right children. To support efficient balance updates, nodes also store the height of the subtree rooted at that node (or alternately the balance factor directly).
A typical node structure is a small record holding the key, the two child pointers, and the height of the subtree rooted at the node.

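A minimal sketch of such a node in C, together with the height and balance-factor helpers (the names `AVLNode`, `createNode`, and so on are illustrative, not a fixed API):

```c
#include <stdlib.h>

/* One AVL node: key, child pointers, and the cached height of the
 * subtree rooted here. With the convention height(NULL) = 0, a
 * freshly created leaf has height 1. */
struct AVLNode {
    int key;
    struct AVLNode *left;
    struct AVLNode *right;
    int height;
};

/* Height of a possibly-NULL subtree. */
int height(struct AVLNode *node) {
    return node ? node->height : 0;
}

/* balanceFactor(node) = height(node->left) - height(node->right) */
int balanceFactor(struct AVLNode *node) {
    return height(node->left) - height(node->right);
}

struct AVLNode *createNode(int key) {
    struct AVLNode *n = malloc(sizeof *n);
    n->key = key;
    n->left = n->right = NULL;
    n->height = 1;   /* a new node starts as a leaf */
    return n;
}
```

Caching the height in the node makes the balance factor an O(1) computation; the alternative of storing the balance factor directly saves a little space but complicates the update logic.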
The height of a NULL pointer is taken as 0 (or -1 in some conventions). Using the convention height(NULL) = 0, a leaf has height 1 and the height of a node is:
height(node) = 1 + max(height(node→left), height(node→right))
After any insertion or deletion, update heights for nodes on the path from the changed node up to the root and compute the balance factor for each node as:
balanceFactor(node) = height(node→left) - height(node→right)
If balanceFactor(node) ∈ {-1, 0, +1} the node is balanced. If it is outside this range, the node is unbalanced and requires rotations.
Rotations are local operations that restore the AVL property for an unbalanced subtree. There are four cases, classified by the direction of the imbalance:
- LL (left-left): the insertion went into the left subtree of the left child; fixed by a single right rotation at the unbalanced node.
- RR (right-right): the mirror case; fixed by a single left rotation at the unbalanced node.
- LR (left-right): the insertion went into the right subtree of the left child; fixed by a left rotation at the left child followed by a right rotation at the unbalanced node.
- RL (right-left): the mirror of LR; fixed by a right rotation at the right child followed by a left rotation at the unbalanced node.
Rotation details (single right rotation described in words): let y be the unbalanced node and x its left child. Make x the root of the subtree, move x's right subtree so that it becomes y's new left subtree, and attach y as x's right child. Finally, recompute the heights of y and then x; no other node's height changes.
All rotations preserve the binary search tree order and only rearrange pointers locally. Heights of only a few nodes change, so rotations are efficient O(1) operations.
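The two single rotations can be sketched in C as follows (a sketch assuming each node caches its subtree height; all names are illustrative):

```c
#include <stdlib.h>

struct AVLNode {
    int key;
    struct AVLNode *left, *right;
    int height;   /* height of the subtree rooted at this node */
};

int height(struct AVLNode *n) { return n ? n->height : 0; }
int max(int a, int b) { return a > b ? a : b; }

void updateHeight(struct AVLNode *n) {
    n->height = 1 + max(height(n->left), height(n->right));
}

struct AVLNode *createNode(int key) {
    struct AVLNode *n = malloc(sizeof *n);
    n->key = key; n->left = n->right = NULL; n->height = 1;
    return n;
}

/* Single right rotation at y (LL case): x = y->left becomes the
 * subtree root, x's right subtree T2 moves under y, and y becomes
 * x's right child. BST order is preserved because every key in T2
 * lies between x->key and y->key. */
struct AVLNode *rotateRight(struct AVLNode *y) {
    struct AVLNode *x = y->left;
    struct AVLNode *T2 = x->right;
    x->right = y;
    y->left = T2;
    updateHeight(y);   /* y first: it now sits below x */
    updateHeight(x);
    return x;          /* new root of this subtree */
}

/* Single left rotation at x (RR case): the mirror image. */
struct AVLNode *rotateLeft(struct AVLNode *x) {
    struct AVLNode *y = x->right;
    struct AVLNode *T2 = y->left;
    y->left = x;
    x->right = T2;
    updateHeight(x);
    updateHeight(y);
    return y;
}
```

The double-rotation cases (LR and RL) are compositions of these two: a rotation at the child brings the subtree into LL or RR shape, and a second rotation at the node finishes the job.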
Insertion in an AVL tree proceeds as in a BST, followed by rebalancing on the path from the new node back to the root: update each ancestor's height, compute its balance factor, and apply the appropriate rotation (LL, RR, LR, or RL) at the first unbalanced node. For insertion, a single rotation (or double rotation) at that lowest unbalanced node restores balance for the whole tree.
Example (conceptual): inserting keys 1, 2, 3 into an empty AVL tree.
Insert 1: becomes root.
Insert 2: goes to the right of 1; heights updated; balance factors remain within limits.
Insert 3: goes to the right of 2; node 1 has balance factor -2 (right-heavy) with right child having balance factor -1 ⇒ RR case ⇒ perform a left rotation at node 1. After rotation, 2 becomes root of subtree with 1 as left child and 3 as right child.
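Putting the pieces together, the full insertion routine can be sketched in C (a self-contained sketch; names are illustrative, and duplicate keys are ignored):

```c
#include <stdlib.h>

struct AVLNode {
    int key;
    struct AVLNode *left, *right;
    int height;
};

int height(struct AVLNode *n) { return n ? n->height : 0; }
int max(int a, int b) { return a > b ? a : b; }
void updateHeight(struct AVLNode *n) {
    n->height = 1 + max(height(n->left), height(n->right));
}
int balanceFactor(struct AVLNode *n) {
    return height(n->left) - height(n->right);
}

struct AVLNode *createNode(int key) {
    struct AVLNode *n = malloc(sizeof *n);
    n->key = key; n->left = n->right = NULL; n->height = 1;
    return n;
}

struct AVLNode *rotateRight(struct AVLNode *y) {
    struct AVLNode *x = y->left, *T2 = x->right;
    x->right = y; y->left = T2;
    updateHeight(y); updateHeight(x);
    return x;
}

struct AVLNode *rotateLeft(struct AVLNode *x) {
    struct AVLNode *y = x->right, *T2 = y->left;
    y->left = x; x->right = T2;
    updateHeight(x); updateHeight(y);
    return y;
}

/* BST insert plus rebalancing on the way back up the recursion. */
struct AVLNode *insert(struct AVLNode *node, int key) {
    if (node == NULL) return createNode(key);
    if (key < node->key)
        node->left = insert(node->left, key);
    else if (key > node->key)
        node->right = insert(node->right, key);
    else
        return node;                            /* duplicate: ignore */

    updateHeight(node);
    int bf = balanceFactor(node);

    if (bf > 1 && key < node->left->key)        /* LL */
        return rotateRight(node);
    if (bf < -1 && key > node->right->key)      /* RR */
        return rotateLeft(node);
    if (bf > 1 && key > node->left->key) {      /* LR */
        node->left = rotateLeft(node->left);
        return rotateRight(node);
    }
    if (bf < -1 && key < node->right->key) {    /* RL */
        node->right = rotateRight(node->right);
        return rotateLeft(node);
    }
    return node;
}
```

Inserting 1 through 7 in ascending order with this routine yields a perfectly balanced tree of height 3 with 4 at the root.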
Deletion in an AVL tree follows BST deletion semantics, then rebalances the tree on the path from the deletion point upward.
Note: after a deletion, a subtree's height can decrease, so, unlike insertion (where one rotation step at the lowest unbalanced node suffices), several ancestors may become unbalanced in turn and up to O(log n) rotations may be needed on the path to the root.
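Deletion can be sketched along the same lines (a self-contained sketch; note that the rebalancing tests use the children's balance factors rather than the deleted key, because the key is no longer available to steer the case analysis):

```c
#include <stdlib.h>

struct AVLNode {
    int key;
    struct AVLNode *left, *right;
    int height;
};

int height(struct AVLNode *n) { return n ? n->height : 0; }
int max(int a, int b) { return a > b ? a : b; }
void updateHeight(struct AVLNode *n) {
    n->height = 1 + max(height(n->left), height(n->right));
}
int balanceFactor(struct AVLNode *n) {
    return height(n->left) - height(n->right);
}

struct AVLNode *createNode(int key) {
    struct AVLNode *n = malloc(sizeof *n);
    n->key = key; n->left = n->right = NULL; n->height = 1;
    return n;
}

struct AVLNode *rotateRight(struct AVLNode *y) {
    struct AVLNode *x = y->left, *T2 = x->right;
    x->right = y; y->left = T2;
    updateHeight(y); updateHeight(x);
    return x;
}

struct AVLNode *rotateLeft(struct AVLNode *x) {
    struct AVLNode *y = x->right, *T2 = y->left;
    y->left = x; x->right = T2;
    updateHeight(x); updateHeight(y);
    return y;
}

/* Smallest key in a non-empty subtree: the left-most node. */
struct AVLNode *minNode(struct AVLNode *n) {
    while (n->left) n = n->left;
    return n;
}

/* BST deletion followed by rebalancing on the way back up. */
struct AVLNode *deleteNode(struct AVLNode *root, int key) {
    if (root == NULL) return NULL;
    if (key < root->key)
        root->left = deleteNode(root->left, key);
    else if (key > root->key)
        root->right = deleteNode(root->right, key);
    else {
        if (root->left == NULL || root->right == NULL) {
            /* zero or one child: splice the node out */
            struct AVLNode *child = root->left ? root->left : root->right;
            free(root);
            return child;
        }
        /* two children: copy the in-order successor's key here,
         * then delete the successor from the right subtree */
        struct AVLNode *succ = minNode(root->right);
        root->key = succ->key;
        root->right = deleteNode(root->right, succ->key);
    }

    updateHeight(root);
    int bf = balanceFactor(root);
    if (bf > 1 && balanceFactor(root->left) >= 0)      /* LL */
        return rotateRight(root);
    if (bf > 1) {                                      /* LR */
        root->left = rotateLeft(root->left);
        return rotateRight(root);
    }
    if (bf < -1 && balanceFactor(root->right) <= 0)    /* RR */
        return rotateLeft(root);
    if (bf < -1) {                                     /* RL */
        root->right = rotateRight(root->right);
        return rotateLeft(root);
    }
    return root;
}
```

Because each recursive return rebalances only the current node, the corrections propagate upward one level at a time, matching the note above.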
An AVL tree is a self-balancing binary search tree that enforces the invariant that heights of the left and right subtrees of every node differ by at most one. Balance is measured by the balance factor, and local rotations (LL, RR, LR, RL) restore balance after insertions and deletions. AVL trees guarantee O(log n) worst-case time for search, insert and delete, making them suitable for applications requiring efficient and predictable lookups.
Review questions:
1. What is an AVL tree?
2. What are the advantages of AVL trees?
3. How are the height and balance factor calculated in an AVL tree?
4. What are the types of rotations used in rebalancing an AVL tree?
5. What are the time and space complexities of AVL tree operations?