AVL Tree in Data Structures (C++)
The AVL tree is a self-balancing binary search tree, also widely known as a height-balanced binary tree. It was invented by Georgy Adelson-Velsky and Evgenii Landis in 1962, and, as you can see, the tree is named after its inventors. An AVL tree is said to be well-balanced if the balance factor of every node lies between -1 and +1.
Why do we use AVL trees?
We are already familiar with the complexities of the binary search tree. In a binary search tree, operations such as searching, insertion, and deletion have O(h) complexity, where O is the big O notation and h is the height of the tree. For a skewed binary tree, the cost of these operations can degrade to O(n), so to avoid that we must make sure the height of the tree stays O(log n) after every insertion or deletion.
In that case, we get an upper bound of O(log n) for all of these operations. An important point to keep in mind is that the height of an AVL tree is always O(log n), where n is the total number of nodes present in the tree.
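In fact, it can be shown that the height h of an AVL tree with n nodes is bounded by roughly 1.44 log2(n), which is why every search, insertion, and deletion touches only O(log n) nodes.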
Balance Factor of an AVL tree
While studying the AVL tree, we often come across the term balance factor. The balance factor of a node is simply the difference between the height of its left subtree and the height of its right subtree.
The formula for the balance factor can be written as: -
Balance factor = (height of the left subtree – height of the right subtree).
The self-balancing property of the AVL tree discussed above is maintained through the balance factor: the balance factor of every node must always lie between -1 and +1.
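For example, if a node's left subtree has height 2 and its right subtree has height 1, its balance factor is 2 - 1 = +1 and the node is still balanced; a balance factor of +2 or -2 would violate the AVL property and require a rotation.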
Rotations
We know that while performing operations on the tree, it can become imbalanced. To fix that, we perform rotations that restore the balance of the tree. There are four kinds of rotation (a worked example follows the list):
- RR rotation
When we insert a node into the right subtree of the right child, we get the right-right case; it is fixed by a single left rotation.
- LL rotation
When we insert a node into the left subtree of the left child, we get the left-left case; it is fixed by a single right rotation.
- RL rotation
When we insert a node into the left subtree of the right child, we get the right-left case; it is fixed by a right rotation on the right child followed by a left rotation.
- LR rotation
When we insert a node into the right subtree of the left child, we get the left-right case; it is fixed by a left rotation on the left child followed by a right rotation.
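As a small illustration, suppose the keys 30, 20, and 10 are inserted into an empty tree in that order. After 10 is inserted, node 30 has a balance factor of +2 and the new key lies in the left subtree of its left child, which is the LL case; a single right rotation around 30 makes 20 the root with 10 and 30 as its children, restoring the balance.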
Implementation
In this section, we implement the AVL tree and its operations in C++ and observe the output.
// AVL tree implementation in C++
#include <iostream>
#include <string>
using namespace std;
class Node {
public:
  int key;
  Node *left;
  Node *right;
  int height;
};
int maximum(int x, int y);
// Return the height of a node (0 for a null pointer)
int height(Node *N) {
  if (N == NULL)
    return 0;
  return N->height;
}
int maximum(int x, int y) {
  return (x > y) ? x : y;
}
// Create and initialise a new node
Node *newnode(int key) {
  Node *node = new Node();
  node->key = key;
  node->left = NULL;
  node->right = NULL;
  node->height = 1;  // a new node is always inserted as a leaf
  return (node);
}
// Rotate the subtree rooted at y to the right
Node *rightRotate(Node *y) {
  Node *x = y->left;
  Node *T2 = x->right;
  x->right = y;
  y->left = T2;
  y->height = maximum(height(y->left), height(y->right)) + 1;
  x->height = maximum(height(x->left), height(x->right)) + 1;
  return x;
}
// Rotate the subtree rooted at x to the left
Node *leftRotate(Node *x) {
  Node *y = x->right;
  Node *T2 = y->left;
  y->left = x;
  x->right = T2;
  x->height = maximum(height(x->left), height(x->right)) + 1;
  y->height = maximum(height(y->left), height(y->right)) + 1;
  return y;
}
// Calculate the balance factor of a node
int getBalanceFactor(Node *N) {
  if (N == NULL)
    return 0;
  return height(N->left) - height(N->right);
}
// Insert a key into the subtree rooted at node and rebalance
Node *insertnode(Node *node, int key) {
  // Find the correct position and insert the new node
  if (node == NULL)
    return (newnode(key));
  if (key < node->key)
    node->left = insertnode(node->left, key);
  else if (key > node->key)
    node->right = insertnode(node->right, key);
  else
    return node;  // duplicate keys are not inserted
  // Update the height of this ancestor node and rebalance if required
  node->height = 1 + maximum(height(node->left), height(node->right));
  int balanceFactor = getBalanceFactor(node);
  if (balanceFactor > 1) {
    if (key < node->left->key) {
      // Left-Left case: single right rotation
      return rightRotate(node);
    } else if (key > node->left->key) {
      // Left-Right case: left rotation on the left child, then right rotation
      node->left = leftRotate(node->left);
      return rightRotate(node);
    }
  }
  if (balanceFactor < -1) {
    if (key > node->right->key) {
      // Right-Right case: single left rotation
      return leftRotate(node);
    } else if (key < node->right->key) {
      // Right-Left case: right rotation on the right child, then left rotation
      node->right = rightRotate(node->right);
      return leftRotate(node);
    }
  }
  return node;
}
// Find the node with the minimum key (the leftmost node)
Node *nodeWithMinimumValue(Node *node) {
  Node *current = node;
  while (current->left != NULL)
    current = current->left;
  return current;
}
// Delete the node with the given key and rebalance
Node *deletenode(Node *root, int key) {
  if (root == NULL)
    return root;
  if (key < root->key)
    root->left = deletenode(root->left, key);
  else if (key > root->key)
    root->right = deletenode(root->right, key);
  else {
    // Node with at most one child
    if ((root->left == NULL) || (root->right == NULL)) {
      Node *temp = root->left ? root->left : root->right;
      if (temp == NULL) {
        temp = root;
        root = NULL;
      } else
        *root = *temp;
      delete temp;
    } else {
      // Node with two children: replace with the inorder successor
      Node *temp = nodeWithMinimumValue(root->right);
      root->key = temp->key;
      root->right = deletenode(root->right, temp->key);
    }
  }
  if (root == NULL)
    return root;
  // Update the height of the current node and rebalance
  root->height = 1 + maximum(height(root->left), height(root->right));
  int balanceFactor = getBalanceFactor(root);
  if (balanceFactor > 1) {
    if (getBalanceFactor(root->left) >= 0) {
      return rightRotate(root);
    } else {
      root->left = leftRotate(root->left);
      return rightRotate(root);
    }
  }
  if (balanceFactor < -1) {
    if (getBalanceFactor(root->right) <= 0) {
      return leftRotate(root);
    } else {
      root->right = rightRotate(root->right);
      return leftRotate(root);
    }
  }
  return root;
}
// Final step is to print the constructed tree
void printTree(Node *root, string indent, bool last) {
  if (root != NULL) {
    cout << indent;
    if (last) {
      cout << "R----";
      indent += "   ";
    } else {
      cout << "L----";
      indent += "|  ";
    }
    cout << root->key << endl;
    printTree(root->left, indent, false);
    printTree(root->right, indent, true);
  }
}
int main() {
  Node *root = NULL;
  root = insertnode(root, 33);
  root = insertnode(root, 13);
  root = insertnode(root, 53);
  root = insertnode(root, 9);
  root = insertnode(root, 21);
  root = insertnode(root, 61);
  root = insertnode(root, 8);
  root = insertnode(root, 11);
  printTree(root, "", true);
  root = deletenode(root, 13);
  cout << "After deleting 13" << endl;
  printTree(root, "", true);
  return 0;
}
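To try the listing, assuming it is saved in a file such as avl.cpp (the file name here is only an illustration), it can be compiled and run with any standard C++ compiler, for example:
g++ avl.cpp -o avl && ./avl
The program prints the tree once after the eight insertions and again after the key 13 has been deleted.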
Output:
Complexity
In this part of the discussion, we examine the complexity of the various operations on the AVL tree, summarised in the table below:
| Case | Time complexity |
| --- | --- |
| Best case | O(log n) |
| Average case | O(log n) |
| Worst case | O(log n) |
1. BEST CASE
In the AVL tree, the best case occurs when no balancing or rotation is required, that is, when we can search for or insert a node without having to modify the structure of the tree at all. The complexity of the AVL tree in the best case is O(log n).
2. AVERAGE CASE
The average case is obtained by averaging over all possible inputs for a tree with a given number of elements. If there are n elements present in the tree, the average-case complexity is also O(log n).
3. WORST CASE
The worst case occurs when an insertion or deletion leaves the tree imbalanced and rotations are needed to fix it. Each rotation takes constant time, and at most O(log n) rotations are performed along the search path, so the time complexity of the AVL tree in the worst case is still O(log n).
SPACE COMPLEXITY
Space complexity is generally described as the amount of memory a program uses during execution. The space complexity of the AVL tree is O(n), since each of its n nodes stores a constant amount of data.