AVL Tree in Data Structure (C++)

An AVL tree is a self-balancing binary search tree, also widely known as a height-balanced binary tree. It was invented by G. M. Adelson-Velsky and E. M. Landis in 1962 and, as you can see, the tree is named after its inventors. An AVL tree is said to be well-balanced if the balance factor of every node lies between -1 and +1.

Why do we use AVL trees?

We are already familiar with the complexities of the binary search tree. In a binary search tree, operations such as searching, insertion, and deletion take O(h) time, where h is the height of the tree. For a skewed binary tree, the height can grow to n (the number of nodes), so the cost of these operations can degrade to O(n). To avoid this, we keep the height of the tree at O(log n) by rebalancing it whenever an insertion or deletion disturbs the balance.

This gives us an upper bound of O(log n) for these operations as well. A key point to remember is that the height of an AVL tree is always O(log n), where n is the total number of nodes present in the tree.

Balance Factor of an AVL tree

While studying the AVL tree, we often encounter the term balance factor. The balance factor of a node is the difference between the height of its left subtree and the height of its right subtree.

The formula for the balance factor can be written as:

Balance factor = height of the left subtree - height of the right subtree

The self-balancing property of the AVL tree that we discussed previously is maintained through the balance factor: for every node, this value must lie between -1 and +1.

Rotations

Sometimes an insertion or deletion leaves the tree unbalanced. To fix that, we perform rotations that restore the balance of the tree. There are four kinds of rotation:

  • LL rotation
    When a node is inserted into the left subtree of the left child, a single right rotation rebalances the tree; this is the LL case.
  • RR rotation
    When a node is inserted into the right subtree of the right child, a single left rotation rebalances the tree; this is the RR case.
  • LR rotation
    When a node is inserted into the right subtree of the left child, we first rotate the left child to the left and then rotate the unbalanced node to the right; this is the LR case.
  • RL rotation
    When a node is inserted into the left subtree of the right child, we first rotate the right child to the right and then rotate the unbalanced node to the left; this is the RL case.

Implementation

In this section, we are going to see the implementation of the AVL tree and its operations in the C++ language and observe the output as well.

// AVL tree implementation in C++
#include <iostream>
#include <string>
using namespace std;


class Node {
 public:
  int key;
  Node *left;
  Node *right;
  int height;
};


int maximum(int x, int y) {
  return (x > y) ? x : y;
}


// we will calculate the height now
int height(Node *N) {
  if (N == NULL)
    return 0;
  return N->height;
}


// We will create a new node now
Node *newNode(int key) {
  Node *node = new Node();
  node->key = key;
  node->left = NULL;
  node->right = NULL;
  node->height = 1;
  return node;
}


// we will now rotate the subtree to the right
Node *rightRotate(Node *y) {
  Node *x = y->left;
  Node *T2 = x->right;
  x->right = y;
  y->left = T2;
  y->height = maximum(height(y->left), height(y->right)) + 1;
  x->height = maximum(height(x->left), height(x->right)) + 1;
  return x;
}


// we will now rotate the subtree to the left
Node *leftRotate(Node *x) {
  Node *y = x->right;
  Node *T2 = y->left;
  y->left = x;
  x->right = T2;
  x->height = maximum(height(x->left), height(x->right)) + 1;
  y->height = maximum(height(y->left), height(y->right)) + 1;
  return y;
}


// Next we will be calculating the balance factor of each node
int getBalanceFactor(Node *N) {
  if (N == NULL)
    return 0;
  return height(N->left) - height(N->right);
}


// We will be inserting a node
Node *insertNode(Node *node, int key) {
  // Search for the correct position and insert the node
  if (node == NULL)
    return newNode(key);
  if (key < node->key)
    node->left = insertNode(node->left, key);
  else if (key > node->key)
    node->right = insertNode(node->right, key);
  else
    return node;

  // Update the height of this node and rebalance if needed
  node->height = 1 + maximum(height(node->left), height(node->right));
  int balanceFactor = getBalanceFactor(node);
  if (balanceFactor > 1) {
    if (key < node->left->key) {
      return rightRotate(node);  // LL case
    } else if (key > node->left->key) {
      node->left = leftRotate(node->left);  // LR case
      return rightRotate(node);
    }
  }
  if (balanceFactor < -1) {
    if (key > node->right->key) {
      return leftRotate(node);  // RR case
    } else if (key < node->right->key) {
      node->right = rightRotate(node->right);  // RL case
      return leftRotate(node);
    }
  }
  return node;
}


// finding the node with the smallest key (leftmost node)
Node *nodeWithMinimumValue(Node *node) {
  Node *current = node;
  while (current->left != NULL)
    current = current->left;
  return current;
}


// removing a specific node
Node *deleteNode(Node *root, int key) {
  if (root == NULL)
    return root;
  if (key < root->key)
    root->left = deleteNode(root->left, key);
  else if (key > root->key)
    root->right = deleteNode(root->right, key);
  else {
    if ((root->left == NULL) || (root->right == NULL)) {
      Node *temp = root->left ? root->left : root->right;
      if (temp == NULL) {
        temp = root;
        root = NULL;
      } else
        *root = *temp;
      delete temp;
    } else {
      // Two children: replace the key with the in-order successor
      Node *temp = nodeWithMinimumValue(root->right);
      root->key = temp->key;
      root->right = deleteNode(root->right, temp->key);
    }
  }

  if (root == NULL)
    return root;

  // Update the height of this node and rebalance if needed
  root->height = 1 + maximum(height(root->left), height(root->right));
  int balanceFactor = getBalanceFactor(root);
  if (balanceFactor > 1) {
    if (getBalanceFactor(root->left) >= 0) {
      return rightRotate(root);
    } else {
      root->left = leftRotate(root->left);
      return rightRotate(root);
    }
  }
  if (balanceFactor < -1) {
    if (getBalanceFactor(root->right) <= 0) {
      return leftRotate(root);
    } else {
      root->right = rightRotate(root->right);
      return leftRotate(root);
    }
  }
  return root;
}


// Final step is to print the constructed tree
void printTree(Node *root, string indent, bool last) {
  if (root != NULL) {
    cout << indent;
    if (last) {
      cout << "R----";
      indent += "   ";
    } else {
      cout << "L----";
      indent += "|  ";
    }
    cout << root->key << endl;
    printTree(root->left, indent, false);
    printTree(root->right, indent, true);
  }
}


int main() {
  Node *root = NULL;
  root = insertNode(root, 33);
  root = insertNode(root, 13);
  root = insertNode(root, 53);
  root = insertNode(root, 9);
  root = insertNode(root, 21);
  root = insertNode(root, 61);
  root = insertNode(root, 8);
  root = insertNode(root, 11);
  printTree(root, "", true);
  root = deleteNode(root, 13);
  cout << "After deleting 13:" << endl;
  printTree(root, "", true);
  return 0;
}

Output:

R----33
   L----13
   |  L----9
   |  |  L----8
   |  |  R----11
   |  R----21
   R----53
      R----61
After deleting 13:
R----33
   L----9
   |  L----8
   |  R----21
   |     L----11
   R----53
      R----61

Complexity

In this part of the discussion, we are going to look at the complexity of the various operations on the AVL tree, summarized in the table below:

Case            Time complexity
Best case       O(log n)
Average case    O(log n)
Worst case      O(log n)

1. BEST CASE

In the AVL tree, the best case occurs when no rebalancing or rotation is required: the node we want can be inserted or found without making any changes to the structure of the tree. The complexity of the AVL tree in the best case is O(log n).

2. AVERAGE CASE

The average case is obtained by taking the mean over all possible cases for a given number of elements. If there are n elements present in the tree, the complexity in this case is also O(log n).

3. WORST CASE

The worst case occurs when an insertion or deletion leaves the tree unbalanced and rotations are needed to restore the balance. Even then, each rotation takes constant time and the number of rebalancing steps is bounded by the height of the tree, so the time complexity of the AVL tree in the worst case is still O(log n).

SPACE COMPLEXITY

Space complexity is generally described as the amount of memory a program requires during execution. The space complexity of the AVL tree is O(n), since the tree stores one node per element.


