How to Calculate Space Complexity in Data Structures?
Space complexity in data structures refers to the amount of memory an algorithm needs to run to completion, including the memory occupied by the data structure itself, by variables, and by any temporary storage used during computation. The goal of space complexity analysis is to determine how efficiently an algorithm uses memory and to identify opportunities for optimization. The space complexity of an algorithm can be expressed as a function of the input size or in absolute memory units.
Here is how to calculate space complexity:
- First, determine the amount of memory used by the data structure. For example, if you are using an array, this would be the memory used to store the array elements.
- After that, determine the memory used by any variables. For example, if you are using a variable to store the length of the array, you will need to add the memory used by that variable.
- Then, determine any temporary memory used during computation. For example, if you are sorting an array, you may need to use additional memory to store temporary values during the sorting process.
- After that, add up the memory used by the data structure, variables, and temporary memory to get the total space complexity of the algorithm.
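The steps above can be sketched in C for a simple algorithm that copies an array in reverse (`reversed_copy` is a hypothetical helper used only for illustration):

```c
#include <stdlib.h>

/* Reverse an array into a freshly allocated copy.
   - input array arr:  O(n)  (the data structure itself)
   - loop counter i:   O(1)  (a scalar variable)
   - temporary copy:   O(n)  (extra memory used during computation)
   Total: O(n) + O(1) + O(n) = O(n). */
int *reversed_copy(const int *arr, size_t n) {
    int *tmp = malloc(n * sizeof *tmp);  /* temporary memory: O(n) */
    if (tmp == NULL) return NULL;
    for (size_t i = 0; i < n; i++)       /* i alone: O(1) */
        tmp[i] = arr[n - 1 - i];
    return tmp;
}
```

Adding the three contributions as in the steps above gives the function's total space complexity, O(n).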
Some methods for calculating space complexity:
- Counting Primitive Variables: The space used by primitive data types such as integers, floats, and Booleans can be counted as O(1) as they take a constant amount of space.
- Counting Non-Primitive Variables: The space used by non-primitive data types such as arrays, linked lists, trees, and graphs can be calculated based on their size. For example, an array of size n takes O(n) space, while a linked list node takes O(1) space.
- Counting Recursive Function Calls: Recursive function calls can take a significant amount of space. Each active call keeps its own stack frame of size O(k), where k is the number of variables the function holds. The total space is then O(k * d), where d is the maximum recursion depth (the largest number of calls active on the stack at once), not the total number of calls made.
- Dynamic Memory Allocation: Dynamic memory allocation can also be a source of space complexity. When using functions such as malloc and calloc, the space used by the allocation should be counted.
- Auxiliary Space: The space used by auxiliary data structures, such as stacks and queues, should also be counted when calculating space complexity.
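The recursion rule above can be illustrated by comparing a recursive and an iterative sum (the function names below are illustrative, not from the original):

```c
#include <stddef.h>

/* Recursive sum: each active call keeps its own stack frame with a
   constant k variables, and up to n calls are nested at once, so the
   call stack grows as O(k * n) = O(n). */
long sum_recursive(const int *arr, size_t n) {
    if (n == 0) return 0;
    return arr[n - 1] + sum_recursive(arr, n - 1);
}

/* Iterative sum: a single frame with a fixed number of scalars,
   regardless of n -> O(1) auxiliary space. */
long sum_iterative(const int *arr, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += arr[i];
    return total;
}
```

Both functions compute the same result, but the recursive version pays O(n) in stack space while the iterative one pays O(1).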
To calculate space complexity, add up the memory the algorithm holds at its peak: space complexity is determined by the maximum amount of memory in use at any one moment, not by the sum over the whole run.
The formula for space complexity is usually expressed as:
S = T + M
Where S is the space complexity, T is the amount of memory used by the data structure, and M is the amount of memory used by any other variables and auxiliary memory.
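Plugging concrete byte counts into S = T + M, a sketch (the function name and the particular bookkeeping variables are illustrative assumptions):

```c
#include <stddef.h>

/* S = T + M measured in bytes, for an algorithm that keeps an array
   of n ints (T, the data structure) plus two scalar bookkeeping
   variables, e.g. a length and a loop counter (M). */
size_t space_for_array_algo(size_t n) {
    size_t T = n * sizeof(int);               /* data structure */
    size_t M = sizeof(size_t) + sizeof(int);  /* other variables */
    return T + M;
}
```

On a platform with 4-byte ints and 8-byte size_t, `space_for_array_algo(1000)` comes to 4000 + 12 = 4012 bytes; as a function of n this is O(n).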
Note: the space complexity of an algorithm depends on the specific implementation and may change with the size of the input data.
Here is an example of how we can calculate the space complexity of an algorithm in the C language:
#include <stdio.h>

void func(int n) {
    int a[n];  // array with n elements, takes O(n) space
    int b = n; // variable b takes O(1) space
}

int main() {
    int n = 10;
    func(n);
    return 0;
}
In this example, the space complexity of the func function is O(n), as it depends on the size of the input n. The “a” array takes O(n) space, and the “b” variable takes O(1) space.
So, the total space complexity of the algorithm is O(n) + O(1) = O(n).
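For contrast, an algorithm that rearranges its input in place needs only O(1) auxiliary space. A minimal sketch (the function name is illustrative):

```c
#include <stddef.h>

/* In-place reversal: only two indices and one temporary scalar are
   used no matter how large n is, so auxiliary space is O(1). */
void reverse_in_place(int *arr, size_t n) {
    for (size_t i = 0, j = (n ? n - 1 : 0); i < j; i++, j--) {
        int t = arr[i];
        arr[i] = arr[j];
        arr[j] = t;
    }
}
```

The input array itself still occupies O(n), but the algorithm adds only a constant amount on top of it.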
Space complexity cheat sheet for algorithms
Arrays and Strings:
- Allocating an array of size n: O(n)
- Accessing an array element: O(1)
- String concatenation: O(n + m), where n and m are the lengths of the two strings, since a new buffer large enough to hold both must be allocated.
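The concatenation entry above can be made concrete with a small sketch (`concat` is a hypothetical helper, with minimal error handling):

```c
#include <stdlib.h>
#include <string.h>

/* Concatenation must allocate room for both strings plus the
   terminating '\0', so the new buffer costs O(n + m) space. */
char *concat(const char *a, const char *b) {
    size_t n = strlen(a), m = strlen(b);
    char *out = malloc(n + m + 1);  /* O(n + m) */
    if (out == NULL) return NULL;
    memcpy(out, a, n);
    memcpy(out + n, b, m + 1);      /* copies b and its '\0' */
    return out;
}
```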
Linked List:
- Creating a linked list node: O(1)
- Traversing a linked list: O(1) auxiliary space when done iteratively (the list itself occupies O(n)).
- Inserting or deleting a node: O(1) auxiliary space regardless of position, since only a single node and a few pointers are allocated or freed; the head/tail distinction affects time, not space.
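A sketch of the O(1)-per-node rule, assuming a minimal singly linked node struct (the names are illustrative):

```c
#include <stdlib.h>

struct node {
    int value;
    struct node *next;
};

/* Each insertion allocates exactly one node, an O(1) cost no matter
   where it lands; a list of n nodes therefore occupies O(n) total. */
struct node *push_front(struct node *head, int value) {
    struct node *n = malloc(sizeof *n);  /* O(1) per insertion */
    if (n == NULL) return head;
    n->value = value;
    n->next = head;
    return n;
}
```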
Stacks and Queues:
- Pushing or popping from a stack: O(1)
- Enqueue or dequeue from a queue: O(1)
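A minimal array-backed stack shows why each push or pop costs O(1) space (the struct layout and fixed capacity are illustrative assumptions):

```c
#include <stddef.h>

#define STACK_CAP 128

/* Array-backed stack: push and pop only move an index, so each
   operation uses O(1) extra space; the stack as a whole holds up to
   O(n) elements. */
struct stack {
    int data[STACK_CAP];
    size_t top;
};

int push(struct stack *s, int v) {
    if (s->top == STACK_CAP) return 0;  /* full */
    s->data[s->top++] = v;              /* O(1): no new allocation */
    return 1;
}

int pop(struct stack *s, int *out) {
    if (s->top == 0) return 0;          /* empty */
    *out = s->data[--s->top];           /* O(1) */
    return 1;
}
```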
Hash Tables:
- Lookup in a hash table: O(1) auxiliary space (the O(1) average / O(n) worst-case figures usually quoted describe time, not space).
- Inserting or deleting in a hash table: O(1) auxiliary space per operation; the table itself occupies O(n) for n stored keys.
Trees:
- Traversing a tree recursively: O(h) auxiliary space for the call stack, where h is the tree height: O(log n) for a balanced tree, O(n) for a degenerate one.
- Searching for a node in a tree: O(1) auxiliary space iteratively, O(h) if implemented recursively.
- Inserting or deleting a node in a tree: O(1) auxiliary space iteratively, O(h) if implemented recursively.
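The recursive-traversal figure can be seen in a sketch, assuming a minimal binary-tree node (the struct and function names are illustrative):

```c
#include <stddef.h>

struct tnode {
    int value;
    struct tnode *left, *right;
};

/* Recursive traversal keeps one stack frame per level currently being
   visited, so auxiliary space is proportional to the tree height h:
   about O(log n) frames for a balanced tree, up to O(n) for a
   degenerate (linked-list-shaped) one. */
long sum_tree(const struct tnode *root) {
    if (root == NULL) return 0;
    return root->value + sum_tree(root->left) + sum_tree(root->right);
}
```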
Graphs:
- Representing a graph: O(n + m) where n is the number of nodes and m is the number of edges.
- Traversing a graph: O(n + m) where n is the number of nodes and m is the number of edges.
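An adjacency-list representation makes the O(n + m) bound concrete (a sketch; `graph_new` and `add_edge` are hypothetical helpers with minimal error handling):

```c
#include <stdlib.h>

/* Adjacency-list graph: one list head per node (O(n)) plus one edge
   record per edge (O(m)), for O(n + m) space overall. */
struct edge {
    int to;
    struct edge *next;
};

struct graph {
    int n;
    struct edge **adj;  /* n list heads */
};

struct graph *graph_new(int n) {
    struct graph *g = malloc(sizeof *g);
    if (g == NULL) return NULL;
    g->n = n;
    g->adj = calloc((size_t)n, sizeof *g->adj);  /* O(n) */
    return g;
}

void add_edge(struct graph *g, int from, int to) {
    struct edge *e = malloc(sizeof *e);  /* O(1) per edge -> O(m) total */
    if (e == NULL) return;
    e->to = to;
    e->next = g->adj[from];
    g->adj[from] = e;
}
```

New edges are pushed onto the front of the source node's list, so each insertion is O(1) and the whole structure stays at O(n + m).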