Space Complexity Basics — Understanding Memory Usage
Learn how to analyze space complexity: auxiliary space, in-place algorithms, recursion stack space, and the time-space tradeoff. Includes examples for common algorithms.
Detailed Explanation
What Is Space Complexity?
Space complexity measures how much additional memory an algorithm uses as a function of input size. It includes:
- Auxiliary space — extra space used beyond the input (temporary arrays, hash tables, etc.)
- Recursion stack space — memory used by recursive call frames
The input itself is usually not counted: space complexity conventionally measures the extra memory an algorithm allocates, which is why an in-place algorithm that modifies its input still counts as O(1).
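To make the distinction concrete, here is a minimal sketch (the function names are illustrative, not standard): two ways to reverse a list, one with O(1) auxiliary space and one with O(n).

```python
def reverse_in_place(arr):
    """O(1) auxiliary space: only two index variables, no extra array."""
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr

def reversed_copy(arr):
    """O(n) auxiliary space: builds a new list as large as the input."""
    return [arr[i] for i in range(len(arr) - 1, -1, -1)]
```

Both run in O(n) time; only the memory footprint differs.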
Space Complexity of Common Algorithms
| Algorithm | Time | Space | Notes |
|---|---|---|---|
| Binary search | O(log n) | O(1) | Iterative version; recursive uses O(log n) stack |
| Merge sort | O(n log n) | O(n) | Needs auxiliary array for merging |
| Quick sort | O(n log n) avg | O(log n) avg | Recursion stack; O(n) in the worst case |
| Heap sort | O(n log n) | O(1) | In-place sorting |
| Counting sort | O(n + k) | O(k) | Needs count array of size k (range of values) |
| DFS (graph) | O(V + E) | O(V) | Visited set + recursion stack |
| BFS (graph) | O(V + E) | O(V) | Queue can hold up to V vertices |
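The O(V) bound for BFS in the table above comes from its two data structures: the visited set and the queue, each of which can hold at most V vertices. A minimal sketch over an adjacency-list graph:

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first search over an adjacency list (dict of node -> neighbors).
    Auxiliary space is O(V): the visited set and queue together hold at
    most V vertices, regardless of the number of edges."""
    visited = {start}
    queue = deque([start])
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in adj[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order
```

Time is O(V + E) because each vertex is enqueued once and each edge is scanned once.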
In-Place Algorithms
An in-place algorithm uses O(1) auxiliary space (not counting the input). Examples:
- Heap sort
- Selection sort
- Insertion sort
- Quick sort (O(log n) for recursion, but no auxiliary array)
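Insertion sort from the list above is a clean example of the in-place pattern, a sketch:

```python
def insertion_sort(arr):
    """In-place sort: O(1) auxiliary space — just `key` and two indices.
    The input array is rearranged; no temporary array is allocated."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one slot right to make room for key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```

Contrast this with merge sort, which must allocate an O(n) auxiliary array to merge sorted halves.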
The Time-Space Tradeoff
Often you can trade memory for speed:
- Hash tables use O(n) space to achieve O(1) average-case lookups (vs. O(n) lookups with O(1) space).
- Memoization uses O(n) space to turn an O(2ⁿ) recursion (e.g., naive Fibonacci) into O(n) time.
- Precomputation tables use space to avoid repeated calculations.
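The memoization tradeoff is easiest to see with Fibonacci, a sketch using the standard library cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized Fibonacci: the cache stores O(n) results, so each value
    is computed once — O(n) time instead of the O(2^n) naive recursion."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Without the cache, `fib(40)` makes over a billion calls; with it, exactly 41 distinct computations.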
Recursion Space
Every recursive call adds a frame to the call stack. A recursive function with depth d uses O(d) stack space:
- Binary search: O(log n) depth
- Quick sort: O(log n) average, O(n) worst case
- Tree DFS: O(h) where h is the tree height — O(log n) for a balanced tree, O(n) for a degenerate (linked-list-shaped) one
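The steps above can be made visible by instrumenting a recursive binary search with a depth counter (the `depth` parameter is illustrative, not part of the usual signature):

```python
def binary_search(arr, target, lo=0, hi=None, depth=1):
    """Recursive binary search returning (index, stack depth reached).
    Each call halves the range and adds one stack frame, so the depth —
    and hence the stack space — is O(log n). Returns index -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1, depth
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid, depth
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi, depth + 1)
    return binary_search(arr, target, lo, mid - 1, depth + 1)
```

For an array of 1,000,000 elements the recursion never goes deeper than about 20 frames; an iterative version avoids even that, which is why the table above lists iterative binary search as O(1) space.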
Use Case
Space complexity analysis is critical for embedded systems, mobile apps, and any environment with limited memory. It also matters for cache performance — algorithms that use less memory often have better cache locality.
Related Topics
- Constant Time O(1) — Operations That Never Slow Down (Complexity Classes)
- Sorting Algorithm Comparison — Time, Space, and Stability (Algorithms)
- Dynamic Programming Complexity — Trading Space for Time (Analysis Concepts)
- Hash Table Complexity — O(1) Average, O(n) Worst Case (Data Structures)
- Amortized Analysis — Average Cost Over a Sequence of Operations (Analysis Concepts)