Quadratic Time O(n²) — When Nested Loops Dominate

Understand O(n²) quadratic time complexity, why nested loops cause it, and how to recognize and avoid it. Covers bubble sort, selection sort, and matrix operations.


Detailed Explanation

What Is O(n²) Quadratic Time?

An algorithm is O(n²) when its runtime grows with the square of the input size. The classic cause is two nested loops, each iterating over n elements:

for i in range(n):
    for j in range(n):
        # O(1) work

Growth Rate

n          n² operations
10         100
100        10,000
1,000      1,000,000
10,000     100,000,000
100,000    10,000,000,000

At n = 100,000, a quadratic algorithm performs 10 billion operations — on the order of seconds in fast compiled code, but potentially minutes or hours in an interpreted language like Python.

Common O(n²) Algorithms

  • Bubble sort: Repeatedly compare adjacent elements and swap them if they are out of order.
  • Selection sort: Find the minimum of the unsorted portion and swap it to the front.
  • Insertion sort: Insert each element into its correct position in the sorted portion.
  • Naive string matching: Check for the pattern at every position in the text.
  • Checking all pairs: Any problem that compares every pair of elements (e.g., finding duplicates without a hash set).
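Bubble sort is the canonical example from the list above. A minimal sketch: the outer loop makes up to n passes, and each pass compares up to n − 1 adjacent pairs, giving O(n²) comparisons in the worst case.

```python
def bubble_sort(arr):
    """Sort arr in place. Worst case O(n^2): n passes over up to n-1 pairs."""
    n = len(arr)
    for i in range(n):
        swapped = False
        # After pass i, the last i elements are already in place.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break  # no swaps means the array is sorted: early exit
    return arr
```

The `swapped` flag lets an already-sorted input finish in a single O(n) pass, but the worst case remains quadratic.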

When O(n²) Is Acceptable

Quadratic algorithms are fine for small inputs (n < 1,000). In fact, insertion sort is often faster than merge sort for small arrays due to lower constant factors and cache friendliness — that’s why Timsort uses insertion sort for small subarrays.
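For reference, a standard insertion sort looks like this. It is O(n²) in the worst case, but only O(n) on nearly-sorted input, which is part of why it works well on the small subarrays mentioned above.

```python
def insertion_sort(arr):
    """Sort arr in place. O(n^2) worst case, O(n) on nearly-sorted input."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # drop key into its correct position
    return arr
```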

How to Avoid O(n²)

  1. Use a hash table to avoid nested lookups: O(n) instead of O(n²).
  2. Sort first, then use binary search or two pointers.
  3. Divide and conquer to reduce to O(n log n).
  4. Dynamic programming to avoid recomputing overlapping subproblems.
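Technique 1 is the most common fix in practice. A sketch of duplicate detection both ways — the quadratic all-pairs version versus a hash-set version whose membership tests are O(1) on average:

```python
def has_duplicate_quadratic(items):
    """Compare every pair: O(n^2)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """Hash set: each lookup is O(1) on average, O(n) total."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Both return the same answer; only the growth rate differs. At n = 100,000 the quadratic version performs roughly 5 billion comparisons, while the hash-set version performs 100,000 lookups.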

Use Case

Understanding quadratic complexity is essential for recognizing performance bottlenecks. In code reviews, spotting an unnecessary O(n²) nested loop and replacing it with an O(n) or O(n log n) alternative is one of the highest-impact optimizations.
