Cubic Time O(n³) — Triple Nested Loops and Matrix Operations

Learn about O(n³) cubic time complexity, where it arises in matrix multiplication and graph algorithms, and strategies for reducing it.

Detailed Explanation

What Is O(n³) Cubic Time?

An algorithm is O(n³) when it performs work proportional to the cube of the input size. This typically arises from three nested loops or from operations on n×n matrices that require per-element computation involving another dimension.
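To see the cubing directly, here is a toy sketch that just counts the iterations of three nested loops (the function name `count_ops` is illustrative, not from any library):

```python
def count_ops(n):
    """Count iterations of three nested loops over range(n) — grows as n**3."""
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                ops += 1
    return ops

# count_ops(10) → 1000, count_ops(100) → 1000000
```

Doubling n multiplies the work by eight, which is why cubic algorithms hit a wall quickly.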

Growth Rate

n        n³
10       1,000
50       125,000
100      1,000,000
500      125,000,000
1,000    1,000,000,000

At n = 1,000, you’re already at a billion operations. Cubic algorithms become impractical for n > ~1,000 on typical hardware.

Classic O(n³) Algorithms

Naive Matrix Multiplication

The standard textbook algorithm for multiplying two n×n matrices:

# Multiply two n×n matrices A and B into C.
C = [[0] * n for _ in range(n)]   # result starts as the zero matrix
for i in range(n):
    for j in range(n):
        for k in range(n):
            C[i][j] += A[i][k] * B[k][j]

Floyd-Warshall Algorithm

Finds shortest paths between all pairs of vertices in a weighted graph with n vertices. Three nested loops over all vertices.
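The three loops can be sketched directly; this minimal version modifies a distance matrix in place, assuming `dist[i][j]` holds the edge weight from i to j (`INF` if no edge, 0 on the diagonal):

```python
INF = float("inf")

def floyd_warshall(dist):
    """All-pairs shortest paths on an n×n distance matrix, O(n³).
    dist is modified in place and returned."""
    n = len(dist)
    for k in range(n):              # allow paths routed through vertex k
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

Note that the intermediate-vertex loop (k) must be the outermost one; swapping it inward breaks the algorithm's correctness argument.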

LU Decomposition

Factors a matrix into lower and upper triangular matrices, used in solving systems of linear equations.
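As a sketch of where the cubic cost comes from, here is a Doolittle-style LU factorization without pivoting (a simplification: production solvers pivot for numerical stability, and this version assumes the leading minors of A are nonzero):

```python
def lu_decompose(A):
    """Factor square matrix A into L (unit lower triangular) and U
    (upper triangular) so that L @ U == A. O(n³) overall."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):       # fill row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        L[i][i] = 1.0
        for j in range(i + 1, n):   # fill column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U
```

Each of the n outer steps does O(n²) work in the inner sums, giving the cubic total.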

Faster Alternatives

  • Strassen’s algorithm reduces matrix multiplication to O(n²·⁸¹) (exponent log₂ 7 ≈ 2.81).
  • Johnson’s algorithm solves all-pairs shortest paths in O(n² log n + n·E) for sparse graphs, beating Floyd-Warshall when edges are few.
  • GPU-accelerated matrix operations use parallelism to handle cubic work in practice.
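To make Strassen's idea concrete, here is an unoptimized sketch for matrices whose size is a power of two: it replaces the eight recursive block products of the naive scheme with seven, which is where the sub-cubic exponent comes from. (Real implementations pad arbitrary sizes and switch to the naive algorithm below a cutoff; this version does neither.)

```python
def strassen(A, B):
    """Strassen multiplication of n×n matrices, n a power of two."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    def quad(M):  # split M into four h×h blocks
        return ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
                [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])

    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    A11, A12, A21, A22 = quad(A)
    B11, B12, B21, B22 = quad(B)
    # Seven products instead of eight — the source of the O(n^2.81) bound.
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```

In practice the extra additions make Strassen slower than tuned naive code until n is in the hundreds, which is why libraries only use it above a threshold.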

Use Case

Cubic algorithms appear in scientific computing, graph theory, and linear algebra. Understanding O(n³) helps you decide when to switch to optimized libraries (like BLAS for matrix operations) or approximate algorithms.
