Linearithmic Time O(n log n) — The Optimal Sorting Boundary
Explore O(n log n) linearithmic complexity, the theoretical lower bound for comparison-based sorting. Covers merge sort, quicksort, heapsort, and the comparison model proof.
Detailed Explanation
What Is O(n log n) Linearithmic Time?
O(n log n) means the algorithm performs roughly n × log₂(n) operations. This is only slightly worse than linear — the logarithmic factor grows very slowly — but dramatically better than quadratic. It is the optimal time complexity for comparison-based sorting: no algorithm that only compares elements can sort asymptotically faster.
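To make the growth rates concrete, a minimal sketch comparing n, n log₂ n, and n² at a few input sizes:

```python
import math

# Compare growth of n, n*log2(n), and n^2 at increasing input sizes.
for n in [10, 1_000, 100_000]:
    print(f"n={n:>7}  n log n={n * math.log2(n):>12.0f}  n^2={n**2:>15}")
```

The log factor barely moves (log₂ of 100,000 is about 17), so n log n stays close to linear while n² explodes.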
The Comparison-Based Sorting Lower Bound
Any sorting algorithm that works by comparing pairs of elements must make at least log₂(n!) comparisons in the worst case. By Stirling’s approximation:
log₂(n!) ≈ n log₂ n - n log₂ e ∈ Θ(n log n)
This means merge sort, quicksort (average), and heapsort are asymptotically optimal among comparison sorts.
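A quick numerical check of the bound, computing log₂(n!) exactly (via `math.lgamma`, to avoid evaluating huge factorials) against the Stirling approximation from the formula above:

```python
import math

# Compare the exact comparison lower bound log2(n!) with Stirling's approximation
# n*log2(n) - n*log2(e). lgamma(n+1) = ln(n!), so dividing by ln(2) gives log2(n!).
for n in [10, 100, 10_000]:
    exact = math.lgamma(n + 1) / math.log(2)            # log2(n!)
    stirling = n * math.log2(n) - n * math.log2(math.e)  # Stirling estimate
    print(f"n={n:>6}  log2(n!)={exact:>12.1f}  Stirling={stirling:>12.1f}")
```

The two columns converge as n grows, confirming that any comparison sort needs on the order of n log n comparisons.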
O(n log n) Sorting Algorithms
| Algorithm | Worst Case | Average Case | Space | Stable? |
|---|---|---|---|---|
| Merge sort | O(n log n) | O(n log n) | O(n) | Yes |
| Quick sort | O(n²) | O(n log n) | O(log n) | No |
| Heap sort | O(n log n) | O(n log n) | O(1) | No |
| Tim sort | O(n log n) | O(n log n) | O(n) | Yes |
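As a reference point for the table, a minimal merge sort sketch — stable, O(n log n) time, O(n) extra space, matching its row above:

```python
def merge_sort(a):
    """Stable O(n log n) sort: split in half, recurse, then merge in O(n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves; uses O(n) auxiliary space.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # '<=' keeps equal keys in order (stability)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```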
Beyond Sorting
O(n log n) also appears in:
- Divide-and-conquer algorithms where you split, solve, and merge (the recurrence T(n) = 2T(n/2) + O(n) yields O(n log n) by the Master Theorem).
- FFT (Fast Fourier Transform) for signal processing and polynomial multiplication.
- Closest pair of points in computational geometry.
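The recurrence T(n) = 2T(n/2) + O(n) can be checked directly by summing the linear combine work at every level of the recursion — a small sketch:

```python
import math

def count_work(n):
    """Total work for T(n) = 2T(n/2) + n: each call does n units of
    combine work, then recurses on the two halves."""
    if n <= 1:
        return 0
    return count_work(n // 2) + count_work(n - n // 2) + n

# Each of the ~log2(n) levels contributes n work, so the total is ~n*log2(n).
for n in [16, 1024, 65_536]:
    print(n, count_work(n), round(n * math.log2(n)))
```

For powers of two the count matches n log₂ n exactly: every level of the recursion tree contributes exactly n units of merge work, and there are log₂ n levels.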
Practical Impact
At n = 1,000,000:
- O(n log n) ≈ 20,000,000 operations
- O(n²) = 1,000,000,000,000 operations
That’s a 50,000x difference — the reason why choosing the right sorting algorithm matters enormously.
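The arithmetic behind those figures, as a quick sanity check:

```python
import math

n = 1_000_000
nlogn = n * math.log2(n)   # ~2.0e7, since log2(1e6) is about 19.93
quad = n ** 2              # exactly 1.0e12
print(f"n log n ≈ {nlogn:,.0f}")
print(f"n^2     = {quad:,}")
print(f"ratio   ≈ {quad / nlogn:,.0f}x")  # on the order of 50,000x
```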
Use Case
O(n log n) is the target complexity for sorting large datasets, building search indices, computing convex hulls, and any divide-and-conquer processing where merge cost is linear.
Related Topics
- Linear Time O(n) — Processing Every Element Once (Complexity Classes)
- Quadratic Time O(n²) — When Nested Loops Dominate (Complexity Classes)
- Sorting Algorithm Comparison — Time, Space, and Stability (Algorithms)
- Best, Worst, and Average Case — Understanding Algorithm Bounds (Analysis Concepts)
- Dynamic Programming Complexity — Trading Space for Time (Analysis Concepts)