ABSTRACT
In time complexity analysis, we care about the growth rate of a function as the input size $n$ approaches infinity. This allows us to ignore hardware differences and focus on the scalability of the algorithm itself.
The Hierarchy of Functions
Ordered from slowest growth (most efficient) to fastest growth (least efficient):
| Notation | Name | Typical Example |
|---|---|---|
| $O(1)$ | Constant | Accessing an array index |
| $O(\log \log n)$ | Double Logarithm | Advanced data structure operations |
| $O(\log n)$ | Logarithmic | Binary Search |
| $O(\log^k n)$ | Poly-logarithmic | Complex nested logs |
| $O(n)$ | Linear | Linear Search, Single loop |
| $O(n \log n)$ | Log-linear | Merge Sort, Quick Sort (average case) |
| $O(n^2)$ | Quadratic | Bubble Sort, Nested loops |
| $O(n^k)$ | Polynomial | Triple nested loops ($O(n^3)$) |
| $O(2^n)$ | Exponential | Recursive Fibonacci, Subset Sum |
| $O(n!)$ | Factorial | Traveling Salesperson (brute force) |
| $O(n^n)$ | Super-Exponential | Extremely inefficient recursion |
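To connect the table to actual code shapes, here is a minimal Python sketch of a few of these classes. The implementations are standard textbook versions, included only to show how loop and recursion structure maps to growth rate:

```python
def constant_access(arr):
    # O(1): one index operation, independent of len(arr)
    return arr[0]

def binary_search(arr, target):
    # O(log n): halves the (sorted) search range each iteration
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def linear_search(arr, target):
    # O(n): a single pass over the input
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1

def bubble_sort(arr):
    # O(n^2): nested loops over the input
    n = len(arr)
    for i in range(n):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

def fib(n):
    # Exponential: naive recursive Fibonacci, commonly bounded as O(2^n)
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```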
Comparison Rules
To determine which algorithm is “better” asymptotically, use these dominant growth rules (a quick numerical check follows the list):
- Polynomials vs. Logarithms: Any positive power of $n$ ($n^\epsilon$ for $\epsilon > 0$) eventually grows faster than any power of $\log n$.
- Polynomials vs. Exponentials: Any exponential ($c^n$ with $c > 1$) eventually grows faster than any polynomial ($n^k$).
- The Base Matters:
  - For $n^a$ vs. $n^b$: The larger exponent wins ($n^b$ dominates when $b > a$).
  - For $a^n$ vs. $b^n$: The larger base wins ($b^n$ dominates when $b > a > 1$).
- Factorials are Massive: $n!$ grows faster than $2^n$, but slower than $n^n$.
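These rules can be checked empirically. The sketch below (input sizes chosen arbitrarily for illustration) tabulates each class so the ordering becomes visible:

```python
import math

# Illustrative check of the dominance rules: print each growth class
# at a few input sizes and watch the asymptotic ordering emerge.
for n in (10, 20, 30):
    print(f"n={n:>2}  log2(n)={math.log2(n):5.2f}  n^3={n**3:>6}  "
          f"2^n={2**n:>11}  n!={math.factorial(n):.2e}  n^n={n**n:.2e}")
```

Already at $n = 30$, $n!$ (about $2.7 \times 10^{32}$) has left $2^n$ (about $10^9$) far behind, while $n^n$ dominates both.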
Simplification Rules (Big-O Properties)
When analyzing code to find the final complexity, follow these three steps (a code sketch follows the list):
1. Drop the Constants
As $n \to \infty$, fixed multipliers become irrelevant: $O(5n)$ and $O(100n)$ both simplify to $O(n)$.
2. Drop Non-Dominant Terms
Only the “fastest growing” term matters: $O(n^2 + n + 10)$ simplifies to $O(n^2)$.
3. Products and Sums
- Sequential Steps: If you do $O(f(n))$ work and then $O(g(n))$ work, the total is $O(f(n) + g(n))$, which simplifies to the dominant of the two.
- Nested Steps: If you do $O(f(n))$ work inside a loop that runs $O(g(n))$ times, the total is $O(f(n) \cdot g(n))$.
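As a concrete illustration of both rules, here is a minimal Python sketch; the array and the constant-time loop bodies are placeholders, not from the notes above:

```python
def sum_then_pairs(arr):
    """Sequential O(n) pass followed by nested O(n^2) loops."""
    n = len(arr)
    total = 0

    # Sequential step: a single pass over the input, O(n).
    for x in arr:
        total += x

    # Nested step: product rule, an O(n) inner loop run n times -> O(n * n).
    for i in range(n):
        for j in range(n):
            total += arr[i] * arr[j]

    # Sum rule: O(n) + O(n^2) = O(n + n^2), and dropping the
    # non-dominant term gives O(n^2) overall.
    return total
```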
🔗 Related Notes
- Asymptotic Notation: The formal definitions of $O$, $\Omega$, and $\Theta$.
- Time Analysis For Recursion: Applying these rules to recurrence relations.
- Stirling’s Approximation: How to handle the growth of $n!$.