ABSTRACT

In time complexity analysis, we care about the growth rate of a function as n approaches infinity. This allows us to ignore hardware differences and focus on the scalability of the algorithm itself.


The Hierarchy of Functions

Ordered from slowest growth (most efficient) to fastest growth (least efficient):

Notation     | Name               | Typical Example
------------ | ------------------ | --------------------------------------
O(1)         | Constant           | Accessing an array index
O(log log n) | Double Logarithmic | Advanced data structure operations
O(log n)     | Logarithmic        | Binary Search
O((log n)^k) | Poly-logarithmic   | Nested logarithmic loops
O(n)         | Linear             | Linear Search, single loop
O(n log n)   | Log-linear         | Merge Sort, Quick Sort (average case)
O(n^2)       | Quadratic          | Bubble Sort, nested loops
O(n^k)       | Polynomial         | Triple nested loops (n^3)
O(2^n)       | Exponential        | Recursive Fibonacci, Subset Sum
O(n!)        | Factorial          | Traveling Salesperson (brute force)
O(n^n)       | Super-Exponential  | Extremely inefficient recursion
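
To make one row of the table concrete: binary search halves the remaining range on every comparison, which is exactly why it sits at O(log n). Below is a minimal Python sketch (illustrative only; the function name and sample values are assumptions of this note, not from the source):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.

    Each comparison halves the remaining search range, so the loop
    runs at most about log2(n) times: O(log n).
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # -> 3
```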

Comparison Rules

To determine which algorithm is “better” asymptotically, use these dominant growth rules:

  1. Polynomials vs. Logarithms: Any positive power of n (even n^0.1) eventually grows faster than any power of log n.
  2. Polynomials vs. Exponentials: Any exponential (c^n with c > 1) eventually grows faster than any polynomial (n^k), as the numeric check after this list shows.
  3. The Base Matters:
    • For n^2 vs n^3: the larger exponent wins (n^3 dominates).
    • For 2^n vs 3^n: the larger base wins (3^n dominates).
  4. Factorials are Massive: n! grows faster than 2^n, but slower than n^n.
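
A quick numeric check of these rules (an illustrative script; the chosen sample sizes are arbitrary): even for modest n, the exponential overtakes the polynomial, which in turn overtakes the poly-logarithm.

```python
# Compare a power of log n, a polynomial, and an exponential side by side.
import math

for n in (10, 20, 40, 80):
    poly_log = math.log2(n) ** 3   # a power of log n
    poly = n ** 2                  # a polynomial
    expo = 2 ** n                  # an exponential
    print(f"n={n:>2}  (log2 n)^3={poly_log:8.1f}  n^2={poly:6d}  2^n={expo}")
```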

Simplification Rules (Big-O Properties)

When analyzing code to find the final complexity, follow these three logic steps:

1. Drop the Constants

As n approaches infinity, fixed multipliers become irrelevant: O(5n) and O(n/2) both simplify to O(n).
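
As a sketch (the function name and helpers are hypothetical, invented for this note): the loop below does three constant-time updates per element, roughly 3n operations in total, yet the multiplier 3 is dropped and the function is O(n).

```python
def summarize(values):
    # Hypothetical example: three constant-time updates per element,
    # i.e. roughly 3n operations total. The constant 3 is dropped,
    # so the overall complexity is O(n).
    total, smallest, largest = 0, float("inf"), float("-inf")
    for v in values:
        total += v                   # constant work
        smallest = min(smallest, v)  # constant work
        largest = max(largest, v)    # constant work
    return total, smallest, largest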

2. Drop Non-Dominant Terms

Only the “fastest growing” term matters: O(n^2 + n + 10) simplifies to O(n^2).
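
A hedged illustration (the function name is an invention of this note): a quadratic phase followed by a linear phase totals O(n^2 + n), and the non-dominant n is dropped.

```python
def pairs_then_scan(values):
    # Phase 1: nested loop over all pairs -> ~n^2 steps (dominant term).
    ordered_pairs = 0
    for a in values:
        for b in values:
            if a < b:
                ordered_pairs += 1
    # Phase 2: single pass -> n steps (non-dominant, dropped).
    positives = sum(1 for v in values if v > 0)
    # Total work: O(n^2 + n) = O(n^2).
    return ordered_pairs, positives
```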

3. Products and Sums

  • Sequential Steps: If you do O(A) work and then O(B) work, the total is O(A + B).
  • Nested Steps: If you do O(A) work inside a loop that runs O(B) times, the total is O(A × B). Both cases appear in the sketch below.
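
The following sketch combines both rules (the function name and the pair-counting task are hypothetical, chosen just to exercise the arithmetic):

```python
def sequential_and_nested(values):
    n = len(values)

    # Sequential steps (sum rule): this O(n) pass followed by the
    # O(n^2) block below totals O(n + n^2) = O(n^2).
    total = sum(values)  # O(n)

    # Nested steps (product rule): O(n) inner work repeated n times
    # multiplies out to O(n * n) = O(n^2).
    matching_pairs = 0
    for i in range(n):           # loop runs n times
        for j in range(n):       # O(n) work per outer iteration
            if values[i] + values[j] == total:
                matching_pairs += 1
    return matching_pairs
```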