Chapter 3: Characterizing Running Times: Asymptotic Notation
The three primary notations are big-Θ, which describes a tight bound where a function grows at the same rate as another up to constant factors; big-O, which provides an asymptotic upper bound; and big-Ω, which provides an asymptotic lower bound. All three are defined using a positive constant c and a threshold n₀ beyond which the inequality must hold. For stricter theoretical comparisons, little-o and little-ω denote non-tight upper and lower bounds respectively, indicating strictly slower or faster growth. The full set of notations maps intuitively onto the familiar numeric comparisons ≤, ≥, =, <, and >, with the caveat that, unlike real numbers, asymptotic trichotomy does not always hold: two functions need not be asymptotically comparable.

The chapter then surveys the standard mathematical functions that appear throughout algorithm analysis: floors and ceilings, modulo arithmetic, polynomials (where a degree-d polynomial is Θ(nᵈ)), and exponentials, which dominate all polynomials for large n. Logarithms grow slower than any polynomial, while factorials grow faster than any exponential; factorials are approximated by Stirling's formula, which yields lg(n!) = Θ(n lg n).

Rounding out the chapter are functional iteration and the Fibonacci numbers, the latter growing exponentially at approximately φⁱ/√5, where φ ≈ 1.618 is the golden ratio, and appearing naturally in the analysis of divide-and-conquer and dynamic programming algorithms.
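The role of the constants c and n₀ in the big-O definition can be checked numerically. This is an illustrative sketch: the particular function f(n) = 3n² + 10n and the witnesses c = 4, n₀ = 10 are my own choices, not taken from the chapter.

```python
# Big-O definition: f(n) is O(g(n)) if there exist positive constants
# c and n0 such that 0 <= f(n) <= c * g(n) for all n >= n0.
# Illustrative example: f(n) = 3n^2 + 10n is O(n^2) with c = 4, n0 = 10,
# since 3n^2 + 10n <= 4n^2  <=>  10n <= n^2  <=>  n >= 10.

def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c, n0 = 4, 10

# The inequality holds for every n at or beyond the threshold n0 ...
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
# ... but fails just below it, which is why the threshold n0 is part
# of the definition: the bound only has to hold "eventually".
assert f(n0 - 1) > c * g(n0 - 1)
```

A different choice of c changes the required n₀ (for example c = 5 works with n₀ = 5); the definition only requires that *some* pair of witnesses exists.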
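The claim lg(n!) = Θ(n lg n) can be observed directly: by Stirling's formula, lg(n!) = n lg n − n lg e + O(lg n), so the ratio lg(n!) / (n lg n) tends to 1. A small sketch using the standard library (`math.lgamma` computes ln(n!) without overflow):

```python
import math

# Stirling: lg(n!) = n lg n - n lg e + O(lg n), hence lg(n!) = Theta(n lg n)
# and the ratio lg(n!) / (n lg n) approaches 1 as n grows (slowly, since the
# second-order term -n lg e is only a lg(n)-factor smaller than the leading one).

def lg_factorial(n):
    # lgamma(n + 1) = ln(n!); divide by ln 2 to convert to base-2 logarithm.
    return math.lgamma(n + 1) / math.log(2)

ratios = [lg_factorial(n) / (n * math.log2(n)) for n in (10, 1_000, 1_000_000)]
for n, r in zip((10, 1_000, 1_000_000), ratios):
    print(n, round(r, 4))
```

The ratios increase toward 1, confirming the Θ(n lg n) growth rate while also showing how slowly the lower-order −n lg e term fades.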
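The approximation Fᵢ ≈ φⁱ/√5 is in fact exact after rounding, because the discarded second term of the closed form has absolute value below 1/2. A minimal sketch comparing the iterative definition against the rounded formula (the cap of 60 is my own choice, to stay safely within double-precision accuracy of `phi**i`):

```python
import math

# F_i equals phi^i / sqrt(5) rounded to the nearest integer: the closed form
# is F_i = (phi^i - psi^i) / sqrt(5) with psi = 1 - phi ≈ -0.618, and
# |psi^i / sqrt(5)| < 1/2 for all i >= 0, so rounding removes it exactly.
phi = (1 + math.sqrt(5)) / 2

def fib(i):
    a, b = 0, 1            # F_0 = 0, F_1 = 1
    for _ in range(i):
        a, b = b, a + b    # slide the window: (F_k, F_{k+1}) -> (F_{k+1}, F_{k+2})
    return a

for i in range(61):
    assert fib(i) == round(phi**i / math.sqrt(5))
```

This exponential growth is why, for example, the naive recursive Fibonacci computation takes time exponential in i, and why memoization or bottom-up iteration is the standard dynamic-programming fix.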