Chapter 27: Online Algorithms: Caching, Search Lists, and Competitive Analysis
Competitive analysis provides the formal framework for evaluating online performance: it compares the cost A(I) of an online algorithm to the cost F(I) of an optimal offline algorithm on the same input I, and defines the competitive ratio as the worst-case ratio A(I)/F(I) over all inputs. An algorithm is c-competitive if its cost never exceeds c times the offline optimum. A simple elevator example illustrates the hedging principle: always waiting for the elevator and always taking the stairs both yield poor competitive ratios, while waiting no longer than the stairs trip would take before switching achieves a competitive ratio of 2, the best possible.

The move-to-front (MTF) heuristic for search-list maintenance accesses an element at position r for cost 2r−1 and immediately moves it to the front of the list. Theorem 27.1 proves that MTF is 4-competitive against FORESEE, an optimal offline algorithm, using an inversion-count argument to bound how far the two algorithms' list orderings can drift apart.

In the online caching problem, a fixed-size cache of k blocks must serve a sequence of block requests, choosing an eviction victim on each miss without knowing future requests. LRU and FIFO both achieve competitive ratio O(k), while LIFO and LFU perform arbitrarily poorly, and Theorem 27.4 establishes that no deterministic caching algorithm can do better than Ω(k).

Randomized algorithms can beat this deterministic lower bound. The RANDOMIZED-MARKING algorithm tracks marked blocks and evicts a uniformly random unmarked block on each miss, achieving an expected competitive ratio of O(log k) against an oblivious adversary, one that cannot observe the algorithm's random choices.
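The elevator hedging argument can be sketched numerically. This is a minimal model, not the chapter's exact setup: waiting costs 1 per minute, the stairs take s minutes, the elevator ride itself is treated as free, and the function names (`hedge_cost`, `offline_cost`, `worst_ratio`) are illustrative.

```python
def hedge_cost(t, s):
    """Online hedging strategy: wait up to s minutes for the elevator,
    then give up and take the stairs.  t is the elevator's arrival time."""
    return t if t <= s else 2 * s  # waited s minutes, then s more on the stairs

def offline_cost(t, s):
    """The offline optimum knows the arrival time t in advance,
    so it pays the cheaper of waiting or walking."""
    return min(t, s)

def worst_ratio(s, horizon):
    """Worst-case ratio of online to offline cost over arrival times 1..horizon-1."""
    return max(hedge_cost(t, s) / offline_cost(t, s) for t in range(1, horizon))
```

For any arrival time t ≤ s the hedger pays exactly what the optimum pays; for t > s it pays 2s against the optimum's s, so the ratio never exceeds 2.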
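The MTF cost model can be made concrete with a short sketch. Under the chapter's accounting, finding the element at position r costs r and each of the r−1 transpositions that carry it to the front costs 1, for 2r−1 in total; the function name here is an assumption.

```python
def mtf_cost(requests, initial):
    """Serve a request sequence with move-to-front and return the total cost.
    Accessing the element at 1-based position r costs r, and the r-1
    transpositions that move it to the front cost 1 each: 2r-1 per access."""
    lst = list(initial)
    total = 0
    for x in requests:
        r = lst.index(x) + 1          # 1-based position of the requested element
        total += 2 * r - 1            # search cost r plus r-1 transpositions
        lst.insert(0, lst.pop(r - 1)) # move x to the front
    return total
```

On the list [a, b, c], two consecutive requests for c cost 5 (position 3) and then 1 (now at the front), showing how MTF rewards locality in the request sequence.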
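The gap between an online eviction rule and the offline optimum can be observed directly. The sketch below counts misses for LRU and for Belady's furthest-in-future rule, which is the classical offline optimum for caching (playing the role the chapter assigns to a clairvoyant algorithm); the function names are illustrative.

```python
def lru_misses(requests, k):
    """Miss count for LRU with a cache of k blocks (most recent kept last)."""
    cache, misses = [], 0
    for b in requests:
        if b in cache:
            cache.remove(b)           # refresh recency
        else:
            misses += 1
            if len(cache) == k:
                cache.pop(0)          # evict the least recently used block
        cache.append(b)
    return misses

def opt_misses(requests, k):
    """Miss count for the offline optimum: on a miss with a full cache,
    evict the block whose next request lies furthest in the future."""
    cache, misses = set(), 0
    for i, b in enumerate(requests):
        if b in cache:
            continue
        misses += 1
        if len(cache) == k:
            def next_use(c):
                for j in range(i + 1, len(requests)):
                    if requests[j] == c:
                        return j
                return float('inf')   # never requested again: ideal victim
            cache.remove(max(cache, key=next_use))
        cache.add(b)
    return misses
```

On short adversarial sequences LRU already pays strictly more than the optimum, and the deterministic lower bound says this gap can be driven up to a factor of k.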
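A minimal sketch of the marking idea, under these assumptions: every requested block is marked, and when a miss finds the cache full with every block marked, a new phase begins and all marks are cleared before a random unmarked victim is chosen. The function name and the seeded generator are illustrative, and blocks are assumed sortable only so the random choice is reproducible.

```python
import random

def marking_misses(requests, k, rng=None):
    """Miss count for a marking-style algorithm: evict a uniformly random
    unmarked block on each miss, unmarking everything when a phase ends."""
    rng = rng or random.Random(0)     # seeded for reproducibility
    cache, marked = set(), set()
    misses = 0
    for b in requests:
        if b not in cache:
            misses += 1
            if len(cache) == k:
                unmarked = cache - marked
                if not unmarked:      # all k blocks marked: phase ends
                    marked.clear()
                    unmarked = set(cache)
                cache.remove(rng.choice(sorted(unmarked)))
        cache.add(b)
        marked.add(b)                 # requested blocks are always marked
    return misses
```

When only k distinct blocks are requested, the algorithm misses exactly once per block regardless of its random choices; the randomness only matters once the adversary forces evictions, which is where the O(log k) expected ratio against an oblivious adversary comes from.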