Chapter 14: Dynamic Programming: Rod Cutting, LCS, and Optimal BSTs


The design process follows four steps: (1) characterize the structure of an optimal solution, (2) define its value recursively, (3) compute that value either bottom-up via tabulation or top-down via memoization, and (4) optionally reconstruct the full solution from the information recorded along the way.

The rod-cutting problem illustrates this concretely: a naïve recursive approach runs in O(2ⁿ) time, but both memoization and bottom-up DP reduce it to O(n²) by storing previously computed revenues. Note that a greedy price-per-inch strategy is not always optimal.

Matrix-chain multiplication demonstrates a more complex application, where the goal is to find the parenthesization of a matrix product that minimizes the number of scalar multiplications. It is solved in O(n³) time by filling tables m[i,j] and s[i,j] that record minimum costs and split points respectively.

The longest common subsequence (LCS) problem finds the longest shared subsequence between two strings in Θ(mn) time using a two-dimensional table, with space reducible to O(min(m,n)) when only the length is needed.

Finally, optimal binary search trees minimize expected search cost over a set of keys with known access probabilities, including dummy keys for unsuccessful searches, by building tables of expected costs and optimal roots in O(n³) time and Θ(n²) space, with practical applications in compiler optimization and database indexing.
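The rod-cutting discussion above mentions both the top-down memoized and the bottom-up tabulated approaches. Here is a minimal Python sketch of both, assuming `prices[i-1]` is the revenue for a piece of length `i` (function names are illustrative, not from the text):

```python
from functools import lru_cache

def cut_rod_bottom_up(prices, n):
    """Bottom-up O(n^2) rod cutting: r[j] holds the best revenue for length j."""
    r = [0] * (n + 1)
    for j in range(1, n + 1):
        # Try every first-cut length i, then use the already-computed r[j - i].
        r[j] = max(prices[i - 1] + r[j - i] for i in range(1, j + 1))
    return r[n]

def cut_rod_memoized(prices, n):
    """Top-down O(n^2) rod cutting via memoization of the same recurrence."""
    @lru_cache(maxsize=None)
    def best(j):
        if j == 0:
            return 0
        return max(prices[i - 1] + best(j - i) for i in range(1, j + 1))
    return best(n)
```

With the standard CLRS price table (1, 5, 8, 9, 10, 17, 17, 20, 24, 30 for lengths 1 through 10), a rod of length 4 is best cut into two pieces of length 2, for revenue 10; a greedy price-per-inch choice would not find this.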
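The matrix-chain tables m[i,j] and s[i,j] described above can be filled as follows. This is a sketch of the classic O(n³) algorithm, where `p` lists the matrix dimensions (matrix Aᵢ is p[i-1] × p[i]); the reconstruction helper for step 4 is included:

```python
def matrix_chain_order(p):
    """Fill m[i][j] (min scalar multiplications) and s[i][j] (optimal split)."""
    n = len(p) - 1  # number of matrices
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):          # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = float("inf")
            for k in range(i, j):           # try every split point
                cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < m[i][j]:
                    m[i][j] = cost
                    s[i][j] = k
    return m, s

def optimal_parens(s, i, j):
    """Reconstruct the optimal parenthesization from the split table s."""
    if i == j:
        return f"A{i}"
    k = s[i][j]
    return "(" + optimal_parens(s, i, k) + optimal_parens(s, k + 1, j) + ")"
```

For the CLRS example dimensions 30, 35, 15, 5, 10, 20, 25, the minimum cost is 15125 scalar multiplications with parenthesization ((A1(A2A3))((A4A5)A6)).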
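The Θ(mn) LCS table and the O(min(m,n))-space length-only variant mentioned above can be sketched like this (the two-row version is my illustration of the space reduction, not code from the text):

```python
def lcs_length(x, y):
    """Full Θ(mn) table: c[i][j] is the LCS length of x[:i] and y[:j]."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]

def lcs_length_two_rows(x, y):
    """Same recurrence, keeping only two rows of width min(m, n) + 1."""
    if len(y) > len(x):
        x, y = y, x          # iterate rows over the longer string
    prev = [0] * (len(y) + 1)
    for i in range(1, len(x) + 1):
        curr = [0] * (len(y) + 1)
        for j in range(1, len(y) + 1):
            if x[i - 1] == y[j - 1]:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr
    return prev[len(y)]
```

On the book's example strings "ABCBDAB" and "BDCABA", both versions report length 4 (one LCS is "BCBA"); the two-row version cannot reconstruct the subsequence itself, which is exactly the trade-off the chapter notes.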
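The optimal-BST construction above builds tables of expected costs and optimal roots, with p[1..n] the probabilities of searching for the n keys and q[0..n] the probabilities of landing on the dummy keys between them. A minimal Python sketch of that O(n³) table fill:

```python
def optimal_bst(p, q, n):
    """e[i][j]: expected cost of an optimal BST over keys k_i..k_j;
    w[i][j]: total probability of that subtree; root[i][j]: its optimal root.
    p is 1-indexed (p[0] unused), q is 0-indexed, as in CLRS."""
    e = [[0.0] * (n + 1) for _ in range(n + 2)]
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    root = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(1, n + 2):
        e[i][i - 1] = q[i - 1]      # empty subtree: just the dummy key
        w[i][i - 1] = q[i - 1]
    for length in range(1, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            e[i][j] = float("inf")
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            for r in range(i, j + 1):   # try each key as the subtree root
                t = e[i][r - 1] + e[r + 1][j] + w[i][j]
                if t < e[i][j]:
                    e[i][j] = t
                    root[i][j] = r
    return e, root
```

On the CLRS example probabilities (p = 0.15, 0.10, 0.05, 0.10, 0.20 and q = 0.05, 0.10, 0.05, 0.05, 0.05, 0.10), the expected search cost of the optimal tree is 2.75 with k₂ at the root.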