Chapter 8 Dynamic Programming
Copyright © 2007 Pearson Addison-Wesley. All rights reserved.


Dynamic Programming

Dynamic Programming is a general algorithm design technique for solving problems defined by or formulated as recurrences with overlapping subinstances. It was invented by American mathematician Richard Bellman in the 1950s to solve optimization problems and was later assimilated by CS. "Programming" here means "planning".

Main idea:
- set up a recurrence relating a solution to a larger instance to solutions of some smaller instances
- solve smaller instances once
- record solutions in a table
- extract the solution to the initial instance from that table

Example: Fibonacci numbers

Recall the definition of the Fibonacci numbers:
F(n) = F(n-1) + F(n-2)
F(0) = 0
F(1) = 1

Computing the nth Fibonacci number recursively (top-down) re-solves the same subinstances over and over:

                  F(n)
         F(n-1)         F(n-2)
     F(n-2) F(n-3)   F(n-3) F(n-4)
     ...
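The contrast between the naive top-down recursion and solving each subinstance once can be sketched in Python (this sketch is not part of the original slides; the call counter is added for illustration):

```python
from functools import lru_cache

calls = 0

def fib_naive(n):
    """Plain top-down recursion: re-solves the same subinstances many times."""
    global calls
    calls += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Same recurrence, but each subinstance is solved once and its result recorded."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(20), calls)   # 6765 21891 -- over twenty thousand calls
print(fib_memo(20))           # 6765, with only 21 distinct subinstances solved
```

The exponential blow-up of the naive version is exactly the "overlapping subinstances" phenomenon that dynamic programming removes.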


Example: Fibonacci numbers (cont.)

Computing the nth Fibonacci number using bottom-up iteration and recording results:
F(0) = 0
F(1) = 1
F(2) = 1 + 0 = 1
...
F(n-2) =
F(n-1) =
F(n) = F(n-1) + F(n-2)

Efficiency:
- time: Θ(n)
- space: Θ(n)
What if we solve it recursively?

Examples of DP algorithms
- computing a binomial coefficient
- longest common subsequence
- Warshall's algorithm for transitive closure
- Floyd's algorithm for all-pairs shortest paths
- constructing an optimal binary search tree
- some instances of difficult discrete optimization problems: traveling salesman, knapsack

Computing a binomial coefficient by DP

Binomial coefficients are the coefficients of the binomial formula:
(a + b)^n = C(n,0) a^n b^0 + ... + C(n,k) a^(n-k) b^k + ... + C(n,n) a^0 b^n

Recurrence: C(n,k) = C(n-1,k) + C(n-1,k-1) for n > k > 0,
with C(n,0) = 1 and C(n,n) = 1 for n ≥ 0.

The value of C(n,k) can be computed by filling a table row by row:

         0   1   2  ...  k-1         k
   0     1
   1     1   1
   ...
   n-1                   C(n-1,k-1)  C(n-1,k)
   n                                 C(n,k)
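The table-filling scheme above can be sketched in Python (a minimal illustration, not from the slides):

```python
def binomial(n, k):
    """Fill the C(i,j) table row by row: C(i,j) = C(i-1,j-1) + C(i-1,j)."""
    C = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(min(i, k) + 1):
            if j == 0 or j == i:
                C[i][j] = 1                       # base cases C(i,0) = C(i,i) = 1
            else:
                C[i][j] = C[i - 1][j - 1] + C[i - 1][j]
    return C[n][k]

print(binomial(10, 5))  # 252
```

Each row depends only on the previous one, matching the Θ(nk) time and space bounds stated on the next slide (space can be cut to Θ(k) by keeping a single row).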

Computing C(n,k): pseudocode and analysis

Time efficiency: Θ(nk)
Space efficiency: Θ(nk)

Knapsack Problem by DP

Given n items of
  integer weights: w1, w2, ..., wn
  values:          v1, v2, ..., vn
and a knapsack of integer capacity W, find the most valuable subset of the items that fits into the knapsack.

Consider the instance defined by the first i items and capacity j (j ≤ W), and let V[i,j] be the optimal value of such an instance. Then

  V[i,j] = max {V[i-1,j], vi + V[i-1,j-wi]}   if j - wi ≥ 0
  V[i,j] = V[i-1,j]                           if j - wi < 0

Initial conditions: V[0,j] = 0 and V[i,0] = 0.

Knapsack Problem by DP (example)

Example: knapsack of capacity W = 5
  item  weight  value
  1     2       $12
  2     1       $10
  3     3       $20
  4     2       $15

                        capacity j
                    0   1   2   3   4   5
                0   0   0   0   0   0   0
  w1=2, v1=12   1   0   0  12  12  12  12
  w2=1, v2=10   2   0  10  12  22  22  22
  w3=3, v3=20   3   0  10  12  22  30  32
  w4=2, v4=15   4   0  10  15  25  30  37

Knapsack Problem by DP (pseudocode)

  Algorithm DPKnapsack(w[1..n], v[1..n], W)
  var V[0..n, 0..W], P[1..n, 1..W]: int
  for j := 0 to W do V[0,j] := 0
  for i := 0 to n do V[i,0] := 0
  for i := 1 to n do
      for j := 1 to W do
          if w[i] ≤ j and v[i] + V[i-1, j-w[i]] > V[i-1, j]
              then V[i,j] := v[i] + V[i-1, j-w[i]]; P[i,j] := j - w[i]
              else V[i,j] := V[i-1, j]; P[i,j] := j
  return V[n,W] and the optimal subset by backtracing

Running time and space: O(nW).

Longest Common Subsequence (LCS)

A subsequence of a sequence/string S is obtained by deleting zero or more symbols from S. For example, the following are some subsequences of "president": pred, sdn, predent. In other words, the letters of a subsequence of S appear in order in S, but they are not required to be consecutive.
The longest common subsequence problem is to find a maximum-length common subsequence between two sequences.

LCS

For instance:
  Sequence 1: president
  Sequence 2: providence
Its LCS is priden.
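The knapsack pseudocode translates directly to Python. In this sketch (0-based lists; names are illustrative, not from the slides) the optimal subset is recovered by comparing adjacent table rows rather than via the P table, which gives the same answer:

```python
def dp_knapsack(w, v, W):
    """Bottom-up table V[i][j]: best value using the first i items with capacity j."""
    n = len(w)
    V = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, W + 1):
            V[i][j] = V[i - 1][j]                  # item i left out
            if w[i - 1] <= j:                      # item i fits: maybe take it
                V[i][j] = max(V[i][j], v[i - 1] + V[i - 1][j - w[i - 1]])
    # backtrace the optimal subset (1-based item numbers)
    subset, j = [], W
    for i in range(n, 0, -1):
        if V[i][j] != V[i - 1][j]:                 # value changed, so item i was taken
            subset.append(i)
            j -= w[i - 1]
    return V[n][W], sorted(subset)

value, items = dp_knapsack([2, 1, 3, 2], [12, 10, 20, 15], 5)
print(value, items)  # 37 [1, 2, 4]
```

Running it on the W = 5 example reproduces the table's final entry, $37, achieved by items 1, 2, and 4.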

LCS

Another example:
  Sequence 1: algorithm
  Sequence 2: alignment
One of its LCSs is algm.

How to compute an LCS

Let A = a1 a2 ... am and B = b1 b2 ... bn, and let len(i, j) be the length of an LCS between a1 a2 ... ai and b1 b2 ... bj. With proper initializations, len(i, j) can be computed as follows:

  len(i, j) = 0                                 if i = 0 or j = 0
  len(i, j) = len(i-1, j-1) + 1                 if i, j > 0 and ai = bj
  len(i, j) = max {len(i-1, j), len(i, j-1)}    if i, j > 0 and ai ≠ bj
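The len recurrence, together with a backtrace that recovers one LCS, can be sketched in Python (not from the slides):

```python
def lcs(a, b):
    """L[i][j] = length of an LCS of a[:i] and b[:j]; backtrace recovers one LCS."""
    m, n = len(a), len(b)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    # backtrace from L[m][n], collecting matched symbols in reverse
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif L[i - 1][j] >= L[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("president", "providence"))  # priden
```

The backtrace follows which of the three recurrence cases produced each table entry, so it costs only O(m + n) extra steps.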

Running time and memory: O(mn) and O(mn). An LCS itself (not just its length) can be recovered by backtracing through the len table from len(m, n).

Warshall's Algorithm: Transitive Closure

Computes the transitive closure of a relation. Alternatively: decides the existence of all nontrivial paths in a digraph.

Example of transitive closure (adjacency matrix A and its closure T):

  A:  0 0 1 0      T:  0 0 1 0
      1 0 0 1          1 1 1 1
      0 0 0 0          0 0 0 0
      0 1 0 0          1 1 1 1

Warshall's Algorithm

Constructs the transitive closure T as the last matrix in the sequence of n-by-n matrices R(0), ..., R(k), ..., R(n), where R(k)[i,j] = 1 iff there is a nontrivial path from i to j with only the first k vertices allowed as intermediate. Note that R(0) = A (the adjacency matrix) and R(n) = T (the transitive closure).

  R(0): 0 0 1 0   R(1): 0 0 1 0   R(2): 0 0 1 0   R(3): 0 0 1 0   R(4): 0 0 1 0
        1 0 0 1         1 0 1 1         1 0 1 1         1 0 1 1         1 1 1 1
        0 0 0 0         0 0 0 0         0 0 0 0         0 0 0 0         0 0 0 0
        0 1 0 0         0 1 0 0         1 1 1 1         1 1 1 1         1 1 1 1

Warshall's Algorithm (recurrence)

On the k-th iteration, the algorithm determines, for every pair of vertices i, j, whether a path exists from i to j with just the vertices 1, ..., k allowed as intermediate:

  R(k)[i,j] = R(k-1)[i,j]                     (path using just 1, ..., k-1)
              or
              R(k-1)[i,k] and R(k-1)[k,j]     (path from i to k and from k to j
                                               using just 1, ..., k-1)

Initial condition: R(0) = A, the adjacency matrix.

Warshall's Algorithm (matrix generation)

The recurrence relating the elements of R(k) to those of R(k-1) is:

  R(k)[i,j] = R(k-1)[i,j] or (R(k-1)[i,k] and R(k-1)[k,j])

It implies the following rules for generating R(k) from R(k-1):
Rule 1: If an element in row i and column j is 1 in R(k-1), it remains 1 in R(k).
Rule 2: If an element in row i and column j is 0 in R(k-1), it is changed to 1 in R(k) if and only if the element in its row i and column k and the element in its column j and row k are both 1's in R(k-1).

Warshall's Algorithm (example)

  R(0): 0 0 1 0   R(1): 0 0 1 0   R(2): 0 0 1 0   R(3): 0 0 1 0   R(4): 0 0 1 0
        1 0 0 1         1 0 1 1         1 0 1 1         1 0 1 1         1 1 1 1
        0 0 0 0         0 0 0 0         0 0 0 0         0 0 0 0         0 0 0 0
        0 1 0 0         0 1 0 0         1 1 1 1         1 1 1 1         1 1 1 1

Warshall's Algorithm (pseudocode and analysis)

Time efficiency: Θ(n³)
Space efficiency: the matrices can be written over their predecessors (with some care), so it's Θ(n²).
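A Python sketch of the triple loop, using a single matrix overwritten in place as the analysis suggests (not from the slides):

```python
def warshall(A):
    """Transitive closure: after iteration k, R[i][j] = 1 iff a path i -> j
    exists using only vertices 0..k as intermediate."""
    n = len(A)
    R = [row[:] for row in A]            # R(0) is a copy of the adjacency matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return R

A = [[0, 0, 1, 0],
     [1, 0, 0, 1],
     [0, 0, 0, 0],
     [0, 1, 0, 0]]
print(warshall(A))
# [[0, 0, 1, 0], [1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 1, 1]]
```

Overwriting in place is safe here: row k and column k do not change during iteration k, which is why Θ(n²) space suffices.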


Floyd's Algorithm: All-Pairs Shortest Paths

Problem: In a weighted (di)graph, find the shortest paths between every pair of vertices.
Same idea: construct the solution through a series of matrices D(0), ..., D(n) using increasing subsets of the vertices allowed as intermediate.

Floyd's Algorithm (matrix generation)

On the k-th iteration, the algorithm determines the shortest paths between every pair of vertices i, j that use only the vertices among 1, ..., k as intermediate:

  D(k)[i,j] = min {D(k-1)[i,j], D(k-1)[i,k] + D(k-1)[k,j]}

Initial condition: D(0) = W, the weight matrix (∞ where there is no edge).

Floyd's Algorithm (example)

  D(0): 0 ∞ 3 ∞   D(1): 0 ∞ 3 ∞   D(2): 0 ∞ 3 ∞   D(3): 0 10 3 4   D(4): 0 10 3 4
        2 0 ∞ ∞         2 0 5 ∞         2 0 5 ∞         2 0  5 6         2 0  5 6
        ∞ 7 0 1         ∞ 7 0 1         9 7 0 1         9 7  0 1         7 7  0 1
        6 ∞ ∞ 0         6 ∞ 9 0         6 ∞ 9 0         6 16 9 0         6 16 9 0
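The same matrix generation in Python, again overwriting a single matrix in place and using float infinity for missing edges (a sketch, not from the slides):

```python
INF = float("inf")

def floyd(W):
    """All-pairs shortest paths:
    D[i][j] = min(D[i][j], D[i][k] + D[k][j]) for each intermediate vertex k."""
    n = len(W)
    D = [row[:] for row in W]            # D(0) is a copy of the weight matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

W = [[0,   INF, 3,   INF],
     [2,   0,   INF, INF],
     [INF, 7,   0,   1],
     [6,   INF, INF, 0]]
print(floyd(W))
# [[0, 10, 3, 4], [2, 0, 5, 6], [7, 7, 0, 1], [6, 16, 9, 0]]
```

On the slide's example this reproduces D(4) exactly; e.g. the shortest 3-to-1 path drops from 9 to 7 only at k = 4, via the edges 3 -> 4 -> 1.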

Floyd's Algorithm (pseudocode and analysis)

Time efficiency: Θ(n³)
Space efficiency: the matrices can be written over their predecessors.
Note: the algorithm works on graphs with negative edges but without negative cycles.
The shortest paths themselves can be found, too. How? Whenever D[i,k] + D[k,j] < D[i,j], set P[i,j] ← k; the superscripts k or k-1 make no difference to D[i,k] and D[k,j].

Optimal Binary Search Trees

Problem: Given n keys a1 < ... < an and probabilities p1, ..., pn of searching for them, find a BST with a minimum average number of comparisons in a successful search.
Since the total number of BSTs with n nodes is given by C(2n,n)/(n+1), which grows exponentially, brute force is hopeless.

Example: What is an optimal BST for keys A, B, C, and D with search probabilities 0.1, 0.2, 0.4, and 0.3, respectively? It is the tree with C at the root, B and D as its children, and A as B's left child:

  Average number of comparisons = 1·0.4 + 2·(0.2 + 0.3) + 3·0.1 = 1.7

DP for Optimal BST Problem

Let C[i,j] be the minimum average number of comparisons made in T[i,j], an optimal BST for keys ai < ... < aj, where 1 ≤ i ≤ j ≤ n. Considering all BSTs with some ak (i ≤ k ≤ j) as their root, T[i,j] is the best among them:

  C[i,j] = min over i ≤ k ≤ j of { pk · 1
             + Σ (s = i to k-1) ps · (level of as in T[i,k-1] + 1)
             + Σ (s = k+1 to j) ps · (level of as in T[k+1,j] + 1) }

Analysis: DP for Optimal BST Problem

Time efficiency: Θ(n³), but it can be reduced to Θ(n²) by taking advantage of the monotonicity of the entries in the root table, i.e., R[i,j] always lies between R[i,j-1] and R[i+1,j].
Space efficiency: Θ(n²)
The method can be expanded to include unsuccessful searches.
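The C[i,j] recurrence simplifies to C[i,j] = min over k of (C[i,k-1] + C[k+1,j]) plus the total probability of keys i..j, since choosing root ak pushes every key in both subtrees one level deeper. A Python sketch of this form (names and table layout are illustrative, not from the slides), run on the A, B, C, D example:

```python
def optimal_bst(p):
    """C[i][j] = min average comparisons for keys i..j (1-based);
    R[i][j] = index of the root chosen for that subproblem.
    Rows run 0..n+1 so that the empty-tree cases C[i][i-1] = 0 exist."""
    n = len(p)
    C = [[0.0] * (n + 1) for _ in range(n + 2)]
    R = [[0] * (n + 1) for _ in range(n + 2)]
    for i in range(1, n + 1):
        C[i][i] = p[i - 1]
        R[i][i] = i
    for d in range(1, n):                      # d = j - i, growing subproblem size
        for i in range(1, n - d + 1):
            j = i + d
            psum = sum(p[i - 1:j])             # total probability of keys i..j
            best, root = min(
                (C[i][k - 1] + C[k + 1][j], k) for k in range(i, j + 1)
            )
            C[i][j], R[i][j] = best + psum, root
    return C[1][n], R

cost, R = optimal_bst([0.1, 0.2, 0.4, 0.3])
print(round(cost, 1), R[1][4])  # 1.7 3  (the 3rd key, C, is the overall root)
```

The R table is what the backtrace uses to rebuild the tree: R[1][4] = 3 says C is the root, then R[1][2] and R[4][4] give the subtrees' roots, matching the slide's optimal tree and its 1.7 average.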
