Chapter 9 Greedy Technique
Copyright © 2007 Pearson Addison-Wesley. All rights reserved.

Greedy Technique

Constructs a solution to an optimization problem piece by piece through a sequence of choices that are:
- feasible, i.e., satisfying the constraints
- locally optimal (with respect to some neighborhood definition), greedy in terms of some measure
- irrevocable

For some problems, the greedy technique yields a globally optimal solution for every instance. For most, it does not, but it can still be useful for fast approximations. We are mostly interested in the former case in this class.
The problem is defined by an objective function and a set of constraints.

Applications of the Greedy Strategy

Optimal solutions:
- change making for "normal" coin denominations
- minimum spanning tree (MST)
- single-source shortest paths
- simple scheduling problems
- Huffman codes

Approximations/heuristics:
- traveling salesman problem (TSP)
- knapsack problem
- other combinatorial optimization problems

Change-Making Problem

Given unlimited amounts of coins of denominations d1 > ... > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c
Greedy solution: <1, 2, 0, 3>, i.e., one quarter, two dimes, and three pennies.
The greedy solution is optimal for any amount with this "normal" set of denominations, but it may not be optimal for arbitrary coin denominations.
For example, with d1 = 25c, d2 = 10c, d3 = 1c and n = 30c, greedy uses six coins (one quarter and five pennies) while three dimes suffice.
Exercise: Prove that the greedy algorithm is optimal for the denominations above.
Question: What are the objective function and the constraints?
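A minimal sketch of the greedy rule described above, assuming the denominations are supplied in decreasing order; the function name greedy_change and the list-of-counts return format are illustrative choices, not from the slides.

```python
def greedy_change(denominations, n):
    """Greedy change making: repeatedly take the largest coin that still fits.

    denominations must be sorted in decreasing order (d1 > ... > dm).
    Returns the coin counts <c1, ..., cm>.
    """
    counts = []
    for d in denominations:
        counts.append(n // d)  # take as many coins of this denomination as possible
        n %= d                 # amount still left to change
    return counts

print(greedy_change([25, 10, 5, 1], 48))  # [1, 2, 0, 3] -- optimal for "normal" denominations
print(greedy_change([25, 10, 1], 30))     # [1, 0, 5] -- six coins, although three dimes suffice
```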

Minimum Spanning Tree (MST)

- Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G's vertices.
- Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight.

Prim's MST algorithm

- Start with tree T1 consisting of one (any) vertex and "grow" the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, ..., Tn.
- On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is the "greedy" step!).
- Stop when all vertices are included.

Notes about Prim's algorithm

- Proof by induction that this construction actually yields an MST (CLRS, Ch. 23.1); the main property is given below.
- Needs a priority queue for locating the closest fringe vertex. The detailed algorithm can be found in Levitin, p. 310.
- Efficiency: O(n^2) for the weight-matrix representation of the graph and an array implementation of the priority queue; O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue.

The Crucial Property behind Prim's Algorithm

Claim: Let G = (V, E) be a weighted graph and let (X, Y) be a partition of V (called a cut). Suppose e = (x, y) is an edge of E across the cut, where x is in X and y is in Y, and e has the minimum weight among all such crossing edges (called a light edge). Then there is an MST containing e.
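A minimal sketch of Prim's algorithm with a min-heap priority queue; the "lazy deletion" variant and the adjacency-list dictionary format are implementation choices of this sketch, not necessarily the textbook's pseudocode.

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm on an undirected weighted graph.

    graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
    Returns the list of MST edges (u, v, weight).
    O(m log n) with this min-heap implementation.
    """
    in_tree = {start}
    mst_edges = []
    # candidate crossing edges: (weight, tree_vertex, fringe_vertex)
    heap = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:                 # stale entry: v already joined the tree
            continue
        in_tree.add(v)                   # greedy step: closest fringe vertex joins the tree
        mst_edges.append((u, v, w))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges
```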

Another greedy algorithm for MST: Kruskal's

- Sort the edges in nondecreasing order of lengths.
- "Grow" the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, ..., Fn-1.
- On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)

Notes about Kruskal's algorithm

- The algorithm looks easier than Prim's but is harder to implement (checking for cycles!).
- Cycle checking: a cycle is created iff the added edge connects vertices in the same connected component.
- Union-find algorithms handle this check; see Section 9.2.
- Runs in O(m log m) time, with m = |E|. The time is mostly spent on sorting.
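A compact sketch of Kruskal's algorithm with the union-find cycle check mentioned above; union by size with path compression (halving) is one common choice, and the (weight, u, v) edge format is an assumption of this sketch.

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: n vertices labeled 0..n-1, edges given as (weight, u, v).

    Returns the list of MST edges; cycle checking uses union-find.
    """
    parent = list(range(n))
    size = [1] * n

    def find(x):                       # root of x's component, compressing the path as we go
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):                   # merge components; False if x and y already connected
        rx, ry = find(x), find(y)
        if rx == ry:
            return False
        if size[rx] < size[ry]:
            rx, ry = ry, rx
        parent[ry] = rx
        size[rx] += size[ry]
        return True

    mst = []
    for w, u, v in sorted(edges):      # O(m log m), dominated by the sort
        if union(u, v):                # the edge does not create a cycle
            mst.append((u, v, w))
            if len(mst) == n - 1:      # spanning tree complete
                break
    return mst
```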

Minimum spanning tree vs. Steiner tree

(Figure comparing an MST and a Steiner tree on four vertices a, b, c, d omitted.) In general, a Steiner minimal tree (SMT) can be much shorter than a minimum spanning tree (MST), because it may add extra junction points, but SMTs are hard to compute.

Shortest paths: Dijkstra's algorithm

Single-Source Shortest Paths Problem: given a weighted connected (directed) graph G, find the shortest paths from a source vertex s to each of the other vertices.

Dijkstra's algorithm is similar to Prim's MST algorithm, but with a different way of computing the numerical labels: among vertices not already in the tree, it finds the vertex u with the smallest sum dv + w(v, u), where
- v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree rooted at s),
- dv is the length of the shortest path from source s to v,
- w(v, u) is the length (weight) of the edge from v to u.

Example (graph figure omitted); each row lists the vertex added to the tree and the labels of the remaining vertices:

Tree vertices    Remaining vertices
a(-,0)           b(a,3)     c(-,∞)     d(a,7)     e(-,∞)
b(a,3)           c(b,3+4)   d(b,3+2)   e(-,∞)
d(b,5)           c(b,7)     e(d,5+4)
c(b,7)           e(d,9)
e(d,9)
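A minimal sketch of Dijkstra's algorithm with a min-heap priority queue (lazy deletion); the dictionary-based graph format and the names dist and prev are choices of this sketch.

```python
import heapq

def dijkstra(graph, s):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
    Returns (dist, prev): shortest-path lengths from s and predecessor links.
    O(E log V) with this min-heap implementation.
    """
    dist = {s: 0}
    prev = {s: None}
    heap = [(0, s)]
    finalized = set()
    while heap:
        d, v = heapq.heappop(heap)
        if v in finalized:              # stale heap entry
            continue
        finalized.add(v)                # d is now the final shortest distance to v
        for u, w in graph[v]:
            if u not in finalized and d + w < dist.get(u, float("inf")):
                dist[u] = d + w         # relax edge (v, u)
                prev[u] = v
                heapq.heappush(heap, (dist[u], u))
    return dist, prev
```

On the trace above, the vertices are settled in the order a, b, d, c, e with final distances 0, 3, 5, 7, 9.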

Notes on Dijkstra's algorithm

- Correctness can be proven by induction on the number of vertices. We prove two invariants: (i) when a vertex is added to the tree, its correct distance has been calculated, and (ii) that distance is at least as large as those of the previously added vertices.
- It does not work for graphs with negative weights (whereas Floyd's algorithm does, as long as there is no negative cycle). Can you find a counterexample for Dijkstra's algorithm?
- Applicable to both undirected and directed graphs.
- Efficiency: O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue; O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue.
- Don't mix up Dijkstra's algorithm with Prim's algorithm!
- More details of the algorithm are in the text and reference books.

Coding Problem

- Coding: assignment of bit strings to alphabet characters.
- Codewords: bit strings assigned to the characters of the alphabet.
- Two types of codes: fixed-length encoding (e.g., ASCII) and variable-length encoding (e.g., Morse code).
- Prefix-free codes (or prefix codes): no codeword is a prefix of another codeword. This allows efficient (online) decoding; e.g., consider the encoded string (msg) 10010110.
- E.g., we can code {a, b, c, d} as {00, 01, 10, 11} or {0, 10, 110, 111}; the set {0, 01, 10, 101} is not prefix-free, since 0 is a prefix of 01.
- Problem: if the frequencies of the character occurrences are known, what is the best binary prefix-free code? The one with the shortest average code length, which represents, on average, how many bits are required to transmit or store a character.
- E.g., if P(a) = 0.4, P(b) = 0.3, P(c) = 0.2, P(d) = 0.1, then the average length of the second code above is 1(0.4) + 2(0.3) + 3(0.2) + 3(0.1) = 1.9 bits.

Huffman codes

- Any binary tree with edges labeled with 0's and 1's yields a prefix-free code for the characters assigned to its leaves.
- An optimal binary tree minimizing the average codeword length can be constructed as follows.

Huffman's algorithm

- Initialize n one-node trees with the alphabet characters and set the tree weights to their frequencies.
- Repeat the following step n-1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees.
- Mark edges leading to left and right subtrees with 0's and 1's, respectively.
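A minimal sketch of Huffman's algorithm using a min-heap of weighted trees; the tuple-based tree representation and the tie-breaking counter are choices of this sketch, so the exact codewords may differ from the slides even though the average length is still optimal.

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Huffman's algorithm: freqs maps each character to its frequency (or probability).

    Repeatedly joins the two lowest-weight trees; returns a dict of codewords.
    """
    tiebreak = count()  # makes heap comparisons well defined when weights are equal
    # a tree is either a single character (leaf) or a (left, right) pair (internal node)
    heap = [(w, next(tiebreak), ch) for ch, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)          # two smallest-weight trees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tiebreak), (t1, t2)))

    codes = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):              # internal node: 0 to the left, 1 to the right
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"          # degenerate one-character alphabet
    _, _, root = heap[0]
    assign(root, "")
    return codes
```

For P(a) = 0.4, P(b) = 0.3, P(c) = 0.2, P(d) = 0.1 this produces codeword lengths 1, 2, 3, 3, matching the 1.9-bit average computed above.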

Example

character:   A     B     C     D     –
frequency:  0.35  0.1   0.2   0.2   0.15
codeword:    11   100   00    01    101

Average bits per character: 2.25
For fixed-length encoding: 3
Compression ratio: (3 - 2.25)/3 × 100% = 25%
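The figures above can be checked with the huffman_code sketch from the previous section; this snippet is illustrative only ("_" stands in for the dash/space character), and Huffman's algorithm may assign different but equally optimal codewords than the slide.

```python
freqs = {"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15}
codes = huffman_code(freqs)         # codeword lengths: A, C, D -> 2 bits; B, _ -> 3 bits
avg = sum(freqs[ch] * len(code) for ch, code in codes.items())
fixed = 3                           # a fixed-length code for 5 characters needs ceil(log2(5)) = 3 bits
print(avg)                          # 2.25 (up to floating-point rounding)
print((fixed - avg) / fixed * 100)  # 25.0 (compression ratio in percent)
```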
