Greedy Algorithms (Part - 1)

Greedy Design Technique
• Greedy algorithms use the heuristic of making the locally optimal choice at each stage of problem solving, with the hope of finding a globally optimal solution.
• An optimization problem can be solved using the greedy approach if it has the following two properties.
  ° Greedy-choice property: A global optimum can be arrived at by selecting a local optimum at each step.
  ° Optimal substructure: An optimal solution to the problem contains optimal solutions to its subproblems.

Greedy Algorithms
• Single source shortest path (Dijkstra's algorithm)
• Minimum spanning tree (Kruskal's algorithm & Prim's algorithm)
• Huffman coding (optimal prefix code)
• Fractional knapsack problem
• Job sequencing with deadline

Huffman Coding
• Storage space for files can be saved by compressing them, where each symbol is replaced by a unique binary string (codeword).
• Codewords can differ in length.
• The code must be prefix-free, i.e., no codeword is a prefix of another codeword.
• Huffman coding is a greedy method for obtaining an optimal prefix-free binary code.

Example: Find the average word length for the following letters with the given frequencies of use: E - 40%; A - 30%; M - 20%; N - 5%; T - 5%.

Solution: The following Huffman code can be constructed from the given frequencies, by repeatedly merging the two lowest-frequency nodes (0.05 + 0.05 = 0.1, then 0.1 + 0.2 = 0.3, then 0.3 + 0.3 = 0.6, then 0.6 + 0.4 = 1.0).

Letter   Code   Length   Frequency   Length x Frequency
E        1      1        0.40        0.4
A        00     2        0.30        0.6
M        010    3        0.20        0.6
N        0110   4        0.05        0.2
T        0111   4        0.05        0.2
Total                    1.00        2.0

Average word length is 2.

Time Complexity of Huffman Encoding:
Time complexity, T(n) = (sort the symbols by frequency) + (pick them one by one and merge)
T(n) = O(n log n) + O(n)
Hence T(n) = O(n log n).
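To make the greedy merging step concrete, here is a minimal Python sketch using the standard library's heapq. The frequencies are the ones from the example above; the function name build_huffman_codes and the dictionary-based representation are illustrative choices, not part of the original notes, and the exact 0/1 labels of the resulting code may differ from the table while the codeword lengths match.

```python
import heapq

def build_huffman_codes(frequencies):
    """Greedy Huffman coding: repeatedly merge the two least-frequent nodes.

    frequencies: dict mapping symbol -> frequency (or probability).
    Returns a dict mapping symbol -> prefix-free binary codeword.
    """
    # Each heap entry: (frequency, tie-breaker, {symbol: partial code}).
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    counter = len(heap)  # tie-breaker so the dicts are never compared

    if len(heap) == 1:   # single-symbol edge case: give it a 1-bit code
        _, _, codes = heap[0]
        return {sym: "0" for sym in codes}

    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # least frequent subtree
        f2, _, c2 = heapq.heappop(heap)   # second least frequent subtree
        # Prepend a bit: '0' for one subtree, '1' for the other.
        merged = {sym: "0" + code for sym, code in c1.items()}
        merged.update({sym: "1" + code for sym, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1

    return heap[0][2]

# Frequencies from the worked example above.
freqs = {"E": 0.40, "A": 0.30, "M": 0.20, "N": 0.05, "T": 0.05}
codes = build_huffman_codes(freqs)
avg_len = sum(freqs[s] * len(codes[s]) for s in freqs)
print(codes)     # codeword lengths: E=1, A=2, M=3, N=4, T=4
print(avg_len)   # 2.0, the average word length computed above
```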
Fractional Knapsack

Given the weights and values of n items, we need to put these items into a knapsack of capacity W so as to obtain the maximum total value in the knapsack. There are two variants of the knapsack problem:
• 0/1 Knapsack: We are not allowed to break items; we either take an item in full or leave it.
• Fractional Knapsack: In order to maximize the value, we are allowed to break items and take fractions of them.

The greedy algorithm solves the fractional knapsack problem, but it can give a wrong answer for the 0/1 knapsack, so the 0/1 knapsack problem is treated later under dynamic programming.

Example: Consider a knapsack with capacity = 20, where fractions are allowed, and three items I1, I2 and I3 with the following profits and weights. Find the maximum profit possible using the fractional knapsack approach.

Item     I1   I2   I3
Profit   25   18   8
Weight   18   10   6

Step 1: Sort the items according to their profit per unit weight.
Step 2: Pick them up in that order, breaking an item into a fraction if its weight exceeds the remaining knapsack capacity.
Step 3: Keep picking items until the knapsack is full.

Item            I1             I2            I3
Profit          25             18            8
Weight          18             10            6
Profit/Weight   25/18 ≈ 1.39   18/10 = 1.8   8/6 ≈ 1.33

As item I2 has the maximum profit per unit weight, pick it first.
Remaining capacity = 20 - 10 = 10, profit so far = 18.
The next best item is I1, but it weighs more than the remaining capacity of the knapsack, so we take a fraction of it: pick weight = 10 out of I1's 18, giving profit (10/18) x 25 ≈ 13.89.
The knapsack is now full.
Total profit = 18 + (10/18) x 25 ≈ 31.89

Time Complexity of Fractional Knapsack:
Time complexity, T(n) = sorting by profit per unit weight + picking up the items
T(n) = O(n log n) + O(n)
Hence T(n) = O(n log n).
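The three steps above translate directly into a short routine. Below is a minimal Python sketch; the function name fractional_knapsack and the (profit, weight) tuple representation are illustrative assumptions, not part of the original notes.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (profit, weight) pairs.
    capacity: knapsack capacity W.
    Returns the maximum achievable total profit.
    """
    # Step 1: sort by profit per unit weight, highest ratio first.
    items = sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True)

    total_profit = 0.0
    remaining = capacity
    for profit, weight in items:
        if remaining <= 0:
            break
        if weight <= remaining:
            # The whole item fits: take all of it.
            total_profit += profit
            remaining -= weight
        else:
            # Only a fraction fits: take that fraction and the knapsack is full.
            total_profit += profit * (remaining / weight)
            remaining = 0
    return total_profit

# Items I1, I2, I3 from the example above, as (profit, weight) pairs.
print(fractional_knapsack([(25, 18), (18, 10), (8, 6)], capacity=20))  # ≈ 31.89
```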
Job Sequencing with deadline

Input: An array of jobs where every job has a deadline and an associated profit, which is earned only if the job is finished by its deadline. Every job takes a single unit of time, so the minimum possible deadline for any job is 1.
Output: A schedule that maximizes the total profit, given that only one job can be executed at a time.
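Before the worked examples, here is a minimal Python sketch of the greedy strategy used below: consider jobs in decreasing order of profit and place each one in the latest free time slot on or before its deadline. The function name job_sequencing and the (name, profit, deadline) tuple format are illustrative assumptions; the free slot is found by a simple linear scan, matching the O(n²) analysis at the end of this section. The examples that follow trace the same procedure by hand.

```python
def job_sequencing(jobs):
    """Greedy job sequencing with deadlines.

    jobs: list of (name, profit, deadline) tuples; each job takes 1 unit of time.
    Returns (schedule, total_profit), where schedule[i] is the job run in
    time slot i+1 (None if the slot stays empty).
    """
    # Consider the most profitable jobs first.
    jobs = sorted(jobs, key=lambda j: j[1], reverse=True)

    max_deadline = max(d for _, _, d in jobs)
    schedule = [None] * max_deadline       # index i = time slot i+1
    total_profit = 0

    for name, profit, deadline in jobs:
        # Linear search for the latest free slot on or before the deadline.
        for slot in range(min(deadline, max_deadline) - 1, -1, -1):
            if schedule[slot] is None:
                schedule[slot] = name
                total_profit += profit
                break
    return schedule, total_profit

# Jobs J1..J7 from the second example below.
jobs = [("J1", 25, 5), ("J2", 45, 3), ("J3", 50, 4), ("J4", 15, 2),
        ("J5", 75, 1), ("J6", 120, 3), ("J7", 55, 2)]
print(job_sequencing(jobs))  # (['J5', 'J7', 'J6', 'J3', 'J1'], 325)
```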
Example: Find the maximum possible profit for completing the jobs within their respective deadlines.

Jobs       J1    J2    J3    J4
Profit     100   200   150   250
Deadline   2     1     2     1

Some possible solutions are:
(J2, J3) -> profit = 350
(J2, J1) -> profit = 300
(J4, J3) -> profit = 400  <-- OPTIMAL
(J4, J1) -> profit = 350
and so on. Many schedules are possible, but the optimal one is the schedule with maximum profit.

ALGORITHM:
• If required, sort the jobs in decreasing order of profit.
• For each job, try to fill the time slots from back to front, starting at the job's deadline.
• To find the exact free slot, use a searching algorithm (a linear search is used here).

Example: Find the maximum possible profit for completing the jobs within their respective deadlines.

Jobs       J1   J2   J3   J4   J5   J6   J7
Profit     25   45   50   15   75   120  55
Deadline   5    3    4    2    1    3    2

Solution:

Slot   1    2    3    4    5
Job    J5   J7   J6   J3   J1

Total profit = 75 + 55 + 120 + 50 + 25 = 325

Time Complexity of Job Sequencing:
To select the right slot for a job, a linear search is performed for every job.
Time to place one job = O(n)
Time to place n jobs = O(n²)

So that is all about the first three applications of the greedy technique. The next topic will be spanning trees.