Week 7: Dynamic Programming and Greedy Techniques

This week in Design and Analysis of Algorithms we took a look at two sorting algorithms, counting sort and radix sort, and unraveled dynamic programming by analyzing two coin-related problems: the robot coin collection problem and the coin row problem. In addition, we explored transitive closure with Warshall's algorithm and found the shortest paths between all pairs of vertices with Floyd's algorithm, which shares the same structure as Warshall's. Lastly, we looked at greedy techniques by finding minimum spanning trees with Prim's algorithm.


The sorting algorithms this week were quite interesting because they don't use comparisons like the ones we learned in previous weeks. Counting sort instead builds a table to sort an array of integers: a frequency distribution table covering the range of the input values, the frequency (the number of times each value appears), and the distribution (the accumulated sums of the frequencies), which we use to direct each value to its correct position in the output. The time complexity of counting sort is O(n + k): it scans the n input values twice (once to count them, once to place them) and passes over the k possible values twice to build the frequency and distribution tables. The other non-comparison-based sorting algorithm is radix sort. It actually uses counting sort, but applies it digit by digit, starting from the least significant digit of each value in the array. As mentioned in the lecture, we create a table with the range 0 to 9, perform a counting sort keyed on the least significant digit, and build the output in the same manner. We then move on to the next digit (treating a missing digit as 0) and repeat until we have processed as many digits as the largest number has, which leaves the array sorted. Overall I thought these were interesting and the exercises were quite memorable.
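The two sorts described above can be sketched in Python. This is just my sketch, assuming non-negative integer input and 0-based lists (the lecture's pseudocode may differ in details); the `key` parameter is an addition of mine so radix sort can reuse counting sort on a single digit.

```python
def counting_sort(arr, key=lambda x: x):
    """Counting sort sketch: frequency table, then accumulated sums for positions."""
    k = max(key(x) for x in arr) + 1           # range of key values: 0..k-1
    count = [0] * k
    for x in arr:                              # frequency of each key value
        count[key(x)] += 1
    for i in range(1, k):                      # distribution: accumulated sums
        count[i] += count[i - 1]
    out = [None] * len(arr)
    for x in reversed(arr):                    # place each value at its slot (stable)
        count[key(x)] -= 1
        out[count[key(x)]] = x
    return out

def radix_sort(arr):
    """Apply counting sort per digit, least significant digit first."""
    digits = len(str(max(arr)))                # number of digits in the largest value
    for d in range(digits):
        arr = counting_sort(arr, key=lambda x: (x // 10 ** d) % 10)
    return arr
```

The backward pass in `counting_sort` keeps equal keys in their original order, which is what makes radix sort come out correct when later (more significant) digits tie.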


In addition to the non-comparison sorts, we checked out dynamic programming and analyzed two coin-related problems. The lecture describes dynamic programming as a technique for solving problems with overlapping subproblems: first set up a recurrence relation in terms of smaller subproblems, then solve those subproblems while recording the solutions in a table, and finally solve the overall problem using that table. We analyzed this technique through the robot coin collection problem and the coin row problem. The robot coin collection problem involves finding the maximum number of coins that can be collected on the way to each cell and storing the results in a matrix, starting at row 1 and moving left to right until reaching the bottom-right cell. Off the top of my head, the way to count the coins is: for the current cell, look at the cell to its left and the cell above it, take whichever has the larger coin count, add the coin in the current cell if there is one, and record the total in the current cell. This repeats until we hit the last cell. From there we can recover the optimal path by backtracking from that last cell, moving left or up toward whichever neighbor has the larger count. I think the most confusing part at first was that the matrix starts at (1, 1) rather than the usual 0-based indexing; it took me a bit to get used to it.
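The fill-in step above can be sketched as follows. This is my own sketch: I pad the table with an extra zero row and column so the left/top lookups work without special cases, which stands in for the lecture's 1-based (1, 1) indexing.

```python
def robot_coin_collection(C):
    """Max coins collectible moving only right or down on board C (list of rows)."""
    n, m = len(C), len(C[0])
    # F has an extra zero row and column so F[i-1][j] and F[i][j-1] always exist.
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Best of coming from above or from the left, plus this cell's coin.
            F[i][j] = max(F[i - 1][j], F[i][j - 1]) + C[i - 1][j - 1]
    return F[n][m]
```

Backtracking for the optimal path would start at `F[n][m]` and repeatedly step to whichever of the top/left neighbors holds the larger value, as described above.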


The coin row problem was a bit easier to understand. It involves building a table whose indices correspond to the coins and applying a recurrence to determine the maximum amount of money we can pick up, with the constraint that no two adjacent coins can both be taken. It starts with the base cases F(0) = 0 and F(1) = the value of the first coin. At F(2), for example, we take the maximum of two options: the current coin's value plus the result from two steps back, i.e. the coin at index 2 plus F(0), versus the result of the previous step, F(1). In general, F(n) = max(c_n + F(n-2), F(n-1)), and we apply this until we reach the last index. Personally I thought the exercises for this and the other coin problem were fun to do.
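The recurrence F(n) = max(c_n + F(n-2), F(n-1)) translates almost directly into code; here is a minimal sketch under that recurrence, with 0-based list indexing.

```python
def coin_row(coins):
    """Max sum of coins picked from a row with no two adjacent coins taken."""
    f = [0] * (len(coins) + 1)                 # f[0] = 0 is the base case
    for n in range(1, len(coins) + 1):
        # Take coin n (skip its neighbor) or leave it and keep f[n-1].
        take = coins[n - 1] + (f[n - 2] if n >= 2 else 0)
        f[n] = max(take, f[n - 1])
    return f[len(coins)]
```

For the row 5, 1, 2, 10, 6, 2 this picks 5, 10, and 2 for a total of 17, since taking 10 forces skipping both 2 and 6.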


Lastly, I'll just briefly discuss my learning of the Floyd-Warshall algorithm and finding a minimum spanning tree with Prim's. We used the Floyd-Warshall algorithm in our homework; it finds the shortest paths between all pairs of vertices in a weighted graph. It works by considering each vertex in turn as an intermediate point between every pair of vertices: for each pair, if going through that vertex gives a shorter path than the current one, the distance is updated, and once every vertex has been tried as an intermediate we have the shortest distances between every pair. Prim's algorithm builds a minimum spanning tree step by step: starting at a vertex, it repeatedly adds the smallest edge that connects a vertex already in the tree to a vertex outside it (the remaining vertices). In the lectures and exercises this involved keeping a table to track the vertices in the tree and the remaining vertices. This algorithm is an example of a greedy technique because each step makes the best immediate choice, for example picking the smallest available edge, without reconsidering previous choices.


I can't believe that this class is almost over and the final is right around the corner. I feel like the class should have been divided into two parts with the amount of algorithms we discussed within the short period of time we had. It was actually a fun class and I definitely gained a lot of knowledge from it. Here's to another great week and a fulfilling end to the semester.
