CST 370 - Week 7 Learning Journal
Dynamic Programming:
Dynamic programming is a technique used to solve problems by breaking them into smaller overlapping subproblems and storing their solutions to avoid repeated work. It relies on two main ideas: overlapping subproblems and optimal substructure. I am trying to focus more on identifying when a problem should use dynamic programming instead of other techniques like divide and conquer.
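The classic illustration of overlapping subproblems is the Fibonacci recursion: the naive version recomputes the same values exponentially many times, while memoization stores each answer so every subproblem is solved once. A minimal Python sketch (function names are my own, not from any assigned code):

```python
from functools import lru_cache

# Naive recursion: fib_naive(n-1) and fib_naive(n-2) both recompute
# the same smaller subproblems, giving exponential running time.
def fib_naive(n):
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

# Memoized version: lru_cache stores each result, so each value of n
# is computed exactly once -- linear time instead of exponential.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(40))  # 102334155
```

A quick rule of thumb: if the recursion tree repeats subproblems (overlapping subproblems) and an optimal answer is built from optimal sub-answers (optimal substructure), dynamic programming applies; if the subproblems are disjoint, plain divide and conquer is enough.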
Warshall's Algorithm:
Warshall’s algorithm computes the transitive closure of a directed graph: for every ordered pair of vertices, it determines whether the second is reachable from the first. It works by systematically updating a boolean matrix representation of the graph, checking whether a path exists through intermediate vertices. I find it confusing how the matrix updates guarantee that all indirect paths are eventually captured, especially when multiple intermediate nodes are involved.
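What resolved some of my confusion is the invariant: after the pass for intermediate vertex k, entry [i][j] is 1 exactly when there is a path from i to j using only vertices 0..k as intermediates, so by the last pass every indirect path has been captured. A small sketch (my own function name, 0/1 adjacency matrix assumed):

```python
def warshall(adj):
    """Transitive closure of a digraph given as a 0/1 adjacency matrix."""
    n = len(adj)
    r = [row[:] for row in adj]  # copy so the input is not modified
    for k in range(n):           # now allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # i reaches j if it already did, or if i reaches k and k reaches j
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# Chain 0 -> 1 -> 2: the closure adds the indirect edge 0 -> 2.
print(warshall([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]]))  # [[0, 1, 1], [0, 0, 1], [0, 0, 0]]
```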
Floyd's Algorithm:
Floyd’s algorithm is used to find the shortest paths between all pairs of vertices in a weighted graph. It improves paths step by step by considering intermediate vertices, updating distances whenever a shorter path is found.
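Floyd’s algorithm has the same triple-loop shape as Warshall’s, but the boolean OR/AND becomes a numeric minimum over path lengths. A sketch under my own naming, using infinity for missing edges:

```python
INF = float('inf')  # weight for "no direct edge"

def floyd(w):
    """All-pairs shortest path distances from a weight matrix w."""
    n = len(w)
    d = [row[:] for row in w]  # copy so the input is not modified
    for k in range(n):         # now allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # keep the shorter of: current path, or detour through k
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

For example, with edges 0→1 (weight 3), 1→2 (weight 1), and 2→0 (weight 1), the algorithm finds the indirect distance from 0 to 2 as 3 + 1 = 4 even though there is no direct edge.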
Greedy Technique and Prim's Algorithm:
The greedy technique builds a solution step by step by always choosing the locally optimal option. Prim’s algorithm is a greedy method for finding a minimum spanning tree (MST): it starts with a single vertex and repeatedly adds the smallest-weight edge that connects the tree to a new vertex. I need to better understand why greedy choices always work for Prim's algorithm but can fail for other problems; that is, what makes a problem suitable for a greedy approach.