In computer science, dynamic programming is one of the major paradigms of algorithm design. It is a class of solution methods for sequential decision problems with a compositional cost structure.
It refers to a problem-solving approach in which we precompute and store the solutions to simpler, similar subproblems in order to build up the solution to a more complex problem. Multistage optimization problems are typically solved by dynamic programming, and it is used in a wide variety of applications. In this paper, we present different implementations of six problems developed as dynamic programming solutions. The first problem is computing the Fibonacci sequence, which is characterized by the fact that every number after the first two is the sum of the two preceding ones; a minimal sketch of a dynamic programming solution is given at the end of this section.
The second problem is the longest common subsequence, a classic computer science problem and the basis of file comparison programs that output the differences between two files [8]. The third is Warshall's algorithm for the transitive closure of a directed graph. The fourth is Floyd's algorithm for all-pairs shortest paths, which finds the shortest paths in a graph whose edges may have negative costs, provided the graph contains no negative cycle. The fifth is constructing an optimal binary search tree, that is, a binary search tree whose nodes are arranged on levels such that the total tree cost is minimal. The sixth is matrix-chain multiplication, which determines the most efficient order in which to multiply a sequence of matrices; performing the multiplications themselves is not the issue, but deciding in which order to perform them is. We also discuss measuring the time complexity of each solution by counting the number of operations performed by the dynamic programming approach and by the plain recursive approach: the approach that performs fewer operations is the more effective one. A sketch of this counting is also given below.
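To make the idea of precomputing and storing subproblem solutions concrete, the following is a minimal sketch of a bottom-up dynamic programming solution for the Fibonacci problem. It is written in Python purely for illustration; the paper's own implementations, language, and naming (for example, the function name fib_dp and the table variable) are assumptions of this sketch, not the authors' code.

def fib_dp(n: int) -> int:
    # Bottom-up dynamic programming: each table entry is built from the
    # two previously stored subproblem solutions, so every subproblem
    # is solved exactly once.
    if n < 2:
        return n
    table = [0] * (n + 1)   # table[i] holds the i-th Fibonacci number
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_dp(10))  # prints 55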
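The operation counts described above might be gathered as in the following sketch, which compares the number of additions performed by a plain recursive Fibonacci with the number performed by a dynamic programming version. The counter mechanism and function names are illustrative assumptions, not the paper's actual instrumentation.

def fib_recursive(n: int, counter: list) -> int:
    # Plain recursion recomputes the same subproblems many times,
    # so the number of additions grows exponentially with n.
    if n < 2:
        return n
    counter[0] += 1  # count one addition per internal call
    return fib_recursive(n - 1, counter) + fib_recursive(n - 2, counter)

def fib_dynamic(n: int, counter: list) -> int:
    # Dynamic programming computes each subproblem once,
    # so it performs only n - 1 additions for n >= 2.
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        counter[0] += 1  # count one addition per stored subproblem
        prev, curr = curr, prev + curr
    return curr

rec_ops, dp_ops = [0], [0]
fib_recursive(25, rec_ops)
fib_dynamic(25, dp_ops)
print(rec_ops[0], dp_ops[0])  # 121392 vs. 24 additions for n = 25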