
MODULE 4:
DYNAMIC PROGRAMMING
Dynamic Programming:

 Dynamic Programming:
 Transitive Closure: Warshall’s Algorithm,
 All Pairs Shortest Paths: Floyd's Algorithm,
 Optimal Binary Search Trees,
 Knapsack problem,
 Bellman-Ford Algorithm,
 Travelling Salesman Problem
Dynamic Programming:

• Dynamic programming is an algorithm design technique.

• Dynamic programming is a technique for solving problems with overlapping subproblems.
  – Subproblems arise from a recurrence relating a given problem's solution to solutions of its smaller subproblems.
  – It suggests solving each of the smaller subproblems only once and recording the results in a table from which a solution to the original problem can then be obtained.
• Dynamic programming can be used when the solution to a problem can be viewed as the result of a sequence of decisions.
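
To make the idea of recording results in a table concrete, here is a minimal sketch (Fibonacci numbers, a standard illustration that is not from the slides):

```python
from functools import lru_cache

@lru_cache(maxsize=None)        # the cache is the table of recorded results
def fib(n):
    """F(n) depends on the two smaller subproblems F(n-1) and F(n-2),
    which overlap heavily; memoization solves each one only once."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```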
Principle of Optimality

• Principle of optimality: an optimal solution to any instance of an optimization problem is composed of optimal solutions to its subinstances.
• Difference between greedy and DP: a greedy algorithm commits to one locally optimal choice at each step and never reconsiders it, while dynamic programming examines all feasible choices via a recurrence and keeps the best one.
All Pairs Shortest Paths

• Problem definition: Given a weighted connected graph (undirected or directed), the all-pairs shortest paths problem asks to find the distances, i.e., the lengths of the shortest paths, from each vertex to all other vertices.

• Applications:
  – Communications, transportation networks, and operations research.
  – Pre-computing distances for motion planning in computer games.
All Pairs Shortest Paths: Floyd's Algorithm

• Floyd's algorithm computes the distance matrix of a weighted graph with n vertices through a series of n-by-n matrices:
  D(0), . . . , D(k−1), D(k), . . . , D(n)
• D(k)[i, j] is the length of the shortest path from the ith vertex to the jth vertex with each intermediate vertex numbered not higher than k, which yields the recurrence
  D(k)[i, j] = min { D(k−1)[i, j], D(k−1)[i, k] + D(k−1)[k, j] } for k ≥ 1, with D(0) equal to the weight matrix.
Analysis

• Its time efficiency is Θ(n³), from the three nested loops over the n vertices.
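
A minimal Python sketch of Floyd's algorithm, assuming the input graph is given as an n-by-n weight matrix with float('inf') marking absent edges:

```python
INF = float("inf")

def floyd(weights):
    """All-pairs shortest distances.
    weights[i][j]: edge weight from i to j, INF if no edge, 0 if i == j."""
    n = len(weights)
    dist = [row[:] for row in weights]      # D(0) is the weight matrix
    for k in range(n):                      # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # keep the better of "avoid k" and "go through k"
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```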
Warshall's Algorithm

• Warshall's algorithm computes the transitive closure of a directed graph.
• The adjacency matrix A = {aij} of a directed graph is the boolean matrix that has 1 in its ith row and jth column if and only if there is a directed edge from the ith vertex to the jth vertex.
• We may also be interested in a matrix containing the information about the existence of directed paths of arbitrary lengths between the vertices of a given graph.
Transitive Closure: Warshall's Algorithm

• Definition: The transitive closure of a directed graph with n vertices can be defined as the n x n boolean matrix T = {tij}, in which the element in the ith row and the jth column is 1 if there exists a nontrivial path (i.e., a directed path of positive length) from the ith vertex to the jth vertex; otherwise, tij is 0.
Transitive Closure

• We can generate the transitive closure of a digraph with the help of depth-first search or breadth-first search.
• Since this method traverses the same digraph several times, we can use a better algorithm, called Warshall's algorithm.
• Warshall's algorithm constructs the transitive closure through a series of n × n boolean matrices:
  R(0), . . . , R(k−1), R(k), . . . , R(n)
Warshall's Algorithm

• The element in the ith row and jth column of matrix R(k) is equal to 1 if and only if
  – there exists a directed path of a positive length from the ith vertex to the jth vertex with each intermediate vertex, if any, numbered not higher than k.
• That is, there exists a path from the ith vertex vi to the jth vertex vj of the form
  vi, [a list of intermediate vertices each numbered not higher than k], vj.
• This yields the recurrence
  R(k)[i, j] = R(k−1)[i, j] or ( R(k−1)[i, k] and R(k−1)[k, j] ), with R(0) = A, the adjacency matrix.
Algorithm
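A minimal Python sketch of Warshall's algorithm (an illustration, not the textbook pseudocode), assuming the digraph is given as an n-by-n boolean adjacency matrix:

```python
def warshall(adj):
    """Transitive closure of a digraph by Warshall's algorithm."""
    n = len(adj)
    r = [row[:] for row in adj]             # R(0) is the adjacency matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # a path i -> j may now pass through vertex k
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r
```

Treating each matrix row as a machine-word bit string and replacing the innermost loop with a bitwise or is the speedup mentioned in the analysis below.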

• Its time efficiency is Θ(n³).
• We can make the algorithm run faster by treating matrix rows as bit strings and employing the bitwise or operation available in most modern computer languages.
• Space efficiency:
  – Separate matrices for recording intermediate results of the algorithm can be avoided; the algorithm works correctly on a single matrix updated in place.
Travelling Salesman Problem

• Problem Statement
  – The travelling salesman problem consists of a salesman and a set of cities.
  – The salesman has to visit each one of the cities exactly once, starting from a certain one (e.g. the hometown) and returning to the same city.
  – The challenge of the problem is that the travelling salesman wants to minimize the total length of the trip.
  – It is assumed that all cities are connected to all other cities.
Travelling Salesman Problem

• Naive Solution:
  1. Consider city 1 as the starting and ending point.
  2. Generate all (n − 1)! permutations of the remaining cities.
  3. Calculate the cost of every permutation and keep track of the minimum-cost permutation.
  4. Return the permutation with minimum cost.

• Time Complexity: Θ(n!), since there are (n − 1)! permutations and evaluating each costs Θ(n).
TSP using DP

• Assume the tour to be a simple path that starts and ends at vertex 1.
• Every tour consists of
  – an edge <1, k> for some k ∈ V − {1}, and
  – a path from vertex k to vertex 1.
• The path from k to 1 goes through each vertex in V − {1, k} exactly once.
• If the tour is optimal, then the path from k to 1 must be a shortest path going through all vertices in V − {1, k}.
• Let g(i, S) be the length of a shortest path starting at vertex i, going through all vertices in S, and terminating at vertex 1.
• g(1, V − {1}) is the length of an optimal salesperson tour.
• The cost of a tour is the sum of the costs of the edges on the tour. The travelling salesperson problem is to find a tour of minimum cost.
TSP using DP

• From the principle of optimality:
  g(i, S) = min over j ∈ S of { cij + g(j, S − {j}) }, with g(i, ∅) = ci1.
• For the instance with vertex set {1, 2, 3, 4, 5}:
  g(1, {2,3,4,5}) = min { k = 2: c12 + g(2, {3,4,5}),
                          k = 3: c13 + g(3, {2,4,5}),
                          k = 4: c14 + g(4, {2,3,5}),
                          k = 5: c15 + g(5, {2,3,4}) }

Successor of node 1: p(1, {2,3,4}) = 2
Successor of node 2: p(2, {3,4}) = 4
Successor of node 4: p(4, {3}) = 3

Optimal tour: 1 → 2 → 4 → 3 → 1
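
A Python sketch of this scheme, known as the Held-Karp algorithm. It is an illustration rather than the slides' own pseudocode: vertices are 0-based with vertex 0 as the start, and cost is an assumed n-by-n distance matrix.

```python
from itertools import combinations

def tsp(cost):
    """Held-Karp DP for TSP: g(i, S) = min over j in S of
    cost[i][j] + g(j, S - {j}), with g(i, {}) = cost[i][0]."""
    n = len(cost)
    g = {}
    for i in range(1, n):
        g[(frozenset(), i)] = cost[i][0]          # base case: go straight home
    for size in range(1, n - 1):                  # grow the set S one vertex at a time
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for i in range(1, n):
                if i not in S:
                    g[(S, i)] = min(cost[i][j] + g[(S - {j}, j)] for j in S)
    full = frozenset(range(1, n))
    # an optimal tour is an edge (0, k) followed by a best path from k back to 0
    return min(cost[0][k] + g[(full - {k}, k)] for k in range(1, n))
```

This runs in O(n² · 2ⁿ) time and O(n · 2ⁿ) space: still exponential, but far better than the Θ(n!) of exhaustive search.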
Knapsack Problem using DP

• Given n items of known weights w1, . . . , wn and values v1, . . . , vn and a knapsack of capacity W, find the most valuable subset of the items that fits into the knapsack.
• To design a DP algorithm, we need to derive a recurrence relation that expresses a solution in terms of its smaller subinstances.
• Let us consider an instance defined by the first i items, 1 ≤ i ≤ n, with weights w1, . . . , wi, values v1, . . . , vi, and knapsack capacity j, 1 ≤ j ≤ W.
• Let F(i, j) be the value of an optimal solution to this instance.
• We can divide all the subsets of the first i items that fit into the knapsack of capacity j into two categories:
  – those that do not include the ith item, and
  – those that do.
• Among the subsets that do not include the ith item, the value of an optimal subset is, by definition, F(i, j) = F(i − 1, j).
• Among the subsets that do include the ith item (hence, j − wi ≥ 0),
  – an optimal subset is made up of this item and
  – an optimal subset of the first i − 1 items that fits into the knapsack of capacity j − wi.
  – The value of such an optimal subset is F(i, j) = vi + F(i − 1, j − wi).
• Thus, the value of an optimal solution among all feasible subsets of the first i items is the maximum of these two values:
  F(i, j) = max { F(i − 1, j), vi + F(i − 1, j − wi) }   if j − wi ≥ 0
  F(i, j) = F(i − 1, j)                                  if j − wi < 0

• It is convenient to define the initial conditions as follows: F(0, j) = 0 for j ≥ 0 and F(i, 0) = 0 for i ≥ 0.
• Our goal is to find F(n, W), the maximal value of a subset of the n given items that fits into the knapsack of capacity W, and an optimal subset itself.
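
A bottom-up Python sketch of this recurrence, together with a helper that backtraces the table to recover the optimal subset itself (an illustration, not the slides' pseudocode):

```python
def knapsack(weights, values, W):
    """F[i][j] = best value using the first i items with capacity j."""
    n = len(weights)
    F = [[0] * (W + 1) for _ in range(n + 1)]     # F(0, j) = F(i, 0) = 0
    for i in range(1, n + 1):
        for j in range(1, W + 1):
            F[i][j] = F[i - 1][j]                 # category 1: skip item i
            if j >= weights[i - 1]:               # category 2: item i fits
                F[i][j] = max(F[i][j],
                              values[i - 1] + F[i - 1][j - weights[i - 1]])
    return F

def backtrace(F, weights, W):
    """Recover the composition of an optimal subset from the table."""
    items, j = [], W
    for i in range(len(weights), 0, -1):
        if F[i][j] != F[i - 1][j]:                # item i must be in the subset
            items.append(i)                       # 1-based item number
            j -= weights[i - 1]
    return sorted(items)
```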
Example-1

• Find the composition of an optimal subset by backtracing the computations: starting from F(n, W), if F(i, j) ≠ F(i − 1, j) then item i belongs to an optimal subset, and we continue from F(i − 1, j − wi); otherwise we continue from F(i − 1, j).
Single-Source Shortest Paths with Negative Weights

• Problem definition:
  – Single-source shortest paths: given a graph and a source vertex s in the graph, find the shortest paths from s to all vertices in the given graph.
  – The graph may contain negative-weight edges.

• Dijkstra's algorithm (O(E log V) with a binary heap) does not work for graphs with negative-weight edges.
• Bellman-Ford works in O(VE) time.
Bellman-Ford Algorithm

• Like other dynamic programming algorithms, it calculates shortest paths in a bottom-up manner.
• It first calculates the shortest distances for the shortest paths which have at most one edge in the path.
• Then it calculates shortest paths with at most 2 edges, and so on: iteration k finds all shortest paths that use at most k edges.
• The underlying recurrence: dist^k(v) = min { dist^(k−1)(v), min over all edges (u, v) of { dist^(k−1)(u) + cost(u, v) } }, where dist^k(v) is the length of a shortest path from the source to v using at most k edges.

[Example digraph on vertices 1 to 7 with source 1; the edge costs used in the computation below are cost(1,2) = 6, cost(1,3) = 5, cost(1,4) = 5, cost(3,2) = −2, cost(4,3) = −2, cost(2,5) = −1, cost(3,5) = 1, cost(4,6) = −1, cost(5,7) = 3, cost(6,7) = 3.]

k = 1: paths with at most one edge (the direct edges from the source).

dist1(1) = 0
dist1(2) = 6
dist1(3) = 5
dist1(4) = 5
dist1(5) = ∞
dist1(6) = ∞
dist1(7) = ∞
k = 2: for every vertex (except the source), consider each incoming edge.

dist2(1) = 0
dist2(2) = min { dist1(2), dist1(3) + cost(3,2) } = min { 6, 5 − 2 } = 3
dist2(3) = min { dist1(3), dist1(4) + cost(4,3) } = min { 5, 5 − 2 } = 3
dist2(4) = 5
dist2(5) = min { dist1(5), dist1(2) + cost(2,5), dist1(3) + cost(3,5) } = min { ∞, 6 − 1, 5 + 1 } = 5
dist2(6) = min { dist1(6), dist1(4) + cost(4,6) } = min { ∞, 5 − 1 } = 4
dist2(7) = min { dist1(7), dist1(5) + cost(5,7), dist1(6) + cost(6,7) } = min { ∞, ∞ + 3, ∞ + 3 } = ∞
k = 3: for every vertex (except the source), consider each incoming edge.

dist3(1) = 0
dist3(2) = min { dist2(2), dist2(3) + cost(3,2) } = min { 3, 3 − 2 } = 1
dist3(3) = min { dist2(3), dist2(4) + cost(4,3) } = min { 3, 5 − 2 } = 3
dist3(4) = 5
dist3(5) = min { dist2(5), dist2(2) + cost(2,5), dist2(3) + cost(3,5) } = min { 5, 3 − 1, 3 + 1 } = 2
dist3(6) = min { dist2(6), dist2(4) + cost(4,6) } = min { 4, 5 − 1 } = 4
dist3(7) = min { dist2(7), dist2(5) + cost(5,7), dist2(6) + cost(6,7) } = min { ∞, 5 + 3, 4 + 3 } = 7
k = 4: for every vertex (except the source), consider each incoming edge.

dist4(1) = 0
dist4(2) = min { dist3(2), dist3(3) + cost(3,2) } = min { 1, 3 − 2 } = 1
dist4(3) = min { dist3(3), dist3(4) + cost(4,3) } = min { 3, 5 − 2 } = 3
dist4(4) = 5
dist4(5) = min { dist3(5), dist3(2) + cost(2,5), dist3(3) + cost(3,5) } = min { 2, 1 − 1, 3 + 1 } = 0
dist4(6) = min { dist3(6), dist3(4) + cost(4,6) } = min { 4, 5 − 1 } = 4
dist4(7) = min { dist3(7), dist3(5) + cost(5,7), dist3(6) + cost(6,7) } = min { 7, 2 + 3, 4 + 3 } = 5
k = 5: for every vertex (except the source), consider each incoming edge.

dist5(1) = 0
dist5(2) = min { dist4(2), dist4(3) + cost(3,2) } = min { 1, 3 − 2 } = 1
dist5(3) = min { dist4(3), dist4(4) + cost(4,3) } = min { 3, 5 − 2 } = 3
dist5(4) = 5
dist5(5) = min { dist4(5), dist4(2) + cost(2,5), dist4(3) + cost(3,5) } = min { 0, 1 − 1, 3 + 1 } = 0
dist5(6) = min { dist4(6), dist4(4) + cost(4,6) } = min { 4, 5 − 1 } = 4
dist5(7) = min { dist4(7), dist4(5) + cost(5,7), dist4(6) + cost(6,7) } = min { 5, 0 + 3, 4 + 3 } = 3
k = 6: for every vertex (except the source), consider each incoming edge.

dist6(1) = 0
dist6(2) = min { dist5(2), dist5(3) + cost(3,2) } = min { 1, 3 − 2 } = 1
dist6(3) = min { dist5(3), dist5(4) + cost(4,3) } = min { 3, 5 − 2 } = 3
dist6(4) = 5
dist6(5) = min { dist5(5), dist5(2) + cost(2,5), dist5(3) + cost(3,5) } = min { 0, 1 − 1, 3 + 1 } = 0
dist6(6) = min { dist5(6), dist5(4) + cost(4,6) } = min { 4, 5 − 1 } = 4
dist6(7) = min { dist5(7), dist5(5) + cost(5,7), dist5(6) + cost(6,7) } = min { 3, 0 + 3, 4 + 3 } = 3

No distance changed in this pass, so the distances have converged.
Algorithm
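A Python sketch of the algorithm (an edge-list representation is assumed; the final extra pass for negative-cycle detection is a standard addition not discussed above):

```python
INF = float("inf")

def bellman_ford(n, edges, src):
    """Single-source shortest paths allowing negative edge weights.
    n: number of vertices (0 .. n-1); edges: list of (u, v, w) triples.
    Returns the distance list, or None if a negative cycle is reachable."""
    dist = [INF] * n
    dist[src] = 0
    for _ in range(n - 1):                  # a shortest path has at most n-1 edges
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w       # relax edge (u, v)
    for u, v, w in edges:                   # one more pass detects negative cycles
        if dist[u] != INF and dist[u] + w < dist[v]:
            return None
    return dist

# The 7-vertex example above, with vertices renumbered 1..7 -> 0..6:
example = [(0, 1, 6), (0, 2, 5), (0, 3, 5), (2, 1, -2), (3, 2, -2),
           (1, 4, -1), (2, 4, 1), (3, 5, -1), (4, 6, 3), (5, 6, 3)]
print(bellman_ford(7, example, 0))          # [0, 1, 3, 5, 0, 4, 3]
```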
Analysis

• Bellman-Ford works in O(VE) time: it makes V − 1 passes, each of which relaxes all E edges.
Optimal Binary Search Tree

• A binary search tree is one of the most important data structures in computer science.
• One of its principal applications is to implement a dictionary, a set of elements with the operations of searching, insertion, and deletion.
• The binary search tree for which the average number of comparisons in a search is as small as possible is called an optimal binary search tree.
• If the probabilities of searching for the elements of a set are known, an optimal binary search tree can be constructed.
Optimal Binary Search Tree: Example

• Consider four keys A, B, C, and D to be searched for with probabilities 0.1, 0.2, 0.4, and 0.3, respectively.
• 14 different binary search trees are possible; two are shown here.
• The average number of comparisons in a successful search in the first of these trees is
  0.1 · 1 + 0.2 · 2 + 0.4 · 3 + 0.3 · 4 = 2.9,
  and in the second it is
  0.1 · 2 + 0.2 · 1 + 0.4 · 2 + 0.3 · 3 = 2.1.
• Neither of these two trees is, in fact, optimal.
Optimal Binary Search Tree

• For our tiny example, we could find the optimal tree by generating all 14 binary search trees with these keys.
• As a general algorithm, this exhaustive-search approach is unrealistic:
  – the total number of binary search trees with n keys is equal to the nth Catalan number, C(n) = (2n choose n) / (n + 1), which grows to infinity as fast as 4^n / n^1.5.
Optimal Binary Search Tree

• Problem definition: Given a sorted array a1, . . . , an of search keys and an array p1, . . . , pn of probabilities, where pi is the probability of a search for ai, construct a binary search tree of all the keys for which the average number of comparisons made in a successful search is the smallest.
Optimal Binary Search Tree

• So let a1, . . . , an be distinct keys ordered from the smallest to the largest and let p1, . . . , pn be the probabilities of searching for them.
• Let C(i, j) be the smallest average number of comparisons made in a successful search in a binary search tree Tij made up of keys ai, . . . , aj, where i, j are some integer indices, 1 ≤ i ≤ j ≤ n.
• We are interested just in C(1, n).
Optimal Binary Search Tree

• To derive a recurrence, we consider all possible ways to choose a root ak among the keys ai, . . . , aj. For each choice, the left and right subtrees must themselves be optimal trees for ai, . . . , ak−1 and ak+1, . . . , aj, which gives

  C(i, j) = min over i ≤ k ≤ j of { C(i, k − 1) + C(k + 1, j) } + (pi + . . . + pj), for 1 ≤ i ≤ j ≤ n,

  with C(i, i) = pi and the convention C(i, i − 1) = 0.
Optimal Binary Search Tree: Example

• For the keys A, B, C, D with probabilities 0.1, 0.2, 0.4, 0.3 (index 1 to 4), the cost table C(i, j) and the root table R(i, j) are filled one diagonal at a time.
• A sample entry of the cost table:
  C(1, 2) = min { k = 1: C(1, 0) + C(2, 2),  k = 2: C(1, 1) + C(3, 2) } + (0.1 + 0.2)
          = min { 0 + 0.2, 0.1 + 0 } + 0.3 = 0.4

How to construct the optimal binary search tree?

• Use the root table: R(1, n) contains the root of the whole tree; if R(i, j) = k, then R(i, k − 1) and R(k + 1, j) contain the roots of its left and right subtrees, and so on recursively.
Analysis

• The algorithm requires O(n²) time and O(n²) storage.
• Therefore, as n increases, it will run out of storage even before it runs out of time.
• The storage needed can be reduced by almost half by implementing the two-dimensional arrays as one-dimensional arrays.
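
A Python sketch of the table-filling (the straightforward version below tries every root and therefore runs in O(n³) time; the O(n²) bound quoted above relies on Knuth's observation that the candidate roots can be restricted to R(i, j − 1) .. R(i + 1, j)):

```python
def optimal_bst(p):
    """Cost table C and root table R for an optimal BST.
    p[1..n] are the search probabilities of the keys in sorted
    order; p[0] is an unused placeholder."""
    n = len(p) - 1
    C = [[0.0] * (n + 2) for _ in range(n + 2)]   # C(i, i-1) = 0 by default
    R = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        C[i][i] = p[i]                            # one-key tree: C(i, i) = p_i
        R[i][i] = i
    for d in range(1, n):                         # d = j - i, one diagonal at a time
        for i in range(1, n - d + 1):
            j = i + d
            # try every key a_k as the root of the tree for a_i .. a_j
            best, root = min((C[i][k - 1] + C[k + 1][j], k)
                             for k in range(i, j + 1))
            C[i][j] = best + sum(p[i:j + 1])
            R[i][j] = root
    return C, R

C, R = optimal_bst([0, 0.1, 0.2, 0.4, 0.3])       # the four-key example above
print(C[1][4], R[1][4])                           # about 1.7, root = 3 (key C)
```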
Exercise

• Find the optimal binary search tree for the keys A, B, C, D, E with search probabilities 0.1, 0.1, 0.2, 0.2, 0.4, respectively.
