Knuth's optimization in Dynamic Programming

Knuth's optimization is used to reduce the run-time of a certain class of dynamic programming problems from $O(n^3)$ to $O(n^2)$.

Properties of functions

Some properties of two-variable functions required for Knuth's optimization:

1. Quadrangle inequalities

For a two-variable function $f(x, y)$ :

  • Convex quadrangle inequality : $\forall_{a \leq b \leq c \leq d}(f(a, c) + f(b, d) \leq f(a, d) + f(b, c))$

  • Concave quadrangle inequality : $\forall_{a \leq b \leq c \leq d}(f(a, c) + f(b, d) \geq f(a, d) + f(b, c))$
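
For example, the linear function $f(x, y) = y - x$ satisfies both inequalities (with equality), since for any $a \leq b \leq c \leq d$:
$$
f(a, c) + f(b, d) = (c - a) + (d - b) = (d - a) + (c - b) = f(a, d) + f(b, c)
$$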

2. Monotonicity

A two-variable function $f(x, y)$ is said to be convex-monotone if:
$$\forall_{a \leq b \leq c \leq d}(f(b, c) \leq f(a, d))$$
And it is said to be concave-monotone if:
$$\forall_{a \leq b \leq c \leq d}(f(b, c) \geq f(a, d))$$
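
When in doubt, these properties can be checked empirically on small inputs. Below is a minimal brute-force checker (a sketch; the name isKnuthCompatible and the range bound are illustrative):

#include <cstdio>
#include <functional>

// Brute-force check of the convex quadrangle inequality and
// convex-monotonicity for all a <= b <= c <= d in [0, n).
bool isKnuthCompatible(int n, const std::function<long long(int, int)>& cost) {
  for(int a = 0; a < n; ++a)
    for(int b = a; b < n; ++b)
      for(int c = b; c < n; ++c)
        for(int d = c; d < n; ++d) {
          if(cost(a, c) + cost(b, d) > cost(a, d) + cost(b, c)) return false; // quadrangle inequality
          if(cost(b, c) > cost(a, d)) return false; // monotonicity
        }
  return true;
}

int main() {
  // f(x, y) = y - x satisfies both properties (see above).
  printf("%d\n", isKnuthCompatible(20, [](int x, int y) { return (long long)(y - x); }));
}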

xD/yD dynamic programming

An xD/yD dynamic programming problem is one where there are $O(n^x)$ subproblems, each of which is computed from $O(n^y)$ other subproblems.

Example - Longest Increasing Subsequence or LIS:
We are given an array of $n$ numbers: $a[0..n-1]$. The task is to find the longest strictly increasing subsequence in $a$.
This is a standard problem, with the recurrence:
$$
lis(i) =
\begin{cases}
1 & i = 0 \newline
max(1, max_{\substack{j = 0..i-1\newline a[j] \lt a[i]}}(lis(j) + 1)) & i = 1..n-1
\end{cases}
$$
This is an example of 1D/1D dynamic programming because there are $O(n)$ subproblems ($i = 0..n-1$), each depending on $O(n)$ subproblems ($j = 0..i-1$).
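
For concreteness, a direct $O(n^2)$ translation of this recurrence into C++ (a sketch; faster $O(n \log n)$ approaches exist but do not exhibit the 1D/1D structure):

#include <algorithm>
#include <vector>

// O(n^2) LIS, mirroring the 1D/1D recurrence above.
// best[i] = length of the longest strictly increasing subsequence ending at i.
int lis(const std::vector<int>& a) {
  int n = a.size();
  if(n == 0) return 0;
  std::vector<int> best(n, 1);
  int ans = 1;
  for(int i = 1; i < n; ++i) {
    for(int j = 0; j < i; ++j)
      if(a[j] < a[i]) best[i] = std::max(best[i], best[j] + 1);
    ans = std::max(ans, best[i]);
  }
  return ans;
}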

Optimization criteria

A 2D/1D dynamic programming problem of the form:
$$
dp(i, j) = min_{i \leq k \leq j}(f(i, j, k))
$$
where,
$$
f(i, j, k) = dp(i, k) + dp(k, j) + cost(i, j)
$$
can be optimized using Knuth's optimization if the function $cost(x, y)$ satisfies the convex quadrangle inequality and is convex-monotone.

Note that solving the above recurrence directly takes $O(n^3)$ time: there are $O(n^2)$ states, each minimized over $O(n)$ values of $k$.

Note: The concave quadrangle inequality should be satisfied instead in the case of a maximization problem.

Optimization

If $cost(x, y)$ satisfies the stated properties, then $dp(x, y)$ also satisfies the quadrangle inequality. This results in another useful property (proof in [1]):
$$
h(i, j - 1) \leq h(i, j) \leq h(i + 1, j) \text{, } i \leq j
$$
where,
$$
h(i, j) = argmin_{i \leq k \leq j} (f(i, j, k))
$$
$h(x, y)$ is the position where $dp(x, y)$ attains its optimum. The above property says that the optimal split point for $dp(i, j)$ lies between the optimal split points for $dp(i, j - 1)$ and $dp(i + 1, j)$.
Therefore, while computing $dp(i, j)$, $k$ only needs to take values between $h(i, j - 1)$ and $h(i + 1, j)$. If $dp(i, j)$ is computed in increasing order of $j - i$, this yields an amortized complexity of $O(n^2)$, as in the generic sketch below.
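
A generic self-contained version (a sketch; the base cases and the precomputed cost table are illustrative assumptions, and intervals with $j - i < 2$ are taken to cost nothing):

#include <cstdint>
#include <vector>
using namespace std;

// Knuth-optimized 2D/1D DP over intervals [i, j] of 0..n.
// cost[i][j] must satisfy the convex quadrangle inequality and be convex-monotone.
int64_t knuthDp(int n, const vector<vector<int64_t>>& cost) {
  vector<vector<int64_t>> dp(n + 1, vector<int64_t>(n + 1, 0));
  vector<vector<int>> h(n + 1, vector<int>(n + 1, 0));
  for(int i = 0; i < n; ++i) h[i][i] = h[i][i + 1] = i; // spans 0 and 1
  h[n][n] = n;
  for(int s = 2; s <= n; ++s) {       // increasing span, so h[i][j - 1]
    for(int i = 0; i + s <= n; ++i) { // and h[i + 1][j] are already known
      int j = i + s;
      dp[i][j] = 1ll << 60;
      for(int k = h[i][j - 1]; k <= h[i + 1][j]; ++k) {
        int64_t cur = dp[i][k] + dp[k][j] + cost[i][j];
        if(cur < dp[i][j]) { dp[i][j] = cur; h[i][j] = k; }
      }
    }
  }
  return dp[0][n];
}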

Complexity

From the property of $h(i, j)$, it follows that $h$ is non-decreasing along each row and column. Consequently, when $dp(i, j)$ is computed for all cells with a fixed span $j - i$, only $h(i + 1, j) - h(i, j - 1) + 1$ minimization operations are performed per cell, and the loop bounds of adjacent cells overlap only at their endpoints. The total work for a fixed span therefore telescopes to $O(n)$; with $O(n)$ distinct spans, the overall time complexity is $O(n^2)$.
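
To make the telescoping explicit, fix a span $s = j - i$; summing the loop lengths over all cells with that span gives
$$
\sum_{i=0}^{n-s}\left(h(i+1, i+s) - h(i, i+s-1) + 1\right) = h(n-s+1, n) - h(0, s-1) + (n - s + 1) \leq 2n
$$
since the intermediate $h$ terms cancel.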

Example (Breaking String)

Problem:
Consider the problem of breaking a string into smaller substrings, where the cost of breaking a string is equal to its length. Given a string and the points (indices) at which it has to be broken, compute the minimum total cost of breaking the string.
Example:
If the string is "optimization" and the points of division are [1, 3, 10] (zero-indexed), then the minimum cost is 24, achieved when the order of breaking is [3, 1, 10]: the first break costs 12, the second 3, and the last 9.
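For contrast, breaking in the given order [1, 3, 10] costs $12 + 11 + 9 = 32$: the first break splits the whole string of length 12, the second splits the remaining right piece of length 11, and the third splits the piece of length 9 between positions 3 and 12. The order of breaks is exactly what the DP optimizes over.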
Given the string length $m$ and $n$ breaking points $points[0..n-1]$ in ascending order, find the minimum cost of breaking the string.

Solution:
State: $dp[l][r]$ stores the optimal result for the substring between the break points $l$ and $r$, where the two ends of the string are treated as sentinel break points (so $l, r = 0..n+1$).
Transition: $dp[l][r]$ can be computed by iterating over all break points $k$ lying between $l$ and $r$:
$$
dp[l][r] = min_{l \leq k \leq r}(dp[l][k] + dp[k][r]) + (points[r] - points[l])
$$
Unoptimized implementation:

#include <cstdint>
#include <vector>
using namespace std;

const int N = MAX_BREAK_POINTS; // upper bound on the number of break points, plus 2 sentinels

int64_t dp[N][N];

int64_t breakString(int m, vector<int>& points) {
  int n = points.size();
  // Sentinels: treat both ends of the string as break points.
  points.insert(points.begin(), 0);
  points.push_back(m);
  for(int s = 0; s <= n + 1; ++s) {       // s = r - l, the span
    for(int l = 0; l + s <= n + 1; ++l) {
      int r = l + s;
      if(s < 2) {
        // No break point strictly inside the interval: nothing to break.
        dp[l][r] = 0;
        continue;
      }
      dp[l][r] = 1ll << 60; // "infinity" sentinel
      for(int k = l; k <= r; ++k) {
        int64_t cur = dp[l][k] + dp[k][r] + (points[r] - points[l]);
        if(cur < dp[l][r]) {
          dp[l][r] = cur;
        }
      }
    }
  }
  return dp[0][n + 1];
}

The code has an s-loop that iterates over the span of the substring; this fills the dp table in increasing order of $r - l$.
The above implementation takes $O(n^3)$ time.
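
A quick driver (a hypothetical main, assuming the function above is in scope) reproduces the worked example:

#include <cstdio>

int main() {
  vector<int> points = {1, 3, 10}; // break points for "optimization"
  printf("%lld\n", (long long)breakString(12, points)); // prints 24
}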

Notice that the recurrence is a 2D/1D problem, with $cost(x, y) = points[y] - points[x]$. As $points[0..n-1]$ is in ascending order, $cost(x, y)$ satisfies the convex quadrangle inequality (with equality) and is convex-monotone, so the criteria for Knuth's optimization are met.
Optimized implementation:

#include <cstdint>
#include <vector>
using namespace std;

const int N = MAX_BREAK_POINTS; // upper bound on the number of break points, plus 2 sentinels

int64_t dp[N][N];
int h[N][N]; // h[l][r] = position of the optimal split for dp[l][r]

int64_t breakString(int m, vector<int>& points) {
  int n = points.size();
  points.insert(points.begin(), 0);
  points.push_back(m);
  for(int s = 0; s <= n + 1; ++s) {
    for(int l = 0; l + s <= n + 1; ++l) {
      int r = l + s;
      if(s < 2) {
        dp[l][r] = 0;
        h[l][r] = l;
        continue;
      }
      dp[l][r] = 1ll << 60;
      // Knuth's optimization: k is restricted to [h[l][r - 1], h[l + 1][r]].
      for(int k = h[l][r - 1]; k <= h[l + 1][r]; ++k) {
        int64_t cur = dp[l][k] + dp[k][r] + (points[r] - points[l]);
        if(dp[l][r] > cur) {
          dp[l][r] = cur;
          h[l][r] = k;
        }
      }
    }
  }
  return dp[0][n + 1];
}

The above code uses an auxiliary table h to store the position at which the minimum value occurs (required for Knuth's optimization). The only change compared to the unoptimized approach is in the innermost loop, where the optimization is applied.
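
As a sanity check (a sketch, assuming the two implementations above are compiled together under the hypothetical names breakStringSlow and breakStringFast), the optimized version can be compared against the brute force on random inputs:

#include <algorithm>
#include <cassert>
#include <cstdio>
#include <cstdlib>
#include <set>
#include <vector>

int main() {
  for(int iter = 0; iter < 1000; ++iter) {
    int m = 2 + rand() % 50;
    set<int> s;
    int cnt = rand() % min(10, m - 1);
    while((int)s.size() < cnt) s.insert(1 + rand() % (m - 1)); // distinct points in (0, m)
    vector<int> a(s.begin(), s.end()), b = a; // each call mutates its argument
    assert(breakStringSlow(m, a) == breakStringFast(m, b));
  }
  puts("all tests passed");
}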

Question

If the overall complexity of breakString is $O(n^2)$, then the innermost loop in the function has an amortized complexity of

O(n)
O(1)
O(n^2)
O(logn)
As the outer two loops together have a complexity of $O(n^2)$ (the same as the overall complexity), the innermost loop must take amortized $O(1)$ time.

Question

What is the space complexity of breakString?

O(1)
O(n^3)
O(n^2)
O(nlogn)
The algorithm uses two auxiliary tables, $dp$ and $h$, each of space complexity $O(n^2)$.

With this article at OpenGenus, you must have the complete idea of Knuth's optimization in Dynamic Programming. Enjoy.
