Approximate Algorithms


An approximation algorithm is a way of dealing with NP-completeness for optimization problems. This technique does not guarantee the best solution. The goal of an approximation algorithm is to come as close as possible to the optimum value in a reasonable amount of time, which is at most polynomial time.

Suppose we have an optimization problem instance i that has a large number of feasible solutions. Let c(i) be the cost of the solution produced by the approximation algorithm and c*(i) be the cost of an optimal solution. For a minimization problem, we are interested in finding a solution of a given instance i in the set of feasible solutions such that c(i)/c*(i) is as small as possible. On the other hand, for a maximization problem, we are interested in finding a solution in the feasible solution set such that c*(i)/c(i) is as small as possible.
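For example (a hypothetical instance), suppose a minimization instance has an optimal solution of cost c*(i) = 10 and the approximation algorithm returns a solution of cost c(i) = 12. Then c(i)/c*(i) = 1.2, i.e. the approximate solution costs 20% more than the optimum.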

We say that an approximation algorithm for the given problem instance i has a ratio bound of p(n) if, for any input of size n, the cost c(i) of the solution produced by the approximation algorithm is within a factor of p(n) of the cost c*(i) of an optimal solution. That is,
            max(c(i)/c*(i), c*(i)/c(i)) ≤ p(n)

This definition applies for both minimization and maximization problems.

Note that p(n) is always greater than or equal to 1. If the solution produced by the approximation algorithm is the true optimal solution, then clearly we have p(n) = 1.

For a minimization problem, 0 <  c*(i) ≤ c(i), and the ratio c(i)/c*(i) gives the factor by which the cost of the approximate solution is larger than the cost of an optimal solution. Similarly, for a maximization problem, 0 <  c(i) ≤ c*(i), and the ratio c*(i)/c(i) gives the factor by which the cost of an optimal solution is larger than the cost of the approximate solution.
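As an illustration, the following is a minimal sketch (not taken from the text) of the classic 2-approximation for Vertex Cover based on picking both endpoints of uncovered edges, together with a brute-force computation of the optimum on a tiny instance so that the ratio c(i)/c*(i) can be checked directly. The graph, the function names, and the driver code are all illustrative assumptions.

    from itertools import combinations

    def approx_vertex_cover(edges):
        """Maximal-matching heuristic: take both endpoints of any uncovered
        edge. The resulting cover satisfies c(i) <= 2 * c*(i)."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.add(u)
                cover.add(v)
        return cover

    def optimal_vertex_cover(vertices, edges):
        """Brute-force optimum c*(i); only feasible for tiny instances."""
        for k in range(len(vertices) + 1):
            for subset in combinations(vertices, k):
                s = set(subset)
                if all(u in s or v in s for u, v in edges):
                    return s
        return set(vertices)

    vertices = [1, 2, 3, 4, 5]
    edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]
    c = len(approx_vertex_cover(edges))                   # c(i) = 4 here
    c_star = len(optimal_vertex_cover(vertices, edges))   # c*(i) = 2 here
    print(c, c_star, c / c_star)                          # ratio 2.0 <= p(n) = 2

For this instance the approximation algorithm happens to hit its worst-case ratio of 2, while on other inputs the ratio can be much closer to 1.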

Relative Error

We define the relative error of the approximation algorithm for any input as

            |c(i) - c*(i)| / c*(i)

We say that an approximation algorithm has a relative error bound of ε(n) if

            |c(i) - c*(i)| / c*(i) ≤ ε(n)
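
The following is a minimal sketch (the names are illustrative assumptions) of computing this relative error for the vertex cover instance sketched above, where c(i) = 4 and c*(i) = 2.

    def relative_error(c, c_star):
        """Relative error |c(i) - c*(i)| / c*(i) of an approximate solution."""
        return abs(c - c_star) / c_star

    # For the vertex cover instance above: |4 - 2| / 2 = 1.0
    print(relative_error(4, 2))  # prints 1.0

Note that for a minimization problem the relative error equals c(i)/c*(i) - 1, so a ratio bound of p(n) immediately gives a relative error bound of p(n) - 1.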