
Nonlinear Constraint Solver Algorithms for Genetic Algorithm

Augmented Lagrangian Genetic Algorithm

By default, the genetic algorithm uses the Augmented Lagrangian Genetic Algorithm (ALGA) to solve nonlinear constraint problems without integer constraints. The optimization problem solved by the ALGA algorithm is

\min_x f(x)

such that

\begin{aligned}
c_i(x) &\le 0, \quad i = 1, \ldots, m \\
ceq_i(x) &= 0, \quad i = m+1, \ldots, mt \\
A \cdot x &\le b \\
Aeq \cdot x &= beq \\
lb \le x &\le ub,
\end{aligned}

where c(x) represents the nonlinear inequality constraints, ceq(x) represents the equality constraints, m is the number of nonlinear inequality constraints, and mt is the total number of nonlinear constraints.
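For example, with ga you supply the nonlinear constraints as a single function that returns the vectors c(x) and ceq(x). The constraints in the following sketch are made up purely for illustration; any functions of this form fit the problem statement above.

% Hypothetical nonlinear constraint function in the form ga expects.
% ga enforces c(x) <= 0 and ceq(x) = 0 componentwise.
function [c, ceq] = myNonlcon(x)
    c   = [x(1)^2 + x(2)^2 - 4;   % c_1(x) <= 0: stay inside a circle of radius 2
           1 - x(1)*x(2)];        % c_2(x) <= 0: product of the variables at least 1
    ceq = x(1) + x(2)^3 - 2;      % ceq_1(x) = 0: one nonlinear equality
end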

The Augmented Lagrangian Genetic Algorithm (ALGA) attempts to solve a nonlinear optimization problem with nonlinear constraints, linear constraints, and bounds. In this approach, bounds and linear constraints are handled separately from nonlinear constraints. A subproblem is formulated by combining the fitness function and nonlinear constraint function using the Lagrangian and the penalty parameters. A sequence of such optimization problems is approximately minimized using the genetic algorithm such that the linear constraints and bounds are satisfied.

A subproblem formulation is defined as

\Theta(x, \lambda, s, \rho) = f(x) - \sum_{i=1}^{m} \lambda_i s_i \log\bigl(s_i - c_i(x)\bigr) + \sum_{i=m+1}^{mt} \lambda_i \, ceq_i(x) + \frac{\rho}{2} \sum_{i=m+1}^{mt} ceq_i(x)^2,

where

  • The components λ_i of the vector λ are nonnegative and are known as Lagrange multiplier estimates.

  • The elements s_i of the vector s are nonnegative shifts.

  • ρ is the positive penalty parameter.
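To make the subproblem concrete, the following sketch evaluates Θ for a given point, multiplier estimates, shifts, and penalty parameter. The function name evalTheta and its argument list are illustrative only and are not part of the toolbox.

% Illustrative evaluation of the subproblem objective Theta(x, lambda, s, rho).
% fun     - fitness function handle f(x)
% nonlcon - constraint function handle returning [c, ceq]
% lambda  - nonnegative Lagrange multiplier estimates, one per nonlinear constraint,
%           ordered with the m inequality multipliers first
% s       - nonnegative shifts for the inequality terms (length m)
% rho     - positive penalty parameter
function theta = evalTheta(fun, nonlcon, x, lambda, s, rho)
    [c, ceq] = nonlcon(x);
    m = numel(c);
    lamIneq = lambda(1:m);          % multipliers for the inequality constraints
    lamEq   = lambda(m+1:end);      % multipliers for the equality constraints
    % The log terms require s(i) > c_i(x); the shifts keep the barrier defined.
    theta = fun(x) ...
        - sum(lamIneq .* s .* log(s - c)) ...  % shifted log-barrier terms
        + sum(lamEq .* ceq) ...                % Lagrangian terms for the equalities
        + (rho/2) * sum(ceq.^2);               % quadratic penalty on the equalities
end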

The algorithm begins by using an initial value for the penalty parameter (InitialPenalty).

The genetic algorithm minimizes a sequence of subproblems, each of which is an approximation of the original problem. Each subproblem has a fixed value of λ, s, and ρ. When the subproblem is minimized to a required accuracy and satisfies feasibility conditions, the Lagrangian estimates are updated. Otherwise, the penalty parameter is increased by a penalty factor (PenaltyFactor). This results in a new subproblem formulation and minimization problem. These steps are repeated until the stopping criteria are met.

Each subproblem solution represents one generation. The number of function evaluations per generation is therefore much higher when using nonlinear constraints than otherwise.

Choose the Augmented Lagrangian algorithm by setting the NonlinearConstraintAlgorithm option to 'auglag' using optimoptions.
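For example, the following call selects ALGA and also sets the initial penalty and penalty factor described above. The fitness function, constraint function, bounds, and the specific option values are placeholders, not recommendations.

% Select the Augmented Lagrangian algorithm and tune its penalty schedule.
options = optimoptions('ga', ...
    'NonlinearConstraintAlgorithm', 'auglag', ...
    'InitialPenalty', 10, ...       % starting value of the penalty parameter rho
    'PenaltyFactor', 100);          % factor by which rho increases when needed

% fitnessFcn and myNonlcon stand in for your own fitness and constraint functions.
nvars = 2;
lb = [0 0];
ub = [5 5];
x = ga(@fitnessFcn, nvars, [], [], [], [], lb, ub, @myNonlcon, options);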

For a complete description of the algorithm, see the following references:

References

[1] Conn, A. R., N. I. M. Gould, and Ph. L. Toint. “A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds,” SIAM Journal on Numerical Analysis, Volume 28, Number 2, pages 545–572, 1991.

[2] Conn, A. R., N. I. M. Gould, and Ph. L. Toint. “A Globally Convergent Augmented Lagrangian Barrier Algorithm for Optimization with General Inequality Constraints and Simple Bounds,” Mathematics of Computation, Volume 66, Number 217, pages 261–288, 1997.

Penalty Algorithm

The penalty algorithm is similar to the Integer ga Algorithm. In its evaluation of the fitness of an individual, ga computes a penalty value as follows:

  • If the individual is feasible, the penalty function is the fitness function.

  • If the individual is infeasible, the penalty function is the maximum fitness function among feasible members of the population, plus a sum of the constraint violations of the (infeasible) individual.

For details of the penalty function, see Deb [1].
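The rule above can be sketched as follows. This is a simplified illustration of the idea, not the toolbox's internal code; in particular, the variable maxFeasibleFitness stands for the largest fitness value among the feasible members of the current population, and the feasibility tolerance is arbitrary.

% Illustrative penalty value for one individual (minimization problem).
% fitness            - fitness function value of the individual
% c, ceq             - its nonlinear constraint values, [c, ceq] = nonlcon(x)
% maxFeasibleFitness - largest fitness among feasible members of the population
tol = 1e-6;                                  % illustrative feasibility tolerance
violation = sum(max(c, 0)) + sum(abs(ceq));  % total constraint violation
if violation <= tol
    penaltyValue = fitness;                  % feasible: use the fitness itself
else
    penaltyValue = maxFeasibleFitness + violation;  % infeasible: ranks below every feasible point
end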

Choose the penalty algorithm by setting the NonlinearConstraintAlgorithm option to 'penalty' using optimoptions. When you make this choice, ga solves the constrained optimization problem as follows; a minimal example appears after these steps.

  1. ga defaults to the @gacreationnonlinearfeasible creation function. This function attempts to create a feasible population with respect to all constraints. ga creates enough individuals to match the PopulationSize option. For details, see the gacreationnonlinearfeasible reference page.

  2. ga overrides your choice of selection function, and uses @selectiontournament with two individuals per tournament.

  3. ga proceeds according to How the Genetic Algorithm Works, using the penalty function as the fitness measure.
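A minimal call that exercises these steps might look like the following sketch; the fitness function, constraint function, and bounds are placeholders.

% Select the penalty algorithm for nonlinear constraints.
options = optimoptions('ga', 'NonlinearConstraintAlgorithm', 'penalty');

% With this option, ga defaults to the gacreationnonlinearfeasible creation
% function and uses tournament selection with two individuals per tournament.
% fitnessFcn and myNonlcon stand in for your own functions.
nvars = 2;
lb = [0 0];
ub = [5 5];
x = ga(@fitnessFcn, nvars, [], [], [], [], lb, ub, @myNonlcon, options);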

References

[1] Deb, Kalyanmoy. “An Efficient Constraint Handling Method for Genetic Algorithms,” Computer Methods in Applied Mechanics and Engineering, Volume 186, Numbers 2–4, pages 311–338, 2000.
