## Augmented Lagrangian Genetic Algorithm (ALGA)

The genetic algorithm uses the Augmented Lagrangian Genetic Algorithm (ALGA) to solve nonlinear constraint problems without integer constraints. The optimization problem solved by the ALGA algorithm is

$$\min_{x} f(x)$$

such that

$$\begin{aligned}
c_i(x) &\le 0, \quad i = 1, \ldots, m \\
ceq_i(x) &= 0, \quad i = m+1, \ldots, mt \\
A\,x &\le b \\
Aeq\,x &= beq \\
lb \le{} & x \le ub,
\end{aligned}$$

where *c*(*x*) represents the nonlinear inequality constraints, *ceq*(*x*) represents the equality constraints, *m* is the number of nonlinear inequality constraints, and *mt* is the total number of nonlinear constraints.
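To make the formulation concrete, here is a minimal Python sketch of such a problem. The objective and constraint functions below are hypothetical examples chosen for illustration; they are not taken from the original text, and the linear constraints and bounds are omitted for brevity.

```python
def f(x):
    # Example fitness function (hypothetical): f(x) = x1^2 + x2^2.
    return x[0] ** 2 + x[1] ** 2

def c(x):
    # Nonlinear inequality constraints, c_i(x) <= 0 (here m = 1).
    return [x[0] + x[1] - 1.0]

def ceq(x):
    # Nonlinear equality constraints, ceq_i(x) = 0 (here mt - m = 1).
    return [x[0] - x[1]]

def is_feasible(x, tol=1e-6):
    # A point satisfies the nonlinear constraints when every inequality
    # holds and every equality is met to within the tolerance.
    return all(ci <= tol for ci in c(x)) and all(abs(e) <= tol for e in ceq(x))

print(is_feasible([0.3, 0.3]))  # True: 0.6 - 1 <= 0 and 0.3 - 0.3 = 0
print(is_feasible([1.0, 0.5]))  # False: violates both constraints
```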

The Augmented Lagrangian Genetic Algorithm (ALGA) attempts to solve a nonlinear optimization problem with nonlinear constraints, linear constraints, and bounds. In this approach, bounds and linear constraints are handled separately from nonlinear constraints. A subproblem is formulated by combining the fitness function and nonlinear constraint function using the Lagrangian and the penalty parameters. A sequence of such optimization problems is approximately minimized using the genetic algorithm such that the linear constraints and bounds are satisfied.

A subproblem formulation is defined as

$$\Theta(x, \lambda, s, \rho) = f(x) - \sum_{i=1}^{m} \lambda_i s_i \log\bigl(s_i - c_i(x)\bigr) + \sum_{i=m+1}^{mt} \lambda_i\, ceq_i(x) + \frac{\rho}{2} \sum_{i=m+1}^{mt} ceq_i(x)^2,$$

where

- The components *λ*_{i} of the vector *λ* are nonnegative and are known as Lagrange multiplier estimates.
- The elements *s*_{i} of the vector *s* are nonnegative shifts.
- *ρ* is the positive penalty parameter.
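The subproblem combines the fitness function with a shifted log-barrier term for the inequality constraints and Lagrangian-plus-quadratic-penalty terms for the equality constraints. The following Python sketch evaluates that combined value for a toy problem; the example functions and numeric values are hypothetical, chosen only to exercise the formula.

```python
import math

def theta(x, lam, s, rho, f, c, ceq):
    """Evaluate the augmented Lagrangian subproblem value Theta(x, lambda, s, rho).

    lam -- multiplier estimates (inequality multipliers first, then equality)
    s   -- nonnegative shifts, one per inequality constraint
    rho -- positive penalty parameter
    """
    cx, ceqx = c(x), ceq(x)
    m = len(cx)
    # Shifted log-barrier terms for the inequalities; defined only while
    # s_i - c_i(x) > 0, which the algorithm maintains.
    barrier = sum(lam[i] * s[i] * math.log(s[i] - cx[i]) for i in range(m))
    # Lagrangian and quadratic-penalty terms for the equalities.
    lagr = sum(lam[m + j] * ceqx[j] for j in range(len(ceqx)))
    penalty = 0.5 * rho * sum(e * e for e in ceqx)
    return f(x) - barrier + lagr + penalty

# Hypothetical example: one inequality and one equality constraint.
f = lambda x: x[0] ** 2 + x[1] ** 2
c = lambda x: [x[0] + x[1] - 1.0]   # c_1(x) <= 0
ceq = lambda x: [x[0] - x[1]]       # ceq_2(x) = 0

val = theta([0.0, 0.0], lam=[1.0, 1.0], s=[2.0], rho=10.0, f=f, c=c, ceq=ceq)
# At x = (0, 0): f = 0, barrier = 1 * 2 * log(2 - (-1)), lagr = penalty = 0,
# so val = -2 * log(3).
print(val)
```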

The algorithm begins by using an initial value for
the penalty parameter (`InitialPenalty`).

The genetic algorithm minimizes a sequence of subproblems, each
of which is an approximation of the original problem. Each subproblem
has a fixed value of *λ*, *s*,
and *ρ*. When the subproblem is minimized to
a required accuracy and satisfies feasibility conditions, the Lagrangian
estimates are updated. Otherwise, the penalty parameter is increased
by a penalty factor (`PenaltyFactor`). This results
in a new subproblem formulation and minimization problem. These steps
are repeated until the stopping criteria are met.
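The outer loop described above can be sketched as control flow: solve a subproblem at fixed *λ*, *s*, and *ρ*; update the Lagrangian estimates on success; otherwise multiply the penalty by the penalty factor. In this minimal Python sketch, the genetic-algorithm subproblem solve is replaced by a deterministic stub, and all names and values are illustrative rather than part of the toolbox API.

```python
def minimize_subproblem_stub(rho):
    # Placeholder for the genetic-algorithm subproblem solve: pretend the
    # required accuracy and feasibility are reached only once the penalty
    # is large enough. A real solver would minimize Theta(x, lambda, s, rho).
    return rho >= 1000.0

def outer_loop(initial_penalty=10.0, penalty_factor=100.0, max_iters=5):
    # Sketch of the outer ALGA loop; initial_penalty and penalty_factor
    # stand in for the InitialPenalty and PenaltyFactor options.
    rho = initial_penalty
    history = []
    for _ in range(max_iters):
        feasible = minimize_subproblem_stub(rho)  # fixed lambda, s, rho
        if feasible:
            # The Lagrangian estimates (lambda, s) would be updated here.
            history.append(("update multipliers", rho))
        else:
            history.append(("increase penalty", rho))
            rho *= penalty_factor
    return history

print(outer_loop())
```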

Each subproblem solution represents one generation. The number of function evaluations per generation is therefore much higher when using nonlinear constraints than otherwise.

For a complete description of the algorithm, see the following references:

[1] Conn, A. R., N. I. M. Gould, and Ph. L.
Toint. "A Globally Convergent Augmented Lagrangian Algorithm
for Optimization with General Constraints and Simple Bounds," *SIAM
Journal on Numerical Analysis*, Volume 28, Number 2, pages
545–572, 1991.

[2] Conn, A. R., N. I. M. Gould, and Ph. L.
Toint. "A Globally Convergent Augmented Lagrangian Barrier
Algorithm for Optimization with General Inequality Constraints and
Simple Bounds," *Mathematics of Computation*,
Volume 66, Number 217, pages 261–288, 1997.
