## Global Optimization Toolbox

This example shows how to minimize an objective function subject to nonlinear inequality constraints and bounds using the Genetic Algorithm.

We want to minimize a simple fitness function of two variables x1 and x2:

min_x f(x) = 100*(x1^2 - x2)^2 + (1 - x1)^2

such that the following two nonlinear constraints and bounds are satisfied

- x1*x2 + x1 - x2 + 1.5 <= 0 (nonlinear constraint)
- 10 - x1*x2 <= 0 (nonlinear constraint)
- 0 <= x1 <= 1 (bound)
- 0 <= x2 <= 13 (bound)

The above fitness function is known as 'cam' as described in L.C.W. Dixon and G.P. Szego (eds.), Towards Global Optimisation 2, North-Holland, Amsterdam, 1978.
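Note that the unconstrained minimum of the cam function is x = (1, 1), where f = 0, but that point violates both nonlinear constraints, so the constrained optimum must lie elsewhere. A quick check of the arithmetic (written in Python purely for illustration; the example itself uses MATLAB):

```python
# Evaluate the cam fitness and both nonlinear constraints at the
# unconstrained minimum x = (1, 1). A point is feasible only when
# both constraint values are <= 0.
x1, x2 = 1.0, 1.0
f  = 100 * (x1**2 - x2)**2 + (1 - x1)**2   # cam fitness: 0 at (1, 1)
c1 = x1 * x2 + x1 - x2 + 1.5               # first nonlinear constraint
c2 = 10 - x1 * x2                          # second nonlinear constraint
# c1 = 2.5 > 0 and c2 = 9 > 0, so (1, 1) is infeasible.
```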

We create a MATLAB file named simple_fitness.m with the following code in it:

```matlab
function y = simple_fitness(x)
y = 100 * (x(1)^2 - x(2))^2 + (1 - x(1))^2;
```

The genetic algorithm function `ga` assumes the fitness function will take one input x, where x has as many elements as the number of variables in the problem. The fitness function computes the value of the function and returns that scalar value in its one return argument y.

We create a MATLAB file named simple_constraint.m with the following code in it:

```matlab
function [c, ceq] = simple_constraint(x)
c = [1.5 + x(1)*x(2) + x(1) - x(2);
     -x(1)*x(2) + 10];
ceq = [];
```

The `ga` function assumes the constraint function will take one input x, where x has as many elements as the number of variables in the problem. The constraint function computes the values of all the inequality and equality constraints and returns two vectors, c and ceq, respectively.
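The sign convention is that a point is feasible only when every element of c is nonpositive. As an illustration, here is a line-for-line Python port of simple_constraint (hypothetical helper for this sketch only; the file `ga` actually calls is the MATLAB one above), evaluated at the point [0.5 0.5]:

```python
def simple_constraint(x):
    """Python port of simple_constraint.m, for illustration only."""
    c = [1.5 + x[0] * x[1] + x[0] - x[1],  # first inequality, feasible if <= 0
         -x[0] * x[1] + 10]                # second inequality, feasible if <= 0
    ceq = []                               # no equality constraints
    return c, ceq

c, ceq = simple_constraint([0.5, 0.5])
# c = [1.75, 9.75]: both positive, so [0.5 0.5] is infeasible.
```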

To minimize our fitness function using the `ga` function, we need to pass in a function handle to the fitness function as well as specify the number of variables as the second argument. Lower and upper bounds are provided as LB and UB, respectively. In addition, we also need to pass in a function handle to the nonlinear constraint function.

```matlab
ObjectiveFunction = @simple_fitness;
nvars = 2;      % Number of variables
LB = [0 0];     % Lower bound
UB = [1 13];    % Upper bound
ConstraintFunction = @simple_constraint;
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],LB,UB, ...
    ConstraintFunction)
```

```
Optimization terminated: average change in the fitness value less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3104

fval =
   1.3574e+04
```

Note that for our constrained minimization problem, the `ga` function changed the mutation function to @mutationadaptfeasible. The default mutation function, @mutationgaussian, is only appropriate for unconstrained minimization problems.

### `ga` Operators for Constrained Minimization

The `ga` solver handles linear constraints and bounds differently from nonlinear constraints. All the linear constraints and bounds are satisfied throughout the optimization. However, `ga` may not satisfy all the nonlinear constraints at every generation. If `ga` converges to a solution, the nonlinear constraints will be satisfied at that solution.
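This property can be checked by hand at the solution reported above. Evaluating both nonlinear constraints at the printed point x = (0.8122, 12.3104) (the arithmetic is done in Python purely for illustration) gives residuals on the order of 1e-3, attributable to the four-digit rounding of the printed solution:

```python
# Feasibility check at the solution printed by ga, x = (0.8122, 12.3104).
# Feasible means c1 <= 0 and c2 <= 0 (up to the constraint tolerance).
x1, x2 = 0.8122, 12.3104

c1 = x1 * x2 + x1 - x2 + 1.5   # ~3e-4: tiny residual from 4-digit rounding
c2 = 10 - x1 * x2              # ~1.5e-3: likewise within rounding error

# The bounds 0 <= x1 <= 1 and 0 <= x2 <= 13 hold exactly.
```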

`ga` uses the mutation and crossover functions to produce new individuals at every generation. The way `ga` satisfies the linear and bound constraints is to use mutation and crossover functions that only generate feasible points. For example, in the previous call to `ga`, the default mutation function @mutationgaussian would not satisfy the linear constraints, so @mutationadaptfeasible was used instead. If you provide a custom mutation function, this custom function must only generate points that are feasible with respect to the linear and bound constraints. All the crossover functions in the toolbox generate points that satisfy the linear constraints and bounds.

We specify @mutationadaptfeasible as the mutation function for our minimization problem by using the `gaoptimset` function.

```matlab
options = gaoptimset('MutationFcn',@mutationadaptfeasible);

% Next we run the GA solver.
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],LB,UB, ...
    ConstraintFunction,options)
```

```
Optimization terminated: average change in the fitness value less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3103

fval =
   1.3573e+04
```

Next we use `gaoptimset` to create an options structure that selects two plot functions. The first plot function is @gaplotbestf, which plots the best and mean score of the population at every generation. The second plot function is @gaplotmaxconstr, which plots the maximum constraint violation of the nonlinear constraints at every generation. We can also visualize the progress of the algorithm by displaying information to the command window using the 'Display' option.

```matlab
options = gaoptimset(options,'PlotFcns',{@gaplotbestf,@gaplotmaxconstr}, ...
    'Display','iter');

% Next we run the GA solver.
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],LB,UB, ...
    ConstraintFunction,options)
```

```
                              Best        max        Stall
Generation   f-count         f(x)     constraint  Generations
     1          2674       13578.5        0            0
     2          5286       13578.2    1.485e-05        0
     3          7898       13883.3        0            0
     4         14148       13573.6    0.000999         0
Optimization terminated: average change in the fitness value less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8123   12.3103

fval =
   1.3574e+04
```

A start point for the minimization can be provided to the `ga` function by specifying the InitialPopulation option. The `ga` function will use the first individual from InitialPopulation as a start point for a constrained minimization. Refer to the documentation for a description of specifying an initial population to `ga`.

```matlab
X0 = [0.5 0.5];    % Start point (row vector)
options = gaoptimset(options,'InitialPopulation',X0);

% Next we run the GA solver.
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],LB,UB, ...
    ConstraintFunction,options)
```

```
                              Best        max        Stall
Generation   f-count         f(x)     constraint  Generations
     1          2670       13578.1    0.0005448        0
     2          5282       13578.2    8.021e-06        0
     3          8394       14034.4        0            0
     4         16256       14052.7        0            0
     5         18856       13573.5    0.0009913        0
Optimization terminated: average change in the fitness value less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3103

fval =
   1.3573e+04
```
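As a final sanity check, plugging the printed solution back into the fitness expression reproduces the reported fval to the displayed precision. The arithmetic is done in Python purely for illustration, using the four-digit values from the printout above, so agreement is only approximate:

```python
# Recompute the cam fitness at the printed solution x = (0.8122, 12.3103).
x1, x2 = 0.8122, 12.3103
f = 100 * (x1**2 - x2)**2 + (1 - x1)**2
# f is about 13573.8, matching the reported fval = 1.3573e+04
# to the precision of the four-digit printout.
```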