
Suppose you want to minimize the simple fitness function of
two variables *x*_{1} and *x*_{2},

$$\underset{x}{\mathrm{min}}f(x)=100{\left({x}_{1}^{2}-{x}_{2}\right)}^{2}+{(1-{x}_{1})}^{2}$$

subject to the following nonlinear inequality constraints and bounds:

$$\begin{array}{ll}{x}_{1}\cdot {x}_{2}+{x}_{1}-{x}_{2}+1.5\le 0\hfill & \text{(nonlinear constraint)}\hfill \\ 10-{x}_{1}\cdot {x}_{2}\le 0\hfill & \text{(nonlinear constraint)}\hfill \\ 0\le {x}_{1}\le 1\hfill & \text{(bound)}\hfill \\ 0\le {x}_{2}\le 13\hfill & \text{(bound)}\hfill \end{array}$$

Begin by creating the fitness and constraint functions. First, create a file named `simple_fitness.m` as follows:

```matlab
function y = simple_fitness(x)
y = 100*(x(1)^2 - x(2))^2 + (1 - x(1))^2;
```

(The file `simple_fitness.m` ships with Global Optimization Toolbox software.) The genetic algorithm function, `ga`, assumes the fitness function takes one input `x`, where `x` has as many elements as the number of variables in the problem. The fitness function computes the value of the function and returns that scalar value in its one return argument, `y`.

Then create a file, `simple_constraint.m`, containing the constraints:

```matlab
function [c, ceq] = simple_constraint(x)
c = [1.5 + x(1)*x(2) + x(1) - x(2); ...
     -x(1)*x(2) + 10];
ceq = [];
```

The `ga` function assumes the constraint function takes one input `x`, where `x` has as many elements as the number of variables in the problem. The constraint function computes the values of all the inequality and equality constraints and returns two vectors, `c` and `ceq`, respectively.

To minimize the fitness function, pass a function handle to the fitness function as the first argument to the `ga` function, and specify the number of variables as the second argument. Provide the lower and upper bounds as `LB` and `UB`, respectively. In addition, pass a function handle to the nonlinear constraint function.

```matlab
ObjectiveFunction = @simple_fitness;
nvars = 2;    % Number of variables
LB = [0 0];   % Lower bound
UB = [1 13];  % Upper bound
ConstraintFunction = @simple_constraint;
rng(1,'twister') % for reproducibility
[x,fval] = ga(ObjectiveFunction,nvars,...
    [],[],[],[],LB,UB,ConstraintFunction)
```

```
Optimization terminated: average change in the fitness value less than
options.FunctionTolerance and constraint violation is less than
options.ConstraintTolerance.

x =
    0.8123   12.3137

fval =
   1.3581e+04
```

For problems without integer constraints, the genetic algorithm solver handles linear constraints and bounds differently from nonlinear constraints. All the linear constraints and bounds are satisfied throughout the optimization. However, `ga` may not satisfy all the nonlinear constraints at every generation. If `ga` converges to a solution, the nonlinear constraints are satisfied at that solution.
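For example, linear inequality constraints enter through the `A` and `b` arguments of `ga`, and the solver keeps every individual feasible with respect to them. A minimal sketch, where the constraint *x*_{1} + 2*x*_{2} ≤ 20 is an invented illustration, not part of the problem above:

```matlab
% Hypothetical linear inequality constraint x1 + 2*x2 <= 20,
% expressed as A*x <= b. ga keeps each generation feasible with
% respect to A, b, LB, and UB throughout the run.
A = [1 2];
b = 20;
[x,fval] = ga(ObjectiveFunction,nvars,A,b,[],[],...
    LB,UB,ConstraintFunction);
```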

If there are integer constraints, `ga` does not enforce the feasibility of linear constraints, and instead adds any linear constraint violations to the penalty function. See Integer ga Algorithm.
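As a sketch of the calling syntax, the `intcon` argument follows the nonlinear constraint function and lists the indices of the integer variables. Restricting *x*_{2} to integer values here is a hypothetical variant of the problem, shown only to illustrate the argument order:

```matlab
% Hypothetical variant: require x(2) to be integer-valued.
% With integer constraints present, ga penalizes rather than
% enforces any linear constraint violations.
intcon = 2;
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,intcon);
```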

`ga` uses the mutation and crossover functions to produce new individuals at every generation. `ga` satisfies linear and bound constraints by using mutation and crossover functions that only generate feasible points. For example, in the previous call to `ga`, the mutation function `mutationgaussian` does not necessarily obey the bound constraints. So when there are bound or linear constraints, the default `ga` mutation function is `mutationadaptfeasible`. If you provide a custom mutation function, this custom function must only generate points that are feasible with respect to the linear and bound constraints. All the included crossover functions generate points that satisfy the linear constraints and bounds except the `crossoverheuristic` function.
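If you prefer to select the mutation function explicitly rather than rely on the default, a minimal sketch:

```matlab
% Explicitly select the feasible-point mutation function
% (already the default when bound or linear constraints exist).
opts = optimoptions('ga','MutationFcn',@mutationadaptfeasible);
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,opts);
```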

To see the progress of the optimization, use the `optimoptions` function to create options that select two plot functions. The first plot function is `gaplotbestf`, which plots the best and mean score of the population at every generation. The second plot function is `gaplotmaxconstr`, which plots the maximum nonlinear constraint violation at every generation. You can also visualize the progress of the algorithm by displaying information in the Command Window using the `'Display'` option.

```matlab
options = optimoptions('ga','PlotFcn',{@gaplotbestf,@gaplotmaxconstr},'Display','iter');
```

Rerun the `ga` solver.

```matlab
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,options)
```

```
                              Best         Max        Stall
Generation  Func-count        f(x)      Constraint  Generations
    1          2670         13603.6         0            0
    2          5282         13578.2     5.718e-06        0
    3          7994         14033.9         0            0
    4         11794         13573.7     0.0009577        0
Optimization terminated: average change in the fitness value less than
options.FunctionTolerance and constraint violation is less than
options.ConstraintTolerance.

x =
    0.8122   12.3104

fval =
   1.3574e+04
```

You can provide a start point for the minimization to the `ga` function by specifying the `InitialPopulationMatrix` option. The `ga` function uses the first individual from `InitialPopulationMatrix` as a start point for a constrained minimization.

```matlab
X0 = [0.5 0.5]; % Start point (row vector)
options = optimoptions('ga',options,'InitialPopulationMatrix',X0);
```

Now, rerun the `ga` solver.

```matlab
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,options)
```

```
                              Best         Max        Stall
Generation  Func-count        f(x)      Constraint  Generations
    1          2670         13578.1     0.0005269        0
    2          5282         13578.2     1.815e-05        0
    3          8494         14031.3         0            0
    4         14356         14054.9         0            0
    5         18706         13573.5     0.0009986        0
Optimization terminated: average change in the fitness value less than
options.FunctionTolerance and constraint violation is less than
options.ConstraintTolerance.

x =
    0.8122   12.3103

fval =
   1.3573e+04
```
