Genetic Algorithm Options

This example shows how to create and manage options for the genetic algorithm function ga using optimoptions in the Global Optimization Toolbox.

Setting Up a Problem for ga

ga searches for a minimum of a function using the genetic algorithm. For this example, we will use ga to minimize the fitness function shufcn, a real-valued function of two variables.

We can use the function plotobjective in the toolbox to plot the function shufcn over the range [-2 2; -2 2].

plotobjective(@shufcn,[-2 2; -2 2]);

To use the ga solver, we need to provide at least two input arguments: a fitness function and the number of variables in the problem. The first two output arguments returned by ga are x, the best point found, and Fval, the function value at the best point. A third output argument, exitFlag, tells you the reason why ga stopped. ga can also return a fourth argument, Output, which contains information about the performance of the solver.

FitnessFunction = @shufcn;
numberOfVariables = 2;

Run the ga solver.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);

fprintf('The number of generations was : %d\n', Output.generations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The number of generations was : 124
The number of function evaluations was : 6250
The best function value found was : -186.199

Note that when you run this example, your results may differ from the results shown here; the reason is explained in the Reproducing Your Results section later in this example.

How the Genetic Algorithm Works

The genetic algorithm works on a population, which is a set of points in the design space, by repeatedly applying a set of operators to it. The initial population is generated randomly by default. The next generation of the population is computed using the fitness of the individuals in the current generation.
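
The following toy loop illustrates the generational idea. It is a simplified sketch for intuition only, not the implementation ga actually uses; the selection, crossover, and mutation steps here are deliberately basic and are our own choices for illustration.

% Simplified sketch of a generational loop (illustration only; this is not
% how ga is implemented, and the operator choices here are arbitrary).
popSize = 10; nVars = 2; nGen = 20;
pop = rand(popSize,nVars);                      % random initial population
for gen = 1:nGen
    scores = zeros(popSize,1);
    for i = 1:popSize
        scores(i) = shufcn(pop(i,:));           % fitness of each individual
    end
    [~,idx] = sort(scores);                     % lower fitness is better
    parents = pop(idx(1:popSize/2),:);          % keep the fitter half
    nP = size(parents,1);
    kids = (parents(randperm(nP),:) + parents(randperm(nP),:))/2;  % crossover
    kids = kids + 0.05*randn(size(kids));       % mutation
    pop = [parents; kids];                      % next generation
end

In the real solver, each of these steps is an option you can configure, as shown later in this example.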

Adding Visualization

ga can accept one or more plot functions through an options argument. This feature is useful for visualizing the performance of the solver at run time. Plot functions can be selected using optimoptions.

Here we use optimoptions to select two plot functions. The first plot function is gaplotbestf, which plots the best and mean score of the population at every generation. The second plot function is gaplotstopping, which plots the percentage of stopping criteria satisfied.

opts = optimoptions(@ga,'PlotFcn',{@gaplotbestf,@gaplotstopping});

Run the ga solver.

[x,Fval,exitFlag,Output] = ...
    ga(FitnessFunction,numberOfVariables,[],[],[],[],[],[],[],opts);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.

Specifying Population Options

The default initial population is created using a uniform random number generator, with default values for the population size and for the range of the initial population.
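
To see what those default values are, you can create and display a default options object; the variable name here is just for illustration.

defaultOpts = optimoptions(@ga)   % display the default ga option values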

Specify a population size

The default population size used by ga is 50 when the number of decision variables is fewer than 5, and 200 otherwise. This may not be suitable for every problem; a smaller population size can be sufficient for smaller problems. Since we have only two variables, we specify a population size of 10. We will directly set the value of the option PopulationSize to 10 in our previously created options, opts.

opts.PopulationSize = 10;

Specify initial population range

The default method for generating an initial population uses a uniform random number generator. This creates an initial population where all the points are in the range 0 to 1. For example, a population of size 3 in a problem with two variables could look like:

Population = rand(3,2)
Population =

    0.0149    0.0858
    0.3852    0.9966
    0.3954    0.4020

The initial range can be set by changing the InitialPopulationRange option. The range must be a matrix with two rows. If the range has only one column, i.e., it is 2-by-1, then the range of every variable is the given range. For example, if we set the range to [-1; 1], then the initial range for both our variables is -1 to 1. To specify a different initial range for each variable, the range must be specified as a matrix with two rows and numberOfVariables columns. For example if we set the range to [-1 0; 1 2], then the first variable will be in the range -1 to 1, and the second variable will be in the range 0 to 2 (so each column corresponds to a variable).
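
For example, the 2-by-1 form would look like this (the next step replaces it with a per-variable range):

opts.InitialPopulationRange = [-1; 1];   % same range, -1 to 1, for every variable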

We will directly modify the value of the option InitialPopulationRange in our previously created options, opts.

opts.InitialPopulationRange = [-1 0; 1 2];

Run the ga solver.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables,[],[],[], ...
    [],[],[],[],opts);

fprintf('The number of generations was : %d\n', Output.generations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The number of generations was : 67
The number of function evaluations was : 680
The best function value found was : -179.987

Reproducing Your Results

By default, ga starts with a random initial population, which is created using the MATLAB® random number generators. The next generation is produced using ga operators that also use these same random number generators. Every time a random number is generated, the state of the random number generators changes. This means that, even if you do not change any options, you may get different results when you run ga again.

Here we run the solver twice to show this phenomenon.

Run the ga solver.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The best function value found was : -186.484

Run ga again.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The best function value found was : -185.867

The previous two runs of ga give different results because the states of the random number generators have changed from one run to the next.

If you know, before running ga, that you will want to reproduce your results, you can save the state of the default random number stream.

thestate = rng;

Run ga.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The best function value found was : -186.467

Reset the stream and rerun ga. The results are identical to the previous run.

rng(thestate);
[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The best function value found was : -186.467

However, you might not have known, before running ga, that you would want to reproduce the results. In that case, as long as you have the output structure, you can reset the random number generator as follows.

strm = RandStream.getGlobalStream;
strm.State = Output.rngstate.state;

Rerun ga. Again, the results are identical.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The best function value found was : -186.467

Modifying the Stopping Criteria

ga uses four different criteria to determine when to stop the solver. ga stops when the maximum number of generations is reached; by default this number is 100 times the number of variables. ga also detects whether there is no change in the best fitness value for some amount of time given in seconds (the maximum stall time), or for some number of generations (the maximum stall generations). Another criterion is the maximum time limit in seconds. Here we modify the stopping criteria to increase the maximum number of generations to 150 and the maximum stall generations to 100.

opts = optimoptions(opts,'MaxGenerations',150,'MaxStallGenerations', 100);
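
The time-based criteria can be set in the same way; for example, MaxTime and MaxStallTime (both in seconds) bound the total run time and the stall time. The line below is shown for illustration only and is not used in the run that follows.

optsTimed = optimoptions(opts,'MaxTime',30,'MaxStallTime',20);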

Run the ga solver again.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables,[],[],[], ...
    [],[],[],[],opts);

fprintf('The number of generations was : %d\n', Output.generations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: maximum number of generations exceeded.
The number of generations was : 150
The number of function evaluations was : 1510
The best function value found was : -186.692

Choosing ga Operators

ga starts with a random set of points in the population and uses operators to produce the next generation of the population. The different operators are scaling, selection, crossover, and mutation. The toolbox provides several functions to choose from for each operator. Here we choose fitscalingprop for FitnessScalingFcn and selectiontournament for SelectionFcn.

opts = optimoptions(@ga,'SelectionFcn',@selectiontournament, ...
                        'FitnessScalingFcn',@fitscalingprop);

Run the ga solver.

[x,Fval,exitFlag,Output] = ga(FitnessFunction,numberOfVariables,[],[],[], ...
    [],[],[],[],opts);

fprintf('The number of generations was : %d\n', Output.generations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
The number of generations was : 144
The number of function evaluations was : 7250
The best function value found was : -173.284

The best function value may improve or worsen when you choose different operators. Choosing a good set of operators for your problem is often best done by experimentation.
