# Documentation

## Optimization of Stochastic Objective Function

This example shows how to find a minimum of a stochastic objective function using the PATTERNSEARCH function in the Global Optimization Toolbox. It also shows why the Optimization Toolbox™ functions are not suitable for this kind of problem. A simple two-dimensional optimization problem is selected for this example to help visualize the objective function.

### Initialization

```matlab
X0 = [2.5 -2.5];     % Starting point
LB = [-5 -5];        % Lower bound
UB = [5 5];          % Upper bound
range = [LB(1) UB(1); LB(2) UB(2)];
Objfcn = @smoothFcn; % Handle to the objective function

% Plot the smooth objective function
fig = figure('Color','w');
showSmoothFcn(Objfcn,range);
hold on;
title('Smooth objective function');
ph = [];
ph(1) = plot3(X0(1),X0(2),Objfcn(X0)+30,'or','MarkerSize',10,'MarkerFaceColor','r');
hold off;
ax = gca;
ax.CameraPosition = [-31.0391 -85.2792 -281.4265];
ax.CameraTarget = [0 0 -50];
ax.CameraViewAngle = 6.7937;

% Add legend information
legendLabels = {'Start point'};
lh = legend(ph,legendLabels,'Location','SouthEast');
lp = lh.Position;
lh.Position = [1-lp(3)-0.005 0.005 lp(3) lp(4)];
```

### Run FMINCON on a Smooth Objective Function

The objective function is smooth (twice continuously differentiable). We solve the optimization problem using the FMINCON function from the Optimization Toolbox, which finds a constrained minimum of a function of several variables. This function has a unique minimum at the point x* = (-5,-5), where it has a function value f(x*) = -250.

```matlab
% Set options to display iterative results.
options = optimoptions(@fmincon,'Algorithm','active-set','Display','iter');
[Xop,Fop] = fmincon(Objfcn,X0,[],[],[],[],LB,UB,[],options)
figure(fig);
hold on;
% Plot the final point
ph(2) = plot3(Xop(1),Xop(2),Fop,'dm','MarkerSize',10,'MarkerFaceColor','m');
% Add legend to plot
legendLabels = [legendLabels, 'FMINCON solution'];
lh = legend(ph,legendLabels,'Location','SouthEast');
lp = lh.Position;
lh.Position = [1-lp(3)-0.005 0.005 lp(3) lp(4)];
hold off;
```
```
                                Max     Line search  Directional  First-order
 Iter F-count        f(x)   constraint   steplength   derivative   optimality Procedure
    0      3      -10.625         -2.5
    1      6         -250            0            1        -23.4         82.1

Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in
feasible directions, to within the default value of the function tolerance,
and constraints are satisfied to within the default value of the constraint
tolerance.

Active inequalities (to within options.TolCon = 1e-06):
  lower      upper     ineqlin   ineqnonlin
     1
     2

Xop =

    -5    -5


Fop =

  -250
```
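The behavior FMINCON shows here (a smooth descent that stops on the active bound constraints) can be sketched in a few lines with a projected gradient method. The quadratic below is a hypothetical stand-in for `smoothFcn`, not the example's actual objective; its unconstrained minimum lies outside the bounds, so the constrained minimum sits on the lower-bound corner, just as in the example:

```python
# Minimal projected gradient descent for a smooth, bound-constrained problem.
# The quadratic objective below is a hypothetical stand-in for smoothFcn.

def project(x, lb, ub):
    """Clamp each coordinate into its [lb, ub] interval."""
    return [min(max(xi, l), u) for xi, l, u in zip(x, lb, ub)]

def projected_gradient(grad, x0, lb, ub, lr=0.1, iters=1000):
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = project([xi - lr * gi for xi, gi in zip(x, g)], lb, ub)
    return x

# Smooth bowl centered at (-6, -6), outside the box [-5, 5]^2,
# so the constrained minimizer is the corner (-5, -5).
grad = lambda x: [2 * (x[0] + 6), 2 * (x[1] + 6)]
xopt = projected_gradient(grad, [2.5, -2.5], [-5, -5], [5, 5])
# xopt == [-5.0, -5.0]: the iteration stops on the active lower bounds.
```

This mirrors the FMINCON run above only in outline: a derivative-driven method converges cleanly when the objective is smooth.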

### Stochastic Objective Function

The objective function is now the same as in the previous section, but with some random noise added to it. This is done by adding a random component to each function value.

```matlab
rng(0,'twister') % Reset the global random number generator
peaknoise = 4.5;
Objfcn = @(x) smoothFcn(x,peaknoise); % Handle to the objective function

% Plot the objective function (non-smooth)
fig = figure('Color','w');
showSmoothFcn(Objfcn,range);
title('Stochastic objective function')
ax = gca;
ax.CameraPosition = [-31.0391 -85.2792 -281.4265];
ax.CameraTarget = [0 0 -50];
ax.CameraViewAngle = 6.7937;
```

### Run FMINCON on a Stochastic Objective Function

The objective function is stochastic and not smooth. FMINCON is a general constrained optimization solver that finds a local minimum using the first derivative of the objective function. If the derivative is not provided, FMINCON uses finite differences to approximate it. In this example, the objective function contains random noise, so the finite-difference derivative estimates can be highly unreliable. FMINCON can therefore stop at a point that is not a minimum, either because the optimality conditions appear to be satisfied at the final point due to noise, or because the solver cannot make any progress.
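This failure mode is easy to quantify: a forward difference divides the noise by the step size h, so even tiny noise produces an enormous derivative error. A minimal sketch (the deterministic ±0.005 "noise" below is a worst-case stand-in for the example's random noise):

```python
def fd_derivative(f, x, h=1e-6):
    # Forward finite difference, as a derivative-based solver would form it.
    return (f(x + h) - f(x)) / h

def smooth(x):
    return x * x            # exact derivative at x = 3 is 6

calls = [0]
def noisy(x):
    # Worst-case deterministic noise of amplitude 0.005 per evaluation,
    # alternating sign between successive calls.
    calls[0] += 1
    return smooth(x) + (0.005 if calls[0] % 2 else -0.005)

clean_slope = fd_derivative(smooth, 3.0)   # close to the true slope 6
noisy_slope = fd_derivative(noisy, 3.0)    # off by roughly 0.01 / 1e-6 = 1e4
```

The noise term is amplified by 1/h, which is why the FMINCON run below reports directional derivatives on the order of 1e+08.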

```matlab
options = optimoptions(@fmincon,'Algorithm','active-set','Display','iter');
[Xop,Fop] = fmincon(Objfcn,X0,[],[],[],[],LB,UB,[],options)
figure(fig);
hold on;
ph = [];
ph(1) = plot3(X0(1),X0(2),Objfcn(X0)+30,'or','MarkerSize',10,'MarkerFaceColor','r');
ph(2) = plot3(Xop(1),Xop(2),Fop,'dm','MarkerSize',10,'MarkerFaceColor','m');
% Add legend to plot
legendLabels = {'Start point','FMINCON solution'};
lh = legend(ph,legendLabels,'Location','SouthEast');
lp = lh.Position;
lh.Position = [1-lp(3)-0.005 0.005 lp(3) lp(4)];
hold off;
```
```
                                Max     Line search  Directional  First-order
 Iter F-count        f(x)   constraint   steplength   derivative   optimality Procedure
    0      3     -19.2577         -2.5
    1     12     -19.6222       -2.617       0.0156    -2.51e+08     1.59e+09
    2     15     -53.5746            0            1    -5.47e+08     4.63e+08   Hessian modified twice

Local minimum found that satisfies the constraints.

Optimization completed because the objective function is non-decreasing in
feasible directions, to within the default value of the function tolerance,
and constraints are satisfied to within the default value of the constraint
tolerance.

Active inequalities (to within options.TolCon = 1e-06):
  lower      upper     ineqlin   ineqnonlin
     1          2

Xop =

    -5     5


Fop =

  -53.5746
```

### Run PATTERNSEARCH

We now use PATTERNSEARCH from the Global Optimization Toolbox. Pattern search optimization techniques are a class of direct search methods for optimization. A pattern search algorithm does not require any derivative information about the objective function to find an optimal point.

```matlab
PSoptions = psoptimset('Display','iter');
[Xps,Fps] = patternsearch(Objfcn,X0,[],[],[],[],LB,UB,PSoptions)
figure(fig);
hold on;
ph(3) = plot3(Xps(1),Xps(2),Fps,'dc','MarkerSize',10,'MarkerFaceColor','c');
% Add legend to plot
legendLabels = [legendLabels, 'Pattern Search solution'];
lh = legend(ph,legendLabels,'Location','SouthEast');
lp = lh.Position;
lh.Position = [1-lp(3)-0.005 0.005 lp(3) lp(4)];
hold off
```
```
Iter     f-count          f(x)      MeshSize     Method
    0        1          -9.82132         1
    1        3          -45.8992         2     Successful Poll
    2        3          -45.8992         1     Refine Mesh
    3        5          -45.8992         0.5   Refine Mesh
    4        8          -88.4064         1     Successful Poll
    5       10          -88.4064         0.5   Refine Mesh
    6       13          -136.777         1     Successful Poll
    7       15          -136.777         0.5   Refine Mesh
    8       17          -136.777         0.25  Refine Mesh
    9       20          -190.615         0.5   Successful Poll
   10       22          -190.615         0.25  Refine Mesh
   11       24          -190.615         0.125 Refine Mesh
   12       27          -233.545         0.25  Successful Poll
   13       29          -241.605         0.5   Successful Poll
   14       31          -241.605         0.25  Refine Mesh
   15       33          -250.17          0.5   Successful Poll
   16       35          -250.17          0.25  Refine Mesh
   17       37          -250.17          0.125 Refine Mesh
   18       39          -250.17          0.0625     Refine Mesh
   19       41          -250.17          0.03125    Refine Mesh
   20       43          -250.17          0.01562    Refine Mesh
   21       44          -254.925         0.03125    Successful Poll
   22       46          -254.925         0.01562    Refine Mesh
   23       47          -256.57          0.03125    Successful Poll
   24       49          -261.07          0.0625     Successful Poll
   25       51          -261.07          0.03125    Refine Mesh
   26       53          -261.07          0.01562    Refine Mesh
   27       55          -261.07          0.007812   Refine Mesh
   28       57          -261.07          0.003906   Refine Mesh
   29       59          -261.07          0.001953   Refine Mesh
   30       61          -261.07          0.0009766  Refine Mesh
   31       63          -261.07          0.0004883  Refine Mesh
   32       65          -261.07          0.0002441  Refine Mesh
   33       66          -262.065         0.0004883  Successful Poll
   34       68          -262.065         0.0002441  Refine Mesh
   35       71          -262.065         0.0001221  Refine Mesh
   36       74          -262.065         6.104e-05  Refine Mesh
   37       77          -262.065         3.052e-05  Refine Mesh
   38       80          -262.065         1.526e-05  Refine Mesh
   39       83          -262.065         7.629e-06  Refine Mesh
   40       86          -262.065         3.815e-06  Refine Mesh
   41       89          -262.065         1.907e-06  Refine Mesh
   42       92          -262.065         9.537e-07  Refine Mesh
Optimization terminated: mesh size less than options.TolMesh.

Xps =

   -4.9998   -5.0000


Fps =

 -262.0651
```

The pattern search algorithm is not affected by random noise in the objective function. Pattern search requires only function values, not derivatives, so noise (of some uniform kind) may not affect it. However, pattern search typically requires more function evaluations to find the true minimum than derivative-based algorithms; this is the cost of not using derivatives.
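The poll-and-refine loop visible in the iteration display above can be sketched in a few lines. This is a bare-bones compass search (poll ±mesh along each coordinate axis, expand the mesh on a successful poll, refine it on failure), not the toolbox implementation, and the test bowl is a stand-in objective rather than `smoothFcn`:

```python
def compass_search(f, x0, mesh=1.0, tol=1e-6, expand=2.0, shrink=0.5):
    """Minimal pattern (compass) search: derivative-free, poll-based."""
    x, fx = list(x0), f(x0)
    while mesh > tol:
        improved = False
        for i in range(len(x)):
            for step in (mesh, -mesh):
                y = list(x)
                y[i] += step          # poll point: x +/- mesh * e_i
                fy = f(y)
                if fy < fx:           # successful poll
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        # Expand the mesh after a successful poll, otherwise refine it.
        mesh *= expand if improved else shrink
    return x, fx

# A smooth test bowl with minimum at (1, -2).
f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
xbest, fbest = compass_search(f, [0.0, 0.0])
# Converges to (1, -2) without ever evaluating a derivative.
```

Because only comparisons of function values drive the search, a poll step misled by noise is usually corrected by later polls on a finer mesh, which is why the toolbox run above still reaches the neighborhood of (-5, -5).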