
When you perform parameter estimation, the software formulates an optimization problem. The solution of this optimization problem is the set of estimated parameter values. The optimization problem consists of:

- *x* — *Design variables*. The model parameters and initial states to be estimated.
- *F*(*x*) — *Objective function*. A function that calculates a measure of the difference between the simulated and measured responses. Also called *cost function* or *estimation error*.
- (Optional) $$\underline{x}\le x\le \overline{x}$$ — *Bounds*. Limits on the estimated parameter values.
- (Optional) *C*(*x*) — *Constraint function*. A function that specifies restrictions on the design variables.

The optimization solver tunes the values of the design variables to satisfy the specified objectives and constraints. The exact formulation of the optimization depends on the optimization method that you use.

The software tunes the model parameters to obtain a simulated response (*y _{sim}*) that tracks the measured response or reference signal (*y _{ref}*).

The raw estimation error, *e*(*t*), is defined as:

$$e(t)={y}_{ref}(t)-{y}_{sim}(t)$$

*e*(*t*) is also referred
to as the *error residuals* or, simply, *residuals*.

Simulink^{®} Design Optimization™ software provides the following cost functions to process *e*(*t*):

| Cost Function | Formulation | Option Name in GUI or Command Line |
|---|---|---|
| Sum squared error (default) | $$F(x)={\displaystyle \sum _{t=0}^{{t}_{N}}e(t)\times e(t)}$$ | `'SSE'` |
| Sum absolute error | $$F(x)={\displaystyle \sum _{t=0}^{{t}_{N}}\vert e(t)\vert }$$ | `'SAE'` |
| Raw error | $$F(x)=\left[\begin{array}{c}e(0)\\ \vdots \\ e(N)\end{array}\right]$$ | `'Residuals'` (available only at the command line) |
| Custom function | N/A | Available only at the command line |
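As a concrete illustration of these cost functions (sketched here in Python with made-up response data; this is not the toolbox's implementation):

```python
# Hypothetical measured and simulated responses on a common time base.
y_ref = [0.0, 1.0, 1.8, 2.5, 3.0]   # measured response
y_sim = [0.1, 0.9, 2.0, 2.4, 3.2]   # simulated response

# Raw estimation error: e(t) = y_ref(t) - y_sim(t) (the 'Residuals' option)
e = [r - s for r, s in zip(y_ref, y_sim)]

sse = sum(ei * ei for ei in e)    # 'SSE' (default): sum squared error
sae = sum(abs(ei) for ei in e)    # 'SAE': sum absolute error
```

With these values, `sse` is 0.11 and `sae` is 0.7; the solver would tune the parameters to drive such a measure toward its minimum.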

The software evaluates the cost function for a specific time
interval. This interval is dependent on the *measured signal
time base* and the *simulated signal time base*.

The measured signal time base consists of all the time points for which the measured signal is specified. In case of multiple measured signals, this time base is the union of the time points of all the measured signals.

The simulated signal time base consists of all the time points for which the model is simulated.

If the model uses a variable-step solver, then the simulated signal time base can change from one optimization iteration to another. The simulated and measured signal time bases can be different. The software evaluates the cost function for only the time interval that is common to both. By default, the software uses only the time points specified by the measured signal in the common time interval.
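The selection of the common evaluation points can be sketched as follows (Python, with hypothetical time bases; the toolbox performs this internally):

```python
# Hypothetical time bases, in seconds.
t_meas = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]    # measured signal time base
t_sim = [i * 0.1 for i in range(21)]       # simulated time base, 0 to 2.0

# The cost function is evaluated only on the interval common to both.
t_lo = max(min(t_meas), min(t_sim))
t_hi = min(max(t_meas), max(t_sim))

# By default, only measured time points inside the common interval are used.
t_eval = [t for t in t_meas if t_lo <= t <= t_hi]
```

Here the measured point at 2.5 s falls outside the simulated range, so `t_eval` contains only the measured points from 0 to 2.0 s.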

In the GUI, you can specify the simulation start and stop times in the **Simulation time** area of the **Simulation Options** dialog box.

At the command line, the software sets the simulation stop time to the last point of the measured signal time base. For example, the following code simulates the model until the end time of the longest running output signal of `exp`, an `sdo.Experiment` object:

```
sim_obj = createSimulator(exp);
sim_obj = sim(sim_obj);
```

`sim_obj` contains the simulated response for the model associated with `exp`.

You can specify bounds for the design variables (estimated model parameters), based on your knowledge of the system. Bounds are expressed as:

$$\underline{x}\le x\le \overline{x}$$

$$\underline{x}$$ and $$\overline{x}$$ are the lower and upper bounds for the design variables.

For example, in a battery discharging experiment, the estimated
battery initial charge must be greater than zero and less than `Inf`.
These bounds are expressed as:

$$0<x<\infty $$

For an example of how to specify these types of bounds, see Estimate Model Parameters and Initial States (Code).
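A bounded estimation of this kind can be sketched with SciPy's `least_squares` (an analogy to the toolbox workflow, using a made-up exponential-discharge model and data):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical battery-discharge model: charge decays from q0 at rate k.
t = np.linspace(0.0, 10.0, 50)
q0_true, k_true = 5.0, 0.3
y_ref = q0_true * np.exp(-k_true * t)      # "measured" reference signal

def residuals(x):
    q0, k = x
    y_sim = q0 * np.exp(-k * t)
    return y_ref - y_sim                   # e(t) = y_ref(t) - y_sim(t)

# Bounds 0 < q0 < inf and 0 < k < inf, passed as closed solver bounds.
res = least_squares(residuals, x0=[1.0, 1.0],
                    bounds=([0.0, 0.0], [np.inf, np.inf]))
# res.x recovers values close to [5.0, 0.3]
```

The solver never steps outside the bounds, so a physically meaningless negative initial charge is ruled out by construction.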

You can also specify other constraints, *C*(*x*),
on the design variables at the command line. *C*(*x*)
can be linear or nonlinear and can describe equalities or inequalities. *C*(*x*)
can also specify multiparameter constraints. For example, for a simple
friction model, *C*(*x*) can specify
that the static friction coefficient must be greater than or equal
to the dynamic friction coefficient. One way of expressing this constraint
is:

$$\begin{array}{l}C(x):{x}_{1}-{x}_{2}\\ C(x)\le 0\end{array}$$

*x*_{1} and *x*_{2} are
the dynamic and static friction coefficients, respectively.

For an example of how to specify a constraint, see Estimate Model Parameters with Parameter Constraints (Code).
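The friction-coefficient constraint above can be sketched with SciPy's `minimize` (an analogy, with a made-up quadratic cost; note that SciPy's `'ineq'` convention is `fun(x) >= 0`, the opposite sign of *C*(*x*) ≤ 0):

```python
from scipy.optimize import minimize

# Hypothetical cost pulling toward dynamic=0.4, static=0.3, which would
# violate the requirement static >= dynamic if left unconstrained.
target = (0.4, 0.3)   # (dynamic, static) unconstrained best fit

def cost(x):
    return (x[0] - target[0]) ** 2 + (x[1] - target[1]) ** 2

# C(x) = x1 - x2 <= 0 becomes x2 - x1 >= 0 in SciPy's 'ineq' convention.
cons = [{"type": "ineq", "fun": lambda x: x[1] - x[0]}]

res = minimize(cost, x0=[0.2, 0.5], constraints=cons)
# The constrained optimum lands on the boundary x1 == x2, near [0.35, 0.35]
```

The optimizer trades off the two coefficients until the constraint is active, rather than returning the infeasible unconstrained minimum.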

An optimization problem can be one of the following types:

- Minimization problem — Minimizes an objective function, *F*(*x*). You specify the measured signal that you want the model output to track. You can optionally specify bounds for the estimated parameters.
- Mixed minimization and feasibility problem — Minimizes an objective function, *F*(*x*), subject to specified bounds and constraints, *C*(*x*). You specify the measured signal that you want the model to track and bounds and constraints for the estimated parameters.
- Feasibility problem — Finds a solution that satisfies the specified constraints, *C*(*x*). You specify only bounds and constraints for the estimated parameters. This type of problem is not common in parameter estimation.

The optimization method that you specify determines the formulation of the estimation problem. The software provides the following optimization methods:

| Optimization Method Name | Description | Optimization Problem Formulation |
|---|---|---|
| User interface: **Nonlinear Least Squares**; command line: `'lsqnonlin'` | Minimizes the squares of the residuals; recommended method for parameter estimation. This method requires a vector of error residuals computed using a fixed time base. Do not use this approach if you have a scalar cost function or if the number of error residuals can change from one iteration to another. This method uses the Optimization Toolbox™ function `lsqnonlin`. | $$\underset{x}{\mathrm{min}}{\displaystyle \sum _{t=0}^{{t}_{N}}e(t)^2}\text{ subject to }\underline{x}\le x\le \overline{x}$$ |
| User interface: **Gradient Descent**; command line: `'fmincon'` | General nonlinear solver that uses the cost function gradient. Use this approach if you want to specify one or any combination of the following: custom cost functions, parameter-based constraints, signal-based constraints. This method uses the Optimization Toolbox function `fmincon`. For information on how the gradient is computed, see Gradient Computations. | $$\underset{x}{\mathrm{min}}F(x)\text{ subject to }C(x)\le 0,\text{ }\underline{x}\le x\le \overline{x}$$ |
| User interface: **Simplex Search**; command line: `'fminsearch'` | Based on the Nelder-Mead algorithm, this approach does not use the cost function gradient. Use this approach if your cost function or constraints are not continuous or differentiable. This method uses the Optimization Toolbox functions `fminsearch` and `fminbnd`. | $$\underset{x}{\mathrm{min}}F(x)$$ |
| User interface: **Pattern Search**; command line: `'patternsearch'` | Direct search method based on the generalized pattern search algorithm; does not use the cost function gradient. Use this approach if your cost function or constraints are not continuous or differentiable. This method uses the Global Optimization Toolbox function `patternsearch`. | $$\underset{x}{\mathrm{min}}F(x)\text{ subject to }C(x)\le 0,\text{ }\underline{x}\le x\le \overline{x}$$ |
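For instance, a gradient-free simplex search suits a non-differentiable cost such as the sum absolute error. A sketch using SciPy's Nelder-Mead implementation (an analogy to `fminsearch`, with made-up data; not the toolbox code):

```python
import numpy as np
from scipy.optimize import minimize

# Sum-absolute-error cost is not differentiable everywhere, so a
# gradient-free simplex search is a reasonable choice.
t = np.linspace(0.0, 5.0, 30)
y_ref = 2.0 * np.exp(-0.5 * t)            # made-up reference signal

def sae_cost(x):
    y_sim = x[0] * np.exp(-x[1] * t)
    return np.sum(np.abs(y_ref - y_sim))  # 'SAE' cost F(x)

res = minimize(sae_cost, x0=[1.0, 1.0], method="Nelder-Mead")
```

The simplex search only compares cost values at trial points, so the kinks introduced by the absolute value do not require any gradient information.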

`fminbnd` | `fmincon` | `fminsearch` | `lsqnonlin` | `patternsearch` | `sdo.Experiment` | `sdo.requirements.SignalTracking` | `sdo.SimulationTest`

- Estimate Model Parameter Values (Code)
- Estimate Model Parameters with Parameter Constraints (Code)
- Estimate Parameters from Measured Data
