You can set several options for the optimization, including the optimization method and the tolerances the method uses.
To set optimization options, click Options in the Design Optimization tool, and then select the Optimization Options tab in the window that opens.
Both the Method and Algorithm options in the Optimization method area define the optimization method.
The choices for the Method option are:
Gradient descent (default) — Uses the Optimization Toolbox™ function fmincon to optimize the response signal subject to the constraints.
The Algorithm options for Gradient descent are:
| Algorithm Option | Learn More |
| --- | --- |
| Sequential Quadratic Programming (default) | fmincon SQP Algorithm in the Optimization Toolbox documentation |
| Active-Set | fmincon Active Set Algorithm in the Optimization Toolbox documentation |
| Interior-Point | fmincon Interior Point Algorithm in the Optimization Toolbox documentation |
| Trust-Region-Reflective | fmincon Trust Region Reflective Algorithm in the Optimization Toolbox documentation |
Pattern search — Uses the Global Optimization Toolbox function patternsearch, an advanced direct search method, to optimize the response. This option requires the Global Optimization Toolbox.
Simplex search — Uses the Optimization Toolbox function fminsearch, a direct search method, to optimize the response. Simplex search is most useful for simple problems and is sometimes faster than Gradient descent for models that contain discontinuities.
For more information on the problem formulations for each optimization method, see How the Optimization Algorithm Formulates Minimization Problems.
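The practical difference between a gradient-based method and a direct search can be sketched with analogous solvers from SciPy. This is an illustration only, not the Design Optimization tool itself (which calls fmincon, fminsearch, or patternsearch): SLSQP is a sequential quadratic programming solver similar in spirit to the Gradient descent SQP algorithm, and Nelder-Mead is a simplex direct search similar in spirit to Simplex search.

```python
# Illustrative sketch only: SciPy analogues of the methods above, not the
# Design Optimization tool's own solvers.
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

# Inequality constraint x[0] + x[1] <= 3, written as g(x) >= 0 for SciPy.
constraints = [{"type": "ineq", "fun": lambda x: 3.0 - x[0] - x[1]}]

# Gradient-based, constrained: analogous in spirit to Gradient descent / SQP.
sqp = minimize(objective, x0=[0.0, 0.0], method="SLSQP",
               constraints=constraints)

# Derivative-free direct search: analogous in spirit to Simplex search.
# It tolerates the step discontinuity at x = 0 because it never
# evaluates a gradient.
def discontinuous(x):
    return (x[0] - 2.0) ** 2 + (1.0 if x[0] < 0 else 0.0)

simplex = minimize(discontinuous, x0=[-3.0], method="Nelder-Mead")

print(sqp.x)      # constrained minimum lies on the boundary x[0] + x[1] = 3
print(simplex.x)  # direct search walks through the discontinuity toward x = 2
```

The constrained problem's minimum sits on the constraint boundary, which is exactly the situation the constraint tolerance and feasibility options below are concerned with.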
Use the Optimization options panel to specify when you want the optimization to terminate.
Parameter tolerance: The optimization terminates when successive parameter values change by less than this number. For more details, refer to the discussion of the parameter TolX in the reference page for the Optimization Toolbox function fmincon.
Constraint tolerance: The maximum amount by which the constraints can be violated while the optimization still converges successfully.
Function tolerance: The optimization terminates when the change between successive function values is less than this value. Changing the default Function tolerance value is only useful when you are tracking a reference signal or using the Simplex search method. For more details, refer to the discussion of the parameter TolFun in the reference page for the Optimization Toolbox function fmincon.
Maximum iterations: The maximum number of iterations allowed. The optimization terminates when the number of iterations exceeds this number.
Look for maximally feasible solution: When selected, the optimization continues after it has found an initial, feasible solution, until it finds a maximally feasible, optimal solution. When this option is cleared, the optimization terminates as soon as it finds a solution that satisfies the constraints; the resulting response signal can lie very close to the constraint segment. In contrast, a maximally feasible solution typically lies further inside the constraint region.
By varying these parameters you can force the optimization to continue searching for a solution or to continue searching for a more accurate solution.
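Most optimizers expose termination controls of exactly this kind. As an illustrative sketch only (using SciPy's Nelder-Mead solver rather than the Design Optimization tool), the parameter tolerance, function tolerance, and iteration limit map roughly to the xatol, fatol, and maxiter options:

```python
# Illustrative only: termination controls analogous to Parameter tolerance
# (xatol), Function tolerance (fatol), and Maximum iterations (maxiter),
# shown with SciPy's Nelder-Mead solver, not the Design Optimization tool.
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 3.0) ** 2

result = minimize(
    objective, x0=[0.0], method="Nelder-Mead",
    options={
        "xatol": 1e-6,   # stop when parameter changes fall below this
        "fatol": 1e-6,   # stop when function-value changes fall below this
        "maxiter": 500,  # hard cap on the number of iterations
    },
)
print(result.x, result.nit)
```

Tightening the tolerances makes the solver iterate longer for a more accurate answer; loosening them, or lowering the iteration cap, makes it stop sooner.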
At the bottom of the Optimization Options panel is a group of additional optimization options.
The Display level option specifies the form of the output that appears in the Optimization Progress window. The options are:
Iterations (default) — Displays information after each iteration
Off — Turns off all output display
Notify — Displays output only if the function does not converge
Final — Displays only the final output
For more information on the type of iterative output that appears for the method you selected in Method, see the discussion of output for the corresponding function.
| Method | Function | Learn More |
| --- | --- | --- |
| Gradient descent | fmincon | fmincon section of Function-Specific Headings in the Optimization Toolbox documentation |
| Simplex search | fminsearch | fminsearch section of Function-Specific Headings in the Optimization Toolbox documentation |
| Pattern search | patternsearch | Display to Command Window Options in the Global Optimization Toolbox documentation |
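Many optimization libraries offer a similar display switch. As a rough sketch only (SciPy rather than the Design Optimization tool), the per-solver disp option plays the role of Display level, printing a convergence summary when enabled:

```python
# Illustrative only: SciPy's rough analogue of the Display level option is
# the per-solver 'disp' flag, which prints a convergence summary.
from scipy.optimize import minimize

result = minimize(lambda x: (x[0] + 1.0) ** 2, x0=[5.0],
                  method="Nelder-Mead", options={"disp": True})
```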
In some optimizations, the Hessian may become ill-conditioned and the optimization fails to converge. In these cases, it can help to restart the optimization after it stops, using the endpoint of the previous optimization as the starting point for the next one. To restart the optimization automatically, specify the number of restarts in this field.
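The restart idea can be sketched as a plain loop, shown here with SciPy rather than the Design Optimization tool: each run resumes from the previous run's endpoint, which can recover progress when a single run stops before converging.

```python
# Illustrative only: a manual restart loop in the spirit of automatic
# restarts -- each run starts from the previous run's endpoint.
from scipy.optimize import minimize

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

x0 = [-1.2, 1.0]
for attempt in range(3):  # number of restarts
    result = minimize(rosenbrock, x0, method="Nelder-Mead",
                      options={"maxiter": 60})  # deliberately small budget
    x0 = result.x  # restart from the previous endpoint
print(result.x)
```

Even though each individual run is capped at a small iteration budget, the chained runs together drive the objective close to its minimum.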