
optiSLang uses the NLPQLP implementation from the NLPQL algorithm suite. This version of NLPQL is specifically tuned to run on distributed systems.
Further information about methods of multidisciplinary optimization used in optiSLang can be found here.
Initialization Options
To access the options shown in the following table, double-click the NLPQL system on the Scenery pane and switch to the NLPQL tab.
Option | Description
---|---
**Accuracy** |
Desired accuracy | Tolerance within which the Karush-Kuhn-Tucker optimality conditions are considered satisfied. If the given tolerance is smaller than the accuracy of the function values and gradients, NLPQL may converge slowly or not at all. Check that the given tolerance is sufficiently small compared with the initial gradients.
Differentiation scheme | Method for computing numerical gradients (see the first sketch following this table). The higher the order of accuracy, the more accurate the approximation of the numerical derivatives. On the other hand, higher-order schemes may lead to a less robust iteration in the presence of large noise and/or discontinuities.
Differentiation step size | Size of the differentiation interval, given as a relative value: the interval length as a percentage of the parameter bounds. Decreasing the differentiation step size generally yields a more accurate approximation of the gradient.
**Computational aspects** |
Maximum number of solver runs | Maximum allowed number of solver runs. The iteration terminates once this number of solver runs is reached.
Number of parallel line searches | Number L of parallel solver runs in the line search (see the second sketch following this table). Set L = 1 to request a classical iterative line search; L > 1 enables a parallel line search.
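The differentiation options control how gradients are approximated from additional solver runs. The following Python sketch illustrates the idea of a relative differentiation step size (a percentage of the parameter range) and of forward versus central differencing. The names (`numerical_gradient`, `objective`, `step_percent`, `scheme`) are illustrative assumptions, not part of the optiSLang API, and the code is not the actual NLPQLP implementation.

```python
# Minimal sketch of numerical differentiation as configured by the
# "Differentiation scheme" and "Differentiation step size" options.
# All names are illustrative; this is not optiSLang code.
import numpy as np

def numerical_gradient(objective, x, lower, upper, step_percent=0.1, scheme="central"):
    """Approximate the gradient of `objective` at `x`.

    The differential interval is relative: `step_percent` percent of the
    parameter range (upper - lower), analogous to the table entry above.
    """
    x = np.asarray(x, dtype=float)
    h = (np.asarray(upper, dtype=float) - np.asarray(lower, dtype=float)) * step_percent / 100.0
    grad = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h[i]
        if scheme == "central":
            # Second-order accurate, two extra solver runs per parameter.
            grad[i] = (objective(x + e) - objective(x - e)) / (2.0 * h[i])
        else:
            # Forward differences: first-order, one extra solver run per parameter.
            grad[i] = (objective(x + e) - objective(x)) / h[i]
    return grad

# Example: smaller steps and higher-order schemes approximate smooth responses
# more accurately, but amplify the effect of solver noise.
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
print(numerical_gradient(f, [0.0, 0.0], lower=[-2, -2], upper=[2, 2]))
```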
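A parallel line search (L > 1) evaluates several trial step lengths along the current search direction at once instead of one after another, which is what makes NLPQLP suitable for distributed execution. The sketch below shows the general pattern under simplifying assumptions: a thread pool stands in for parallel solver runs, the step-length schedule is arbitrary, and all names (`parallel_line_search`, `objective`, `direction`) are hypothetical rather than optiSLang API.

```python
# Minimal sketch of a parallel line search: L candidate step lengths along
# the search direction are evaluated concurrently and the best one is kept.
# This is an illustration of the concept, not the NLPQLP algorithm itself.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def parallel_line_search(objective, x, direction, L=4, max_step=1.0):
    """Evaluate L trial points x + t_k * direction concurrently and
    return the step length with the lowest objective value."""
    steps = max_step * 0.5 ** np.arange(L)              # e.g. 1, 1/2, 1/4, 1/8
    trials = [np.asarray(x, dtype=float) + t * np.asarray(direction, dtype=float) for t in steps]
    # Each trial point corresponds to one solver run; with L > 1 these runs
    # can be dispatched simultaneously, e.g. on a distributed system.
    with ThreadPoolExecutor(max_workers=L) as pool:
        values = list(pool.map(objective, trials))
    k = int(np.argmin(values))
    return float(steps[k]), float(values[k])

# Example with a cheap analytic objective (a real solver run would be expensive).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
step, value = parallel_line_search(f, x=[0.0, 0.0], direction=[1.0, -0.5], L=4)
print(step, value)
```

With L = 1 the same loop degenerates to the classical iterative line search, where one trial step is evaluated, checked, and only then is the next one started.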
Additional Options
This algorithm supports Additional Options.