Solves the unconstrained minimization problem for an arbitrary nonlinear function. You must manually select the polymorphic instance to use.



For functions that are smooth and have defined first and second derivatives, the Broyden Quasi-Newton algorithm typically converges the fastest. If the Broyden Quasi-Newton algorithm fails to converge, the Conjugate Gradient algorithm might be able to solve the problem. The Downhill Simplex algorithm relies only on function evaluations and is often able to find a solution when the function is not smooth and the other algorithms fail to converge.
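The trade-off described above can be sketched outside LabVIEW with SciPy, whose optimizers are close analogues of the three instances: `BFGS` (a Broyden-family quasi-Newton method), `CG` (conjugate gradient), and `Nelder-Mead` (downhill simplex). The mapping and the choice of the Rosenbrock test function are illustrative assumptions, not part of the LabVIEW implementation.

```python
import numpy as np
from scipy.optimize import minimize, rosen

# Starting point for the 2-D Rosenbrock function (minimum at [1, 1]).
x0 = np.array([-1.2, 1.0])

# Assumed mapping to the LabVIEW instances:
#   BFGS        ~ Broyden Quasi-Newton
#   CG          ~ Conjugate Gradient
#   Nelder-Mead ~ Downhill Simplex (derivative-free)
results = {method: minimize(rosen, x0, method=method)
           for method in ("BFGS", "CG", "Nelder-Mead")}

for method, res in results.items():
    print(f"{method:12s} x* = {res.x}  function evaluations = {res.nfev}")
```

All three methods reach the minimum at (1, 1) for this smooth function; the derivative-free Nelder-Mead method typically needs more function evaluations, but it remains usable when the objective is not smooth.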

Examples

Refer to the following example file included with LabVIEW.

  • labview\examples\Mathematics\Optimization\Optimize Extended Rosenbrock.vi