diff scripts/optimization/optimset.m @ 13797:5289d7c2460d stable
optimset.m: Document valid parameters for optimization functions.
* optimset.m: Document valid parameters for optimization functions.
author    Carnë Draug <carandraug+dev@gmail.com>
date      Wed, 02 Nov 2011 20:50:20 -0700
parents   fd0a3ac60b0e
children  519390f1b67f
--- a/scripts/optimization/optimset.m
+++ b/scripts/optimization/optimset.m
@@ -23,6 +23,57 @@
 ## @deftypefnx {Function File} {} optimset (@var{old}, @var{par}, @var{val}, @dots{})
 ## @deftypefnx {Function File} {} optimset (@var{old}, @var{new})
 ## Create options struct for optimization functions.
+##
+## Valid parameters are:
+## @itemize @bullet
+## @item AutoScaling
+##
+## @item ComplexEqn
+##
+## @item FinDiffType
+##
+## @item FunValCheck
+## When enabled, display an error if the objective function returns a complex
+## value or NaN@. Must be set to "on" or "off" [default].
+##
+## @item GradObj
+## When set to "on", the function to be minimized must return a second argument
+## which is the gradient, or first derivative, of the function at the point
+## @var{x}. If set to "off" [default], the gradient is computed via finite
+## differences.
+##
+## @item Jacobian
+## When set to "on", the function to be minimized must return a second argument
+## which is the Jacobian, or first derivative, of the function at the point
+## @var{x}. If set to "off" [default], the Jacobian is computed via finite
+## differences.
+##
+## @item MaxFunEvals
+## Maximum number of function evaluations before optimization stops.
+## Must be a positive integer.
+##
+## @item MaxIter
+## Maximum number of algorithm iterations before optimization stops.
+## Must be a positive integer.
+##
+## @item OutputFcn
+## A user-defined function executed once per algorithm iteration.
+##
+## @item TolFun
+## Termination criterion for the function output. If the difference in the
+## calculated objective function between one algorithm iteration and the next
+## is less than @code{TolFun} the optimization stops. Must be a positive
+## scalar.
+##
+## @item TolX
+## Termination criterion for the function input. If the difference in @var{x},
+## the current search point, between one algorithm iteration and the next is
+## less than @code{TolX} the optimization stops. Must be a positive scalar.
+##
+## @item TypicalX
+##
+## @item Updating
+## @end itemize
 ## @end deftypefn

 function retval = optimset (varargin)
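
The patch above only documents the parameters. As an illustration of how such an options struct is typically consumed (not part of the changeset; the choice of fminunc, the objective, and the starting point are assumptions for the example), a sketch of the common pattern:

## Illustrative sketch, not from the changeset: build an options struct with
## some of the documented parameters and pass it to a solver that honors them.
opts = optimset ("TolFun", 1e-8, "TolX", 1e-8, ...
                 "MaxIter", 400, "MaxFunEvals", 1000);
f = @(x) (x(1) - 1)^2 + 4*(x(2) + 2)^2;    # simple smooth test objective
[xmin, fval, info] = fminunc (f, [0; 0], opts);
## xmin should approach [1; -2]; a positive info means one of the
## convergence criteria (e.g. TolFun or TolX) was met before the
## MaxIter/MaxFunEvals budget was exhausted.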
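
The GradObj entry can be seen in action with a user-supplied gradient. The following sketch is again not part of the changeset, and the file name rosenbrock_with_grad.m is invented for the example; it assumes the function is saved in its own file on the load path.

## Contents of rosenbrock_with_grad.m (hypothetical example file): the
## objective returns its gradient as a second output, so with GradObj = "on"
## the solver does not fall back on finite-difference approximations.
function [fval, grad] = rosenbrock_with_grad (x)
  fval = (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
  grad = [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2);
          200*(x(2) - x(1)^2)];
endfunction

## At the Octave prompt:
opts = optimset ("GradObj", "on", "TolFun", 1e-10, "TolX", 1e-10);
[xmin, fval] = fminunc (@rosenbrock_with_grad, [-1.2; 1], opts);
## xmin should approach [1; 1], the minimizer of the Rosenbrock function.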