comparison scripts/optimization/fminunc.m @ 20375:f1d0f506ee78 stable

doc: Update more docstrings to have one sentence summary as first line.

Reviewed optimization, polynomial, signal script directories.

* scripts/optimization/fminbnd.m, scripts/optimization/fminsearch.m,
scripts/optimization/fminunc.m, scripts/optimization/fsolve.m,
scripts/optimization/fzero.m, scripts/optimization/glpk.m,
scripts/optimization/lsqnonneg.m, scripts/optimization/pqpnonneg.m,
scripts/optimization/qp.m, scripts/optimization/sqp.m,
scripts/polynomial/compan.m, scripts/polynomial/mkpp.m,
scripts/polynomial/mpoles.m, scripts/polynomial/pchip.m,
scripts/polynomial/poly.m, scripts/polynomial/polyaffine.m,
scripts/polynomial/polyder.m, scripts/polynomial/polyeig.m,
scripts/polynomial/polyfit.m, scripts/polynomial/polygcd.m,
scripts/polynomial/polyint.m, scripts/polynomial/polyout.m,
scripts/polynomial/polyval.m, scripts/polynomial/ppder.m,
scripts/polynomial/ppint.m, scripts/polynomial/ppjumps.m,
scripts/polynomial/ppval.m, scripts/polynomial/residue.m,
scripts/polynomial/roots.m, scripts/polynomial/spline.m,
scripts/polynomial/splinefit.m, scripts/polynomial/unmkpp.m,
scripts/signal/arch_fit.m, scripts/signal/arch_rnd.m,
scripts/signal/arma_rnd.m, scripts/signal/autoreg_matrix.m,
scripts/signal/bartlett.m, scripts/signal/blackman.m,
scripts/signal/detrend.m, scripts/signal/diffpara.m,
scripts/signal/durbinlevinson.m, scripts/signal/fftconv.m,
scripts/signal/fftfilt.m, scripts/signal/fftshift.m,
scripts/signal/filter2.m, scripts/signal/freqz.m,
scripts/signal/hamming.m, scripts/signal/hanning.m,
scripts/signal/hurst.m, scripts/signal/ifftshift.m,
scripts/signal/periodogram.m, scripts/signal/sinc.m,
scripts/signal/sinetone.m, scripts/signal/sinewave.m,
scripts/signal/spectral_adf.m, scripts/signal/spectral_xdf.m,
scripts/signal/spencer.m, scripts/signal/stft.m,
scripts/signal/synthesis.m, scripts/signal/unwrap.m,
scripts/signal/yulewalker.m:
Update more docstrings to have one sentence summary as first line.
author Rik <rik@octave.org>
date Mon, 04 May 2015 21:50:57 -0700
parents 9fc020886ae9
children
## @deftypefnx {Function File} {} fminunc (@var{fcn}, @var{x0}, @var{options})
## @deftypefnx {Function File} {[@var{x}, @var{fval}, @var{info}, @var{output}, @var{grad}, @var{hess}] =} fminunc (@var{fcn}, @dots{})
## Solve an unconstrained optimization problem defined by the function
## @var{fcn}.
##
## @var{fcn} should accept a vector (array) defining the unknown variables, and
## return the objective function value, optionally with gradient.
## @code{fminunc} attempts to determine a vector @var{x} such that
## @code{@var{fcn} (@var{x})} is a local minimum.
##
## @var{x0} determines a starting guess.  The shape of @var{x0} is preserved in
## all calls to @var{fcn}, but otherwise is treated as a column vector.
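##
## A minimal sketch of such a call, using the Rosenbrock function purely as
## an illustration (it is not a built-in objective):
##
## @example
## @group
## fcn = @@(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
## x0 = [-1; 2];                   # starting guess; its shape is preserved
## [x, fval] = fminunc (fcn, x0)   # minimum is near x = [1; 1]
## @end group
## @end example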
##
## @var{options} is a structure specifying additional options.  Currently,
## @code{fminunc} recognizes these options:
## @qcode{"FunValCheck"}, @qcode{"OutputFcn"}, @qcode{"TolX"},
## @qcode{"TolFun"}, @qcode{"MaxIter"}, @qcode{"MaxFunEvals"},
## @qcode{"GradObj"}, @qcode{"FinDiffType"}, @qcode{"TypicalX"},
## @qcode{"AutoScaling"}.
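##
## Options are built with @code{optimset}.  For instance, a hypothetical run
## with a tighter function tolerance and a larger iteration cap (with
## @var{fcn} and @var{x0} as above) might look like:
##
## @example
## @group
## opts = optimset ("TolFun", 1e-10, "MaxIter", 1000);
## x = fminunc (fcn, x0, opts);
## @end group
## @end example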
##
## If @qcode{"GradObj"} is @qcode{"on"}, it specifies that @var{fcn}, when
## called with 2 output arguments, also returns the Jacobian matrix of partial
## first derivatives at the requested point.  @code{TolX} specifies the
## termination tolerance for the unknown variables @var{x}, while @code{TolFun}
## is a tolerance for the objective function value @var{fval}.  The default is
## @code{1e-7} for both options.
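##
## A sketch of an objective that supplies its own gradient; the quadratic
## @code{myfcn} below is a made-up example, not part of Octave:
##
## @example
## @group
## function [obj, grad] = myfcn (x)
##   obj = x(1)^2 + 4*x(2)^2;    # objective value
##   grad = [2*x(1); 8*x(2)];    # gradient, returned when GradObj is "on"
## endfunction
## opts = optimset ("GradObj", "on");
## x = fminunc (@@myfcn, [3; -2], opts);
## @end group
## @end example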
##
## For a description of the other options, see @code{optimset}.
##
## On return, @var{x} is the location of the minimum and @var{fval} contains
## the value of the objective function at @var{x}.
##
## @var{info} may be one of the following values:
##
## @table @asis
## @item 1
## Converged to a solution point.  Relative gradient error is less than
## specified by @code{TolFun}.
[...unchanged lines collapsed in this comparison view...]
##
## Optionally, @code{fminunc} can return a structure with convergence
## statistics (@var{output}), the output gradient (@var{grad}) at the solution
## @var{x}, and approximate Hessian (@var{hess}) at the solution @var{x}.
##
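## For instance, to inspect convergence diagnostics after a hypothetical run:
##
## @example
## @group
## [x, fval, info, output, grad, hess] = fminunc (fcn, x0);
## output.iterations    # number of iterations performed
## @end group
## @end example
##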
## Application Notes: If you have only a single nonlinear equation of one
## variable then using @code{fminbnd} is usually a better choice.
##
## The algorithm used by @code{fminunc} is a gradient search which depends
## on the objective function being differentiable.  If the function has
## discontinuities it may be better to use a derivative-free algorithm such as
## @code{fminsearch}.
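##
## For instance, a nonsmooth objective like this made-up one is better suited
## to @code{fminsearch}:
##
## @example
## @group
## fminsearch (@@(x) abs (x(1) - 2) + abs (x(2) + 1), [0; 0])
## @end group
## @end example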
## @seealso{fminbnd, fminsearch, optimset}
## @end deftypefn

## PKG_ADD: ## Discard result to avoid polluting workspace with ans at startup.
## PKG_ADD: [~] = __all_opts__ ("fminunc");