neurotools.stats.minimize module
Helper functions for minimization and optimization
- exception neurotools.stats.minimize.FailureError[source]
Bases:
RuntimeError
A renamed RuntimeError used to distinguish error conditions arising from the failure of optimization routines.
- neurotools.stats.minimize.minimize_retry(objective, initial, jac=None, hess=None, verbose=False, printerrors=True, failthrough=True, tol=1e-05, simplex_only=False, show_progress=False, dontuse={}, maxfeval=None, maxgeval=None, **kwargs)[source]
Call scipy.optimize.minimize, retrying a few times in case one solver doesn’t work.
This addresses unresolved bugs that can cause exceptions in some of Scipy's gradient-based solvers. When such a failure occurs, optimization continues using slower but more robust methods.
Ultimately, this routine falls back to the gradient-free Nelder-Mead simplex algorithm, although it will try faster routines first if the gradient and Hessian are provided.
- Parameters:
objective (objective function to minimize; passed to scipy.optimize.minimize)
initial (initial parameter guess)
jac ((optional) Jacobian, passed to scipy.optimize.minimize)
hess ((optional) Hessian, passed to scipy.optimize.minimize)
verbose (print extra information)
printerrors (print error messages from solvers that fail)
failthrough (return best params found so far, even if minimization fails)
tol (convergence tolerance, passed to scipy.optimize.minimize)
simplex_only (Force use of only the Nelder-Mead simplex optimizer)
show_progress (Print status updates during minimization)
dontuse (Set of methods not to try)
maxfeval (Maximum number of objective function evaluations to allow)
maxgeval (Maximum number of gradient function evaluations to allow)
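The fallback strategy described above can be sketched as follows. This is an illustrative example of the retry pattern, not the library's implementation: `minimize_with_fallback` is a hypothetical helper that tries gradient-based solvers first, catches solver crashes, and warm-starts each successive solver from the best point found so far, ending with Nelder-Mead.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Rosenbrock function: global minimum at (1, 1)
    return (1 - x[0])**2 + 100.0*(x[1] - x[0]**2)**2

def minimize_with_fallback(objective, x0,
                           methods=('BFGS', 'Nelder-Mead'),
                           tol=1e-5):
    # Try each solver in order; keep the best result seen so far.
    best = None
    for method in methods:
        try:
            result = minimize(objective, x0, method=method, tol=tol)
        except Exception:
            continue  # solver raised an exception; try the next one
        if best is None or result.fun < best.fun:
            best = result
        if result.success:
            break
        x0 = result.x  # warm-start the next solver from the best point
    return best

result = minimize_with_fallback(objective, np.array([-1.0, 2.0]))
```

In the real routine, `failthrough=True` plays the role of returning `best` even when no solver reports success, and `maxfeval`/`maxgeval` cap the evaluation budget.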