public class NonLinearConjugateGradientOptimizer extends AbstractScalarDifferentiableOptimizer
This class supports both the Fletcher-Reeves and the Polak-Ribière update formulas for the conjugate search directions. It also supports optional preconditioning.
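As an illustration of typical usage, the following is a minimal sketch assuming the Commons Math 2.x API this class belongs to (DifferentiableMultivariateRealFunction, GoalType, RealPointValuePair and the inherited optimize method); the quadratic objective and its gradient are made up for the example:

```java
import org.apache.commons.math.FunctionEvaluationException;
import org.apache.commons.math.analysis.DifferentiableMultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateVectorialFunction;
import org.apache.commons.math.optimization.GoalType;
import org.apache.commons.math.optimization.RealPointValuePair;
import org.apache.commons.math.optimization.general.ConjugateGradientFormula;
import org.apache.commons.math.optimization.general.NonLinearConjugateGradientOptimizer;

public class ConjugateGradientExample {
    public static void main(String[] args) throws Exception {
        // Made-up objective: f(x, y) = (x - 1)^2 + 10 (y + 2)^2, minimum at (1, -2).
        DifferentiableMultivariateRealFunction f = new DifferentiableMultivariateRealFunction() {
            public double value(double[] p) {
                double dx = p[0] - 1.0;
                double dy = p[1] + 2.0;
                return dx * dx + 10.0 * dy * dy;
            }
            public MultivariateVectorialFunction gradient() {
                return new MultivariateVectorialFunction() {
                    public double[] value(double[] p) {
                        // Analytic gradient of the quadratic above.
                        return new double[] { 2.0 * (p[0] - 1.0), 20.0 * (p[1] + 2.0) };
                    }
                };
            }
            public MultivariateRealFunction partialDerivative(final int k) {
                return new MultivariateRealFunction() {
                    public double value(double[] p) throws FunctionEvaluationException {
                        return gradient().value(p)[k];
                    }
                };
            }
        };

        // Pick one of the two supported update formulas at construction time.
        NonLinearConjugateGradientOptimizer optimizer =
            new NonLinearConjugateGradientOptimizer(ConjugateGradientFormula.POLAK_RIBIERE);

        double[] start = { 0.0, 0.0 };
        RealPointValuePair optimum = optimizer.optimize(f, GoalType.MINIMIZE, start);
        System.out.println("minimum at (" + optimum.getPoint()[0] + ", "
                + optimum.getPoint()[1] + "), value " + optimum.getValue());
    }
}
```

The update formula is chosen once, at construction time; the Fletcher-Reeves variant would be selected by passing ConjugateGradientFormula.FLETCHER_REEVES instead.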
Fields inherited from class AbstractScalarDifferentiableOptimizer: checker, DEFAULT_MAX_ITERATIONS, goal, point
Constructor and Description |
---|
NonLinearConjugateGradientOptimizer(ConjugateGradientFormula updateFormula) Simple constructor with default settings. |
Modifier and Type | Method and Description |
---|---|
protected RealPointValuePair | doOptimize() Perform the bulk of the optimization algorithm. |
void | setInitialStep(double initialStep) Set the initial step used to bracket the optimum in line search. |
void | setLineSearchSolver(UnivariateRealSolver lineSearchSolver) Set the solver to use during line search. |
void | setPreconditioner(Preconditioner preconditioner) Set the preconditioner. |
Methods inherited from class AbstractScalarDifferentiableOptimizer: computeObjectiveGradient, computeObjectiveValue, getConvergenceChecker, getEvaluations, getGradientEvaluations, getIterations, getMaxEvaluations, getMaxIterations, incrementIterationsCounter, optimize, setConvergenceChecker, setMaxEvaluations, setMaxIterations
public NonLinearConjugateGradientOptimizer(ConjugateGradientFormula updateFormula)
Simple constructor with default settings. The convergence check is set to a SimpleVectorialValueChecker and the maximal number of iterations is set to AbstractScalarDifferentiableOptimizer.DEFAULT_MAX_ITERATIONS.
Parameters:
updateFormula - formula to use for updating the β parameter; must be one of ConjugateGradientFormula.FLETCHER_REEVES or ConjugateGradientFormula.POLAK_RIBIERE
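The defaults mentioned above can be replaced through the inherited setters listed earlier. A minimal sketch, assuming SimpleScalarValueChecker (the scalar-valued checker from the org.apache.commons.math.optimization package) is acceptable to the inherited setConvergenceChecker, and using arbitrary illustrative thresholds and limits:

```java
import org.apache.commons.math.optimization.SimpleScalarValueChecker;
import org.apache.commons.math.optimization.general.ConjugateGradientFormula;
import org.apache.commons.math.optimization.general.NonLinearConjugateGradientOptimizer;

public class OptimizerSetup {
    static NonLinearConjugateGradientOptimizer newOptimizer() {
        NonLinearConjugateGradientOptimizer optimizer =
            new NonLinearConjugateGradientOptimizer(ConjugateGradientFormula.FLETCHER_REEVES);
        // Replace the default convergence check and iteration budget
        // (thresholds and limits here are arbitrary illustrations).
        optimizer.setConvergenceChecker(new SimpleScalarValueChecker(1.0e-10, 1.0e-10));
        optimizer.setMaxIterations(200);
        optimizer.setMaxEvaluations(1000);
        return optimizer;
    }
}
```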
public void setPreconditioner(Preconditioner preconditioner)
Set the preconditioner.
Parameters:
preconditioner - preconditioner to use for next optimization, may be null to remove an already registered preconditioner
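A preconditioner is supplied as an implementation of the Preconditioner interface. The following is a hypothetical diagonal preconditioner, assuming the interface's single precondition(point, r) method of Commons Math 2.x; the fixed curvature estimates are made up for the example:

```java
import org.apache.commons.math.optimization.general.Preconditioner;

public class DiagonalPreconditionerExample {
    static Preconditioner diagonalPreconditioner(final double[] hessianDiagonal) {
        return new Preconditioner() {
            // Scale the steepest-descent residual r by the inverse of a fixed
            // (made-up) diagonal approximation of the Hessian.
            public double[] precondition(double[] point, double[] r) {
                double[] out = new double[r.length];
                for (int i = 0; i < r.length; i++) {
                    out[i] = r[i] / hessianDiagonal[i];
                }
                return out;
            }
        };
    }
}
```

It could then be registered with setPreconditioner(diagonalPreconditioner(new double[] { 2.0, 20.0 })) and removed again by passing null.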
public void setLineSearchSolver(UnivariateRealSolver lineSearchSolver)
Set the solver to use during line search.
Parameters:
lineSearchSolver - solver to use during line search, may be null to remove an already registered solver and fall back to the default Brent solver
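For instance, a sketch of installing a Brent solver with a tighter absolute accuracy than the default, assuming BrentSolver from org.apache.commons.math.analysis.solvers (the accuracy value is an arbitrary illustration):

```java
import org.apache.commons.math.analysis.solvers.BrentSolver;
import org.apache.commons.math.optimization.general.NonLinearConjugateGradientOptimizer;

public class LineSearchSetup {
    static void configureLineSearch(NonLinearConjugateGradientOptimizer optimizer) {
        // A Brent solver with a tighter absolute accuracy for the inner line searches.
        BrentSolver lineSearchSolver = new BrentSolver();
        lineSearchSolver.setAbsoluteAccuracy(1.0e-12);
        optimizer.setLineSearchSolver(lineSearchSolver);
        // Passing null instead would remove the solver and fall back to the default Brent solver.
    }
}
```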
public void setInitialStep(double initialStep)
Set the initial step used to bracket the optimum in line search. The initial step is a factor with respect to the search direction, which itself is roughly related to the gradient of the function.
Parameters:
initialStep - initial step used to bracket the optimum in line search; if a non-positive value is used, the initial step is reset to its default value of 1.0
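A brief sketch, assuming optimizer is the instance configured in the earlier examples; the value 0.1 is an arbitrary illustration for a problem whose natural scale is small:

```java
// Shrink the bracketing step; any non-positive argument resets it to the default 1.0.
optimizer.setInitialStep(0.1);
```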
protected RealPointValuePair doOptimize() throws FunctionEvaluationException, OptimizationException, IllegalArgumentException
Perform the bulk of the optimization algorithm.
Specified by:
doOptimize in class AbstractScalarDifferentiableOptimizer
Throws:
FunctionEvaluationException - if the objective function throws one during the search
OptimizationException - if the algorithm failed to converge
IllegalArgumentException - if the start point dimension is wrong
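Callers normally reach doOptimize through the inherited optimize method, which declares the same exceptions. A minimal handling sketch, assuming an optimizer, objective function and start point set up as in the earlier example:

```java
import org.apache.commons.math.FunctionEvaluationException;
import org.apache.commons.math.analysis.DifferentiableMultivariateRealFunction;
import org.apache.commons.math.optimization.GoalType;
import org.apache.commons.math.optimization.OptimizationException;
import org.apache.commons.math.optimization.RealPointValuePair;
import org.apache.commons.math.optimization.general.NonLinearConjugateGradientOptimizer;

public class SafeOptimize {
    static void run(NonLinearConjugateGradientOptimizer optimizer,
                    DifferentiableMultivariateRealFunction f,
                    double[] startPoint) {
        try {
            // optimize(...) drives doOptimize() and surfaces its exceptions.
            RealPointValuePair optimum = optimizer.optimize(f, GoalType.MINIMIZE, startPoint);
            System.out.println("converged to value " + optimum.getValue());
        } catch (FunctionEvaluationException e) {
            // The objective function itself failed at some evaluated point.
            System.err.println("objective evaluation failed: " + e.getMessage());
        } catch (OptimizationException e) {
            // The algorithm failed to converge (for example, the iteration limit was reached).
            System.err.println("optimization did not converge: " + e.getMessage());
        } catch (IllegalArgumentException e) {
            // The start point dimension did not match the problem.
            System.err.println("bad start point: " + e.getMessage());
        }
    }
}
```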