6.2.3. Optimizers
For Task Optimization, you can specify one or more Optimizer blocks, with the Optimizer%Type key specifying the type of optimizer, and the settings for the optimizer specified in the corresponding block:
Optimizer
   Type [AdaptiveRateMonteCarlo | CMAES | Nevergrad | Scipy | RandomSampling | GridSampling]
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
Type
- Type: Multiple Choice
- Default value: CMAES
- Options: [AdaptiveRateMonteCarlo, CMAES, Nevergrad, Scipy, RandomSampling, GridSampling]
- Description: Name of the optimization algorithm or interface. Available options: • Adaptive Rate Monte Carlo • CMAES (Covariance Matrix Adaptation Evolutionary Strategy) • Nevergrad (Interface to many other algorithms) • Scipy (Interface to classic optimization algorithms) • Random Sampling (Uniform random sampling of the space for later analysis. NOT an actual optimizer) • Grid Sampling (Grid-wise sampling of the parameter space for later analysis. NOT an actual optimizer)
We generally recommend using CMA-ES.
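For example, a minimal input selecting CMA-ES could look like the following sketch (the Sigma0 value is illustrative, not a recommendation):
Optimizer
   Type CMAES
   CMAES
      Sigma0 0.2
   End
End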
Note
Some optimizers have a special Settings sub-block.
This is a free block in which users may write any special and infrequently needed settings that are not available as their own keys.
The pattern is to write space-separated argument/value pairs within the block, one per line.
For example:
Settings
   sample_rate 30
   sub_method "Gaussian"
   ignore_inf True
End
Each setting value is evaluated in Python. Strings may, but need not, be enclosed in quotation marks. Consult the Python API documentation for the optimizers to see what arguments are accepted.
6.2.3.1. Adaptive Rate Monte-Carlo
If Optimizer%Type is set to AdaptiveRateMonteCarlo, this block configures the settings for the optimizer.
Optimizer
   AdaptiveRateMonteCarlo
      AcceptanceTarget float
      Gamma float
      MoveRange float_list
      Phi float
      SubIterations integer
      SuperIterations integer
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
AdaptiveRateMonteCarlo
- Type: Block
- GUI name: Adaptive Rate Monte-Carlo
- Description: Settings for the Adaptive Rate Monte-Carlo optimizer
AcceptanceTarget
- Type: Float
- Default value: 0.25
- GUI name: Target acceptance (αₜ)
- Description: The target acceptance rate αₜ
Gamma
- Type: Float
- Default value: 2.0
- GUI name: γ:
- Description: γ parameter.
MoveRange
- Type: Float List
- Default value: [0.9, 0.905, 0.91, 0.915, 0.92, 0.925, 0.93, 0.935, 0.94, 0.945, 0.95, 0.955, 0.96, 0.965, 0.97, 0.975, 0.98, 0.985, 0.99, 0.995, 1.005, 1.01, 1.015, 1.02, 1.025, 1.03, 1.035, 1.04, 1.045, 1.05, 1.055, 1.06, 1.065, 1.07, 1.075, 1.08, 1.085, 1.09, 1.095, 1.1]
- GUI name: List of allowed move sizes:
- Description: List of allowed move sizes.
Phi
- Type: Float
- Default value: 1.0
- GUI name: ϕ:
- Description: ϕ parameter.
SubIterations
- Type: Integer
- Default value: 100
- GUI name: Number of sub-iterations (ω):
- Description: Number of sub-iterations (ω)
SuperIterations
- Type: Integer
- Default value: 1000
- GUI name: Number of super-iterations (κ):
- Description: Number of super-iterations (κ)
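A sketch of a complete Adaptive Rate Monte-Carlo block, with illustrative (not tuned) values for the iteration counts:
Optimizer
   Type AdaptiveRateMonteCarlo
   AdaptiveRateMonteCarlo
      AcceptanceTarget 0.25
      SuperIterations 500
      SubIterations 100
   End
End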
6.2.3.2. CMA-ES
If Optimizer%Type is set to CMAES, this block configures the settings for the optimizer.
CMA-ES stands for Covariance Matrix Adaptation Evolutionary Strategy. This is the recommended optimizer for most problems. The most important parameters to set are Optimizer%CMAES%Sigma0, which specifies how broadly to sample at the beginning of the optimization, and Optimizer%CMAES%Popsize, which specifies how many parameter sets to sample around the current central point.
Optimizer
   CMAES
      ForceInjections Yes/No
      InjectionInterval integer
      KeepFiles Yes/No
      MinSigma float
      Popsize integer
      Sampler [full | vd | vkd]
      Settings # Non-standard block. See details.
         ...
      End
      Sigma0 float
      Verbose Yes/No
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
CMAES
- Type: Block
- GUI name: CMA-ES
- Description: Settings for the Covariance Matrix Adaptation Evolutionary Strategy
ForceInjections
- Type: Bool
- Default value: No
- Description: If Yes, injections of parameter vectors into the solver will be exact, guaranteeing that the injected solution will be in the next iteration's population. If No, the injection results in a relative nudge in the direction of the vector. Forcing the injection can limit global exploration, but non-forced injections may have little effect. See also glompo.optimizer.cmaes.injectioninterval
InjectionInterval
- Type: Integer
- Default value: 0
- Description: Number of iterations between injections of the incumbent solution into the sampled population. Defaults to 0, i.e., no injections are performed. Injections can be helpful in increasing the convergence speed and nudging optimizers toward a solution. This is a form of elitism and will limit exploration. Of particular interest is pairing this key with glompo.ShareBestEvaluationBetweenOptimizers. In this case the incumbent that is injected comes from other optimizers run in parallel. This can help nudge optimizers towards better solutions while dramatically improving convergence speed. See also glompo.optimizer.cmaes.forceinjections
KeepFiles
- Type: Bool
- Default value: No
- GUI name: Store covariance history:
- Description: Keep CMA-specific history files about the state of the covariance matrix and other algorithm variables.
MinSigma
- Type: Float
- Description: Convergence condition to terminate the optimization when the standard deviation of the sampling distribution falls below this value.
Popsize
- Type: Integer
- Description: Number of function evaluations per CMA-ES iteration. If not specified or -1, the population size will equal the number of workers available to the optimizer. This is computationally efficient but not algorithmically efficient. A value of zero or one will set the population size to the value suggested by CMA based on the dimensionality of the problem. This produces the best algorithm performance but may be computationally inefficient if resources are left idling while waiting for other evaluations to complete. Note: a population of one is not allowed by CMA; it is therefore changed to the algorithm default. This also means that giving CMA only one worker changes the popsize to the algorithm default.
Sampler
- Type: Multiple Choice
- Default value: full
- Options: [full, vd, vkd]
- Description: Choice of full or restricted Gaussian sampling procedures. Options: • full: Full sampling procedure • vd: Restricted sampler for VD-CMA (Linear time/space comparison-based natural gradient optimization) • vkd: Restricted sampler for VkD-CMA (Time/space variant of CMA-ES)
Settings
- Type: Non-standard block
- GUI name: Extra settings:
- Description: 'argument value' pairs for extra CMA-specific configuration arguments. See the CMAOptimizer API documentation for more details on which options are possible.
Sigma0
- Type: Float
- Default value: 0.05
- GUI name: σ₀:
- Description: Initial standard deviation of the multivariate normal distribution from which trials are drawn. The recommended range of values is between 0.01 and 0.5. Lower values sample very locally and converge quickly; higher values sample broadly but take longer to converge.
Verbose
- Type: Bool
- Default value: No
- Description: Produce a print stream of results from within the optimizer itself.
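As an illustration, the following block samples more broadly than the default and fixes the population at 16 parameter sets per iteration; the values are examples only and should be adapted to your problem:
Optimizer
   Type CMAES
   CMAES
      Sigma0 0.2
      Popsize 16
      MinSigma 0.001
   End
End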
6.2.3.3. Scipy
If Optimizer%Type is set to Scipy, this block configures the settings for the optimizer.
This ‘optimizer’ is in fact a wrapper around several classic minimization algorithms provided by the Python Scipy package.
Warning
Due to Scipy internals, the Scipy optimizers are not compatible with checkpointing.
Optimizer
   Scipy
      Algorithm [Nelder-Mead | Powell | CG | BFGS | Newton-CG | L-BFGS-B | TNC | COBYLA |
                 SLSQP | trust-constr | dogleg | trust-ncg | trust-exact | trust-krylov]
      Hessian [2-point | 3-point | cs]
      Jacobian [2-point | 3-point | cs]
      Settings # Non-standard block. See details.
         ...
      End
      Tolerance float
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
Scipy
- Type: Block
- GUI name: Scipy (Minimize)
- Description: Settings for the Scipy wrapper which gives access to many optimization algorithms. For parallel optimizations, Nevergrad is preferred as it provides better control options.
Algorithm
- Type: Multiple Choice
- Default value: Nelder-Mead
- Options: [Nelder-Mead, Powell, CG, BFGS, Newton-CG, L-BFGS-B, TNC, COBYLA, SLSQP, trust-constr, dogleg, trust-ncg, trust-exact, trust-krylov]
- Description: Optimization strategy to use. See the Scipy documentation for details of the available methods.
Hessian
- Type: Multiple Choice
- Default value: 2-point
- Options: [2-point, 3-point, cs]
- Description: Choice of the Hessian estimation method. Only for Newton-CG, dogleg, trust-ncg, trust-krylov, trust-exact and trust-constr.
Jacobian
- Type: Multiple Choice
- Default value: 2-point
- Options: [2-point, 3-point, cs]
- Description: Choice of gradient estimation method. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg, trust-krylov, trust-exact and trust-constr.
Settings
- Type: Non-standard block
- GUI name: Extra settings:
- Description: 'argument value' pairs for extra algorithm-specific configuration arguments. See the Scipy API documentation for more details on which options are possible.
Tolerance
- Type: Float
- Default value: 1e-06
- Description: Tolerance for termination. Interpretation linked to chosen method. See the Scipy docs.
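For example, a Scipy block running L-BFGS-B with a 3-point gradient estimate and a tighter tolerance (illustrative values only):
Optimizer
   Type Scipy
   Scipy
      Algorithm L-BFGS-B
      Jacobian 3-point
      Tolerance 1e-08
   End
End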
6.2.3.4. Nevergrad
If the above optimizers are insufficient for your requirements, ParAMS also provides an interface to Nevergrad, which is itself an interface to more than 85 optimization algorithms.
Note
Nevergrad is not included with AMS. You need to install it manually to use this feature: $AMSBIN/amspython -m pip install nevergrad.
The Nevergrad integration with ParAMS (particularly with regard to checkpointing) is not well-tested due to the very large number of algorithms available.
If Optimizer%Type is set to Nevergrad, this block configures the settings for the optimizer.
Optimizer
   Nevergrad
      Algorithm [ASCMA2PDEthird | ASCMADEQRthird | ASCMADEthird | AlmostRotationInvariantDE |
                 AvgMetaLogRecentering | AvgMetaRecentering | BO | CM | CMA | CMandAS |
                 CMandAS2 | CMandAS3 | CauchyLHSSearch | CauchyOnePlusOne | CauchyScrHammersleySearch |
                 Cobyla | DE | DiagonalCMA | DiscreteOnePlusOne | DoubleFastGADiscreteOnePlusOne |
                 EDA | ES | FCMA | HaltonSearch | HaltonSearchPlusMiddlePoint | HammersleySearch |
                 HammersleySearchPlusMiddlePoint | LHSSearch | LargeHaltonSearch | LhsDE |
                 MEDA | MPCEDA | MetaLogRecentering | MetaRecentering | MixES | MultiCMA |
                 MultiScaleCMA | MutDE | NGO | NaiveIsoEMNA | NaiveTBPSA | NelderMead |
                 NoisyBandit | NoisyDE | NoisyDiscreteOnePlusOne | NoisyOnePlusOne | OAvgMetaLogRecentering |
                 ORandomSearch | OScrHammersleySearch | OnePlusOne | OptimisticDiscreteOnePlusOne |
                 OptimisticNoisyOnePlusOne | PBIL | PCEDA | PSO | ParaPortfolio | Portfolio |
                 Powell | QORandomSearch | QOScrHammersleySearch | QrDE | RCobyla | RPowell |
                 RSQP | RandomSearch | RandomSearchPlusMiddlePoint | RealSpacePSO | RecES |
                 RecMixES | RecMutDE | RecombiningPortfolioOptimisticNoisyDiscreteOnePlusOne |
                 RotationInvariantDE | SPSA | SQP | SQPCMA | ScrHaltonSearch | ScrHaltonSearchPlusMiddlePoint |
                 ScrHammersleySearch | ScrHammersleySearchPlusMiddlePoint | Shiva | SplitOptimizer |
                 TBPSA | TripleCMA | TwoPointsDE | cGA | chainCMAPowel]
      Settings # Non-standard block. See details.
         ...
      End
      Zero float
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
Nevergrad
- Type: Block
- GUI name: Nevergrad
- Description: Settings for the Nevergrad wrapper which gives access to many optimization algorithms.
Algorithm
- Type: Multiple Choice
- Default value: TBPSA
- Options: [ASCMA2PDEthird, ASCMADEQRthird, ASCMADEthird, AlmostRotationInvariantDE, AvgMetaLogRecentering, AvgMetaRecentering, BO, CM, CMA, CMandAS, CMandAS2, CMandAS3, CauchyLHSSearch, CauchyOnePlusOne, CauchyScrHammersleySearch, Cobyla, DE, DiagonalCMA, DiscreteOnePlusOne, DoubleFastGADiscreteOnePlusOne, EDA, ES, FCMA, HaltonSearch, HaltonSearchPlusMiddlePoint, HammersleySearch, HammersleySearchPlusMiddlePoint, LHSSearch, LargeHaltonSearch, LhsDE, MEDA, MPCEDA, MetaLogRecentering, MetaRecentering, MixES, MultiCMA, MultiScaleCMA, MutDE, NGO, NaiveIsoEMNA, NaiveTBPSA, NelderMead, NoisyBandit, NoisyDE, NoisyDiscreteOnePlusOne, NoisyOnePlusOne, OAvgMetaLogRecentering, ORandomSearch, OScrHammersleySearch, OnePlusOne, OptimisticDiscreteOnePlusOne, OptimisticNoisyOnePlusOne, PBIL, PCEDA, PSO, ParaPortfolio, Portfolio, Powell, QORandomSearch, QOScrHammersleySearch, QrDE, RCobyla, RPowell, RSQP, RandomSearch, RandomSearchPlusMiddlePoint, RealSpacePSO, RecES, RecMixES, RecMutDE, RecombiningPortfolioOptimisticNoisyDiscreteOnePlusOne, RotationInvariantDE, SPSA, SQP, SQPCMA, ScrHaltonSearch, ScrHaltonSearchPlusMiddlePoint, ScrHammersleySearch, ScrHammersleySearchPlusMiddlePoint, Shiva, SplitOptimizer, TBPSA, TripleCMA, TwoPointsDE, cGA, chainCMAPowel]
- Description: Optimization strategy to use. See the Nevergrad documentation for details of the available methods.
Settings
- Type: Non-standard block
- GUI name: Extra settings:
- Description: 'argument value' pairs for extra algorithm-specific configuration arguments. See the Nevergrad API documentation for more details on which options are possible.
Zero
- Type: Float
- Default value: 0.0
- GUI name: Termination function value:
- Description: Function value below which the algorithm will terminate.
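For example, a Nevergrad block selecting the CMA algorithm and terminating once the function value drops below 0.01 (illustrative values only):
Optimizer
   Type Nevergrad
   Nevergrad
      Algorithm CMA
      Zero 0.01
   End
End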
6.2.3.5. Random sampling
If Optimizer%Type is set to RandomSampling, this block configures the settings for the optimizer.
RandomSampling is not really an optimizer; it simply samples parameters uniformly at random within the allowed ranges. The output from a random sampling can be used for sensitivity analysis.
Optimizer
   RandomSampling
      NumberOfSamples integer
      RandomSeed integer
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
RandomSampling
- Type: Block
- GUI name: Random Sampler
- Description: Settings for a totally random sampler of the parameter space.
NumberOfSamples
- Type: Integer
- Default value: 100
- Description: Number of samples to generate.
RandomSeed
- Type: Integer
- Description: Random seed to use for the generator. Useful for reproducibility.
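For example, a random sampling run of 500 parameter sets with a fixed seed for reproducibility (illustrative values only):
Optimizer
   Type RandomSampling
   RandomSampling
      NumberOfSamples 500
      RandomSeed 42
   End
End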
6.2.3.6. Grid sampling
If Optimizer%Type is set to GridSampling, this block configures the settings for the optimizer.
GridSampling is not really an optimizer; it simply samples the available parameter space in a regular grid pattern. This can be useful for rapidly (and coarsely) exploring the parameter space and, for example, investigating the appropriateness of the bounds.
Optimizer
   GridSampling
      NumberOfDivisions integer
   End
End
Optimizer
- Type: Block
- Recurring: True
- GUI name: Optimization algorithm:
- Description: An optimizer which may be used during the optimization.
GridSampling
- Type: Block
- GUI name: Grid-wise Sampler
- Description: Settings for grid-wise sampling of the parameter space.
NumberOfDivisions
- Type: Integer
- Default value: 10
- GUI name: Step in each dimension:
- Description: Number of equal sub-divisions to take for each parameter. The total number of function evaluations is n_steps^n_parameters.
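For example, with 5 divisions and 3 active parameters the following block generates 5^3 = 125 evaluations (illustrative values only):
Optimizer
   Type GridSampling
   GridSampling
      NumberOfDivisions 5
   End
End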