
Add wrappers for Nevergrad #560

@janosg

Description


Wrap the gradient-free optimizers from nevergrad in optimagic.

Nevergrad implements the following algorithms (a minimal usage sketch follows the list):

  • NgIohTuned is a "meta"-optimizer that adapts to the provided settings (budget, number of workers, parametrization) and should therefore be a good default.
  • TwoPointsDE is excellent in many cases, including very high num_workers.
  • PortfolioDiscreteOnePlusOne is excellent in discrete or mixed settings when high precision on parameters is not relevant; it is possibly a good choice for hyperparameter tuning.
  • OnePlusOne is a simple robust method for continuous parameters with num_workers < 8.
  • CMA is excellent for control (e.g. neurocontrol) when the environment is not very noisy (num_workers ~50 ok) and when the budget is large (e.g. 1000 x the dimension).
  • TBPSA is excellent for problems corrupted by noise, in particular overparameterized (neural) ones; very high num_workers is ok.
  • PSO is excellent in terms of robustness; high num_workers is ok.
  • ScrHammersleySearchPlusMiddlePoint is excellent for highly parallel cases (fully one-shot, i.e. num_workers = budget included) or for very multimodal cases (such as some of Nevergrad's MLDA benchmark problems); don't use softmax with this optimizer.
  • RandomSearch is the classical random search baseline; don't use softmax with this optimizer.
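
For orientation, here is a minimal sketch of how one of these optimizers is driven through nevergrad's own API; an optimagic wrapper would call something like this internally. The sphere objective, the dimension, and the budget of 200 evaluations are illustrative choices only.

```python
import numpy as np
import nevergrad as ng


def sphere(x: np.ndarray) -> float:
    # Illustrative objective with its minimum at the origin.
    return float(np.sum(x**2))


# Run OnePlusOne on a 3-dimensional continuous parameter with a fixed evaluation budget.
optimizer = ng.optimizers.OnePlusOne(parametrization=ng.p.Array(shape=(3,)), budget=200)
recommendation = optimizer.minimize(sphere)
print(recommendation.value)  # best parameters found
```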

In the long run we want to wrap all of them, but if you are tackling this as your first issue, you should focus on one. In that case, please comment below which optimizer you are going to work on so we don't duplicate efforts.
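
To make the end goal concrete, here is a rough sketch of how a wrapped optimizer might eventually be selected through optimagic's minimize interface. The algorithm name "nevergrad_oneplusone" is a hypothetical placeholder; the actual name and options will be decided when the wrapper is written.

```python
import numpy as np
import optimagic as om


def sphere(params: np.ndarray) -> float:
    return float(np.sum(params**2))


# "nevergrad_oneplusone" is hypothetical; the wrapper requested in this issue would register it.
res = om.minimize(
    fun=sphere,
    params=np.array([1.0, 2.0, 3.0]),
    algorithm="nevergrad_oneplusone",
)
print(res.params)
```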
