syne_tune.optimizer.schedulers.searchers.sklearn package

class syne_tune.optimizer.schedulers.searchers.sklearn.SKLearnSurrogateSearcher(config_space, metric, estimator, points_to_evaluate=None, scoring_class=None, num_initial_candidates=250, num_initial_random_choices=3, allow_duplicates=False, restrict_configurations=None, clone_from_state=False, **kwargs)[source]

Bases: BayesianOptimizationSearcher

SKLearn Surrogate Bayesian optimization for FIFO scheduler

This searcher must be used with FIFOScheduler. It provides Bayesian optimization with a surrogate model based on a scikit-learn estimator.

Additional arguments on top of parent class StochasticSearcher:

Parameters:
  • estimator (SKLearnEstimator) – Instance of SKLearnEstimator to be used as surrogate model

  • scoring_class (Optional[Callable[[Any], ScoringFunction]]) – The scoring function (or acquisition function) class and any extra parameters used to instantiate it. If None, expected improvement (EI) is used. Note that the acquisition function is not locally optimized with this searcher.

  • num_initial_candidates (int) – Number of candidates sampled for scoring with acquisition function.

  • num_initial_random_choices (int) – Number of randomly chosen candidates before surrogate model is used.

  • allow_duplicates (bool) – If True, allow for the same candidate to be selected more than once.

  • restrict_configurations (Optional[List[Dict[str, Any]]]) – If given, the searcher only suggests configurations from this list. If allow_duplicates == False, entries are popped off this list once suggested.
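To make the roles of num_initial_random_choices, num_initial_candidates, and allow_duplicates concrete, here is a minimal conceptual sketch of this kind of suggest loop: random configurations at first, then candidates sampled and scored with expected improvement (EI), with no local optimization of the acquisition function. This is an illustration only, not Syne Tune code; ToySurrogateSearcher, sample_config, and fit_predict are hypothetical names, and a real SKLearnEstimator would supply the fit/predict logic.

```python
import math


def expected_improvement(mean, std, best):
    # EI for minimization (larger is better). `best` is the best metric seen.
    if std <= 0.0:
        return max(best - mean, 0.0)
    z = (best - mean) / std
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (best - mean) * cdf + std * pdf


class ToySurrogateSearcher:
    """Conceptual sketch of surrogate-based suggestion, NOT the Syne Tune class."""

    def __init__(self, sample_config, fit_predict,
                 num_initial_candidates=250, num_initial_random_choices=3,
                 allow_duplicates=False):
        self.sample_config = sample_config  # () -> config dict, drawn at random
        self.fit_predict = fit_predict      # (configs, metrics, candidates) -> [(mean, std)]
        self.num_initial_candidates = num_initial_candidates
        self.num_initial_random_choices = num_initial_random_choices
        self.allow_duplicates = allow_duplicates
        self.observations = []              # list of (config, metric) pairs

    def suggest(self):
        # Phase 1: random choices before the surrogate model is used
        if len(self.observations) < self.num_initial_random_choices:
            return self.sample_config()
        # Phase 2: sample candidates and score them with the acquisition
        # function; the best-scoring candidate is suggested as-is (no local
        # optimization of the acquisition function)
        seen = [c for c, _ in self.observations]
        candidates = [self.sample_config()
                      for _ in range(self.num_initial_candidates)]
        if not self.allow_duplicates:
            candidates = [c for c in candidates if c not in seen] or candidates
        metrics = [m for _, m in self.observations]
        best = min(metrics)
        predictions = self.fit_predict(seen, metrics, candidates)
        scores = [expected_improvement(m, s, best) for m, s in predictions]
        return candidates[scores.index(max(scores))]

    def on_result(self, config, metric):
        self.observations.append((config, metric))
```

With a surrogate whose predicted mean tracks the observed metric, EI steers later suggestions toward promising regions, while the first num_initial_random_choices trials remain purely random.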

clone_from_state(state)[source]

Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.

Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.

Parameters:
  state – See above

Returns:
  New searcher object
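The get_state() / clone_from_state() contract described above can be sketched as follows. This is a hypothetical minimal example of the pattern, not the Syne Tune implementation: the pickle-able mutable state travels through get_state(), while the clone reuses the non-pickle-able parts (here, the estimator) held by self.

```python
class SketchSearcher:
    """Illustrates the state save/clone pattern, NOT the Syne Tune class."""

    def __init__(self, estimator):
        self._estimator = estimator   # immutable, possibly non-pickle-able part
        self._observations = []       # mutable state of the searcher

    def get_state(self):
        # Return only the mutable, pickle-able part of the state
        return {"observations": list(self._observations)}

    def clone_from_state(self, state):
        # Combine the non-pickle-able part from self with `state`;
        # afterwards, self is not used anymore
        clone = SketchSearcher(self._estimator)
        clone._observations = list(state["observations"])
        return clone
```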

Submodules