syne_tune.optimizer.schedulers.searchers.sklearn package
- class syne_tune.optimizer.schedulers.searchers.sklearn.SKLearnSurrogateSearcher(config_space, metric, estimator, points_to_evaluate=None, scoring_class=None, num_initial_candidates=250, num_initial_random_choices=3, allow_duplicates=False, restrict_configurations=None, clone_from_state=False, **kwargs)[source]
Bases: BayesianOptimizationSearcher
SKLearn Surrogate Bayesian optimization for FIFO scheduler
This searcher must be used with FIFOScheduler. It provides Bayesian optimization with a surrogate model given by a scikit-learn estimator.
Additional arguments on top of parent class StochasticSearcher:
- Parameters:
- estimator (SKLearnEstimator) – Instance of SKLearnEstimator to be used as surrogate model
- scoring_class (Optional[Callable[[Any], ScoringFunction]]) – The scoring function (or acquisition function) class and any extra parameters used to instantiate it. If None, expected improvement (EI) is used. Note that the acquisition function is not locally optimized with this searcher.
- num_initial_candidates (int) – Number of candidates sampled for scoring with the acquisition function
- num_initial_random_choices (int) – Number of randomly chosen candidates before the surrogate model is used
- allow_duplicates (bool) – If True, allow the same candidate to be selected more than once
- restrict_configurations (Optional[List[Dict[str, Any]]]) – If given, the searcher only suggests configurations from this list. If allow_duplicates == False, entries are popped off this list once suggested.
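The suggestion strategy these parameters describe — random choices first, then scoring a fixed pool of sampled candidates with an acquisition function such as EI, with no local optimization — can be sketched in plain Python. This is an illustrative toy, not the actual syne_tune implementation: `suggest`, `sample_config`, and `fit_predict` are hypothetical helpers, and configurations are simplified to single floats.

```python
import math
import random

def expected_improvement(mean, std, best):
    # EI for minimization of the metric; mean/std come from the surrogate
    if std == 0.0:
        return max(best - mean, 0.0)
    z = (best - mean) / std
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (best - mean) * cdf + std * pdf

def suggest(observations, sample_config, fit_predict,
            num_initial_candidates=250, num_initial_random_choices=3):
    """Toy version of the searcher's decision logic.

    observations: list of (config, metric_value) pairs seen so far
    sample_config: draws a random configuration (hypothetical helper)
    fit_predict: fits a surrogate on observations and returns a
        callable config -> (mean, std) (hypothetical helper)
    """
    # Before enough data is available, choose at random
    if len(observations) < num_initial_random_choices:
        return sample_config()
    # Sample a candidate pool and score it with the acquisition function;
    # the best-scoring candidate is returned (no local optimization)
    candidates = [sample_config() for _ in range(num_initial_candidates)]
    best = min(value for _, value in observations)
    predictor = fit_predict(observations)
    scores = [expected_improvement(*predictor(c), best) for c in candidates]
    return candidates[max(range(len(candidates)), key=scores.__getitem__)]
```

Note that, matching the docstring above, the acquisition function is only evaluated on the sampled pool; increasing `num_initial_candidates` trades compute for a denser search of the configuration space.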
- clone_from_state(state)[source]
Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.
Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.
- Parameters:
state – See above
- Returns:
New searcher object
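The get_state() / clone_from_state() contract can be illustrated with a minimal toy class (not the actual syne_tune searcher; names here are invented for illustration): the state holds only the mutable, pickle-able part, and the clone combines it with the immutable part kept on self.

```python
import copy

class TinySearcher:
    """Toy class illustrating the get_state / clone_from_state protocol."""

    def __init__(self, config_space):
        self.config_space = config_space  # immutable part, reused by the clone
        self.observations = []            # mutable part, captured in the state

    def get_state(self):
        # Only the mutable, pickle-able part goes into the state
        return {"observations": copy.deepcopy(self.observations)}

    def clone_from_state(self, state):
        # Combine the immutable part of self with the stored state and
        # return a fresh searcher; self is not to be used afterwards
        clone = TinySearcher(self.config_space)
        clone.observations = copy.deepcopy(state["observations"])
        return clone
```

This split is what allows the scheduler to checkpoint and restore a searcher: the pickled state travels, while non-pickle-able pieces (such as the configuration space) are re-attached from a live object.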