syne_tune.optimizer.schedulers.searchers.bore package
- class syne_tune.optimizer.schedulers.searchers.bore.Bore(config_space, metric, points_to_evaluate=None, allow_duplicates=None, restrict_configurations=None, mode=None, gamma=None, calibrate=None, classifier=None, acq_optimizer=None, feval_acq=None, random_prob=None, init_random=None, classifier_kwargs=None, **kwargs)[source]
Bases: StochasticAndFilterDuplicatesSearcher
Implements “Bayesian Optimization by Density-Ratio Estimation”, as described in the following paper:

BORE: Bayesian Optimization by Density-Ratio Estimation
Tiao, Louis C. and Klein, Aaron and Seeger, Matthias W. and Bonilla, Edwin V. and Archambeau, Cedric and Ramos, Fabio
Proceedings of the 38th International Conference on Machine Learning

Additional arguments on top of parent class StochasticAndFilterDuplicatesSearcher:
- Parameters:
mode (Optional[str]) – Can be “min” (default) or “max”.
gamma (Optional[float]) – Defines the percentile, i.e. how many percent of configurations are used to model \(l(x)\). Defaults to 0.25
calibrate (Optional[bool]) – If set to true, we calibrate the predictions of the classifier via CV. Defaults to False
classifier (Optional[str]) – The binary classifier to model the acquisition function. Choices: {"mlp", "gp", "xgboost", "rf", "logreg"}. Defaults to “xgboost”
acq_optimizer (Optional[str]) – The optimization method to maximize the acquisition function. Choices: {"de", "rs", "rs_with_replacement"}. Defaults to “rs”
feval_acq (Optional[int]) – Maximum allowed function evaluations of the acquisition function. Defaults to 500
random_prob (Optional[float]) – Probability of returning a random configuration (epsilon-greedy). Defaults to 0
init_random (Optional[int]) – get_config() returns randomly drawn configurations until at least init_random observations have been recorded in update(). After that, the BORE algorithm is used. Defaults to 6
classifier_kwargs (Optional[dict]) – Parameters for the classifier. Optional
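The parameters above configure BORE's density-ratio trick: observations with a loss in the best gamma fraction are labeled 1, the rest 0, and a binary classifier's predicted probability of label 1 serves as the acquisition function, which is then maximized (e.g. by random search when acq_optimizer="rs"). A minimal plain-Python sketch of one such iteration, assuming minimization over a 1-D config in [0, 1] and using a hypothetical nearest-neighbour stand-in for the classifier (the real searcher uses one of the classifiers listed above):

```python
import random

def bore_acquisition_step(observations, gamma=0.25, feval_acq=500, seed=0):
    """Illustrative sketch of one BORE iteration (not the syne_tune code).

    ``observations`` is a list of ``(config, loss)`` pairs, with ``config``
    a float in [0, 1] here. Configs whose loss lies in the best ``gamma``
    fraction get label 1, the rest label 0; a binary classifier's
    predicted probability of label 1 plays the role of the acquisition
    function, maximized by random search ("rs") over feval_acq candidates.
    """
    rng = random.Random(seed)
    losses = sorted(loss for _, loss in observations)
    # gamma-percentile threshold: best gamma fraction of observed losses
    tau = losses[max(0, int(gamma * len(losses)) - 1)]
    good = [c for c, l in observations if l <= tau]
    bad = [c for c, l in observations if l > tau]
    if not bad:
        # degenerate split (all losses tied): fall back to a random config
        return rng.uniform(0.0, 1.0)

    def classifier_prob(x):
        # Stand-in classifier: distance-based vote between the two sets;
        # close to 1 near "good" configs, close to 0 near "bad" ones.
        d_good = min(abs(x - c) for c in good)
        d_bad = min(abs(x - c) for c in bad)
        return d_bad / (d_good + d_bad + 1e-12)

    # "rs": draw feval_acq random candidates, return the probability maximizer
    candidates = [rng.uniform(0.0, 1.0) for _ in range(feval_acq)]
    return max(candidates, key=classifier_prob)
```

With a quadratic loss minimized at 0.3, the suggested config lands near the observed good region, since the classifier probability peaks around the best observed configs.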
- configure_scheduler(scheduler)[source]
Some searchers need to obtain information from the scheduler they are used with, in order to configure themselves. This method has to be called before the searcher can be used.
- Parameters:
scheduler (TrialScheduler) – Scheduler the searcher is used with.
- clone_from_state(state)[source]
Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.

Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.
- Parameters:
state (Dict[str, Any]) – See above
- Returns:
New searcher object
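The get_state() / clone_from_state() contract described above can be illustrated with a toy class (hypothetical, not the syne_tune implementation): the immutable configuration lives on self, while get_state() captures only the mutable, pickle-able part, and clone_from_state() rebuilds an independent searcher from the two:

```python
import copy

class ToySearcher:
    """Hypothetical searcher illustrating the get_state / clone_from_state
    pattern described above; not the actual syne_tune implementation."""

    def __init__(self, config_space):
        self.config_space = config_space   # immutable part, kept on self
        self.observations = []             # mutable state

    def update(self, config, metric):
        self.observations.append((config, metric))

    def get_state(self):
        # Only the mutable, pickle-able part of the searcher
        return {"observations": copy.deepcopy(self.observations)}

    def clone_from_state(self, state):
        # Combine the immutable part of self with the given state;
        # afterwards, self is no longer needed
        clone = ToySearcher(self.config_space)
        clone.observations = copy.deepcopy(state["observations"])
        return clone
```

The deep copies ensure the clone shares no mutable state with the original, so updating one does not affect the other.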
- class syne_tune.optimizer.schedulers.searchers.bore.MultiFidelityBore(config_space, metric, points_to_evaluate=None, allow_duplicates=None, mode=None, gamma=None, calibrate=None, classifier=None, acq_optimizer=None, feval_acq=None, random_prob=None, init_random=None, classifier_kwargs=None, resource_attr='epoch', **kwargs)[source]
Bases: Bore
Adapts BORE (Tiao et al.) to the multi-fidelity Hyperband setting, following BOHB (Falkner et al.). Once we have collected enough data points on the smallest resource level, we fit a probabilistic classifier and sample from it until we have a sufficient number of data points for the next higher resource level. We then refit the classifier on the data of this resource level. These steps are iterated until we reach the highest resource level. References:
BORE: Bayesian Optimization by Density-Ratio Estimation
Tiao, Louis C. and Klein, Aaron and Seeger, Matthias W. and Bonilla, Edwin V. and Archambeau, Cedric and Ramos, Fabio
Proceedings of the 38th International Conference on Machine Learning

and

BOHB: Robust and Efficient Hyperparameter Optimization at Scale
S. Falkner and A. Klein and F. Hutter
Proceedings of the 35th International Conference on Machine Learning

Additional arguments on top of parent class Bore:
- Parameters:
resource_attr (str) – Name of resource attribute. Defaults to “epoch”
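The BOHB-style scheme described above boils down to a data-selection rule: always fit the classifier on the highest resource level that has accumulated enough observations. A minimal sketch of that rule, using a hypothetical helper (not syne_tune API) where data is grouped per resource level:

```python
def select_train_level(data_per_level, min_points):
    """Hypothetical helper sketching MultiFidelityBore's BOHB-style rule:
    train on the highest resource level with at least ``min_points``
    observations; ``data_per_level`` maps resource level -> observations."""
    eligible = [
        level for level, data in data_per_level.items()
        if len(data) >= min_points
    ]
    if not eligible:
        # not enough data anywhere yet: caller falls back to random configs
        return None
    return max(eligible)
```

As trials complete more epochs, higher levels accumulate data and the rule automatically moves the classifier up, matching the iteration toward the highest resource level described above.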
Submodules
- syne_tune.optimizer.schedulers.searchers.bore.bore module
- syne_tune.optimizer.schedulers.searchers.bore.de module
- syne_tune.optimizer.schedulers.searchers.bore.gp_classififer module
- syne_tune.optimizer.schedulers.searchers.bore.mlp_classififer module
- syne_tune.optimizer.schedulers.searchers.bore.multi_fidelity_bore module