syne_tune.optimizer.schedulers.searchers.kde.multi_fidelity_kde_searcher module

class syne_tune.optimizer.schedulers.searchers.kde.multi_fidelity_kde_searcher.MultiFidelityKernelDensityEstimator(config_space, metric, points_to_evaluate=None, allow_duplicates=None, mode=None, num_min_data_points=None, top_n_percent=None, min_bandwidth=None, num_candidates=None, bandwidth_factor=None, random_fraction=None, resource_attr=None, **kwargs)[source]

Bases: KernelDensityEstimator

Adapts KernelDensityEstimator to the multi-fidelity setting as proposed by Falkner et al., so that it can be used with Hyperband. Following Falkner et al., the KDE is fit only on the highest resource level for which at least num_min_data_points observations are available (see the sketch below). The code is based on the implementation by Falkner et al.: https://github.com/automl/HpBandSter/tree/master/hpbandster

BOHB: Robust and Efficient Hyperparameter Optimization at Scale
S. Falkner and A. Klein and F. Hutter
Proceedings of the 35th International Conference on Machine Learning
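The level-selection rule described above can be summarized as follows. This is a minimal sketch, not the actual Syne Tune implementation; the function name and the data layout (a dict mapping resource level to observations) are assumptions for illustration only:

def select_training_level(observations, num_min_data_points):
    # ``observations`` maps resource level -> list of (config, metric) pairs.
    # Pick the highest level with enough data to fit the good/bad KDEs;
    # return None if no level has reached ``num_min_data_points`` yet.
    eligible = [
        level for level, data in observations.items()
        if len(data) >= num_min_data_points
    ]
    return max(eligible) if eligible else None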

Additional arguments on top of parent class KernelDensityEstimator:

Parameters:

resource_attr (Optional[str]) – Name of resource attribute. Defaults to scheduler.resource_attr in configure_scheduler()
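In practice this searcher is usually not constructed directly; a common Syne Tune pattern is to request it from a multi-fidelity scheduler. The sketch below assumes that searcher="kde" selects this class and that the metric and attribute names ("validation_error", "epoch", "epochs") match what the training script reports:

from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.schedulers import HyperbandScheduler

config_space = {
    "learning_rate": loguniform(1e-4, 1e-1),
    "num_layers": randint(1, 4),
    "epochs": 10,
}
scheduler = HyperbandScheduler(
    config_space,
    searcher="kde",             # multi-fidelity KDE searcher (BOHB)
    metric="validation_error",
    mode="min",
    resource_attr="epoch",      # resource attribute reported by the trial
    max_resource_attr="epochs",
)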

configure_scheduler(scheduler)[source]

Some searchers need to obtain information from the scheduler they are used with, in order to configure themselves. This method has to be called before the searcher can be used.

Parameters:

scheduler (TrialScheduler) – Scheduler the searcher is used with.
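For this searcher, the relevant piece of information is the resource attribute: if resource_attr was not given at construction, it is taken from the scheduler here (see the parameter documentation above). A hypothetical sketch of that behavior, with attribute names assumed for illustration:

def configure_scheduler(self, scheduler):
    # Inherit generic configuration from the parent class, then fall back
    # to the scheduler's resource attribute if none was given explicitly.
    super().configure_scheduler(scheduler)
    if self.resource_attr is None:
        self.resource_attr = getattr(scheduler, "resource_attr", None)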