syne_tune.optimizer.schedulers.searchers.hypertune.hypertune_searcher module

class syne_tune.optimizer.schedulers.searchers.hypertune.hypertune_searcher.HyperTuneSearcher(config_space, **kwargs)[source]

Bases: GPMultiFidelitySearcher

Implements Hyper-Tune as an extension of GPMultiFidelitySearcher; see HyperTuneIndependentGPModel for references. Two modifications:

  • New brackets are sampled from a model-based distribution \([w_k]\)

  • The acquisition function is fed with predictive means and variances from a mixture over rung level distributions, weighted by \([\theta_k]\)
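The mixture step above can be sketched in plain Python. The snippet below is an illustrative moment-matching of a Gaussian mixture, not Syne Tune's actual implementation: given per-rung predictive means and variances and hypothetical weights \(\theta_k\), it returns the mean and variance of the weighted mixture, which is the kind of quantity the acquisition function would consume.

```python
import math

def mixture_moments(means, variances, weights):
    """Moment-match a Gaussian mixture: combine per-rung predictive
    means/variances into one mean and variance, weighted by theta_k.
    Illustrative sketch only; names and signature are hypothetical."""
    assert math.isclose(sum(weights), 1.0), "weights must sum to 1"
    # Mixture mean: weighted average of component means
    mean = sum(w * m for w, m in zip(weights, means))
    # Mixture second moment, then variance via E[X^2] - (E[X])^2
    second = sum(w * (v + m * m)
                 for w, m, v in zip(weights, means, variances))
    return mean, second - mean * mean

mu, var = mixture_moments(
    means=[0.2, 0.4],        # per-rung predictive means
    variances=[0.01, 0.04],  # per-rung predictive variances
    weights=[0.3, 0.7],      # hypothetical theta_k
)
```

Note that the mixture variance exceeds the weighted average of the component variances whenever the component means disagree, so the acquisition function sees extra uncertainty across rung levels.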

It is not recommended to create HyperTuneSearcher objects directly, but rather to create HyperbandScheduler objects with searcher="hypertune", passing arguments here via search_options. This will use the appropriate functions from syne_tune.optimizer.schedulers.searchers.gp_searcher_factory to create components in a consistent way.

The following arguments of the parent class are not relevant here, and are ignored: gp_resource_kernel, resource_acq, issm_gamma_one, expdecay_normalize_inputs.

Additional arguments on top of parent class GPMultiFidelitySearcher:

  • model (str, optional) –

    Selects surrogate model (learning curve model) to be used. Choices are:

    • “gp_multitask”: GP multi-task surrogate model

    • “gp_independent” (default): Independent GPs for each rung level, sharing an ARD kernel

    The default is “gp_independent” (as in the Hyper-Tune paper), which differs from the default in GPMultiFidelitySearcher (“gp_multitask”). The “gp_issm” and “gp_expdecay” models are not supported here.

  • hypertune_distribution_num_samples (int, optional) – Parameter for estimating the distribution, given by \([\theta_k]\). Defaults to 50


configure_scheduler(scheduler)[source]

Some searchers need to obtain information from the scheduler they are used with in order to configure themselves. This method has to be called before the searcher can be used.

Parameters:
  scheduler (TrialScheduler) – Scheduler the searcher is used with.