syne_tune.optimizer.schedulers.transfer_learning.zero_shot module

class syne_tune.optimizer.schedulers.transfer_learning.zero_shot.ZeroShotTransfer(config_space, metric, transfer_learning_evaluations, mode='min', sort_transfer_learning_evaluations=True, use_surrogates=False, **kwargs)[source]

Bases: TransferLearningMixin, StochasticSearcher

A zero-shot transfer hyperparameter optimization method which jointly selects configurations that minimize the average rank obtained on the historic metadata in transfer_learning_evaluations. This is a searcher which can be used with FIFOScheduler; a usage sketch follows the parameter list below. Reference:

Sequential Model-Free Hyperparameter Tuning.
Martin Wistuba, Nicolas Schilling, Lars Schmidt-Thieme.
IEEE International Conference on Data Mining (ICDM) 2015.

Additional arguments on top of parent class StochasticSearcher:

Parameters:
  • transfer_learning_evaluations (Dict[str, TransferLearningTaskEvaluations]) – Dictionary from task name to offline evaluations.

  • mode (str) – Whether the metric is to be minimized (“min”, default) or maximized (“max”).

  • sort_transfer_learning_evaluations (bool) – If True (the default), hyperparameters are sorted so that they appear in the same order for every task. Use False if the hyperparameters for each task in transfer_learning_evaluations are already in the same order.

  • use_surrogates (bool) – Set this to True if the same configurations have not been evaluated on all tasks. In that case, a set of configurations is generated and their performance is imputed using surrogate models. Defaults to False.
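A minimal usage sketch (not part of the original docstring): it assumes that offline evaluations have already been collected into a Dict[str, TransferLearningTaskEvaluations] named transfer_learning_evaluations, and that FIFOScheduler accepts a searcher instance via its searcher argument. The search space entries and the metric name "validation_error" are placeholders.

    # Sketch under the assumptions stated above: build the searcher from offline
    # evaluations and pass it to FIFOScheduler, which calls get_config() for each
    # new trial.
    from syne_tune.config_space import loguniform, uniform
    from syne_tune.optimizer.schedulers import FIFOScheduler
    from syne_tune.optimizer.schedulers.transfer_learning.zero_shot import (
        ZeroShotTransfer,
    )

    # Placeholder search space and metric name; replace with your own.
    config_space = {
        "learning_rate": loguniform(1e-4, 1e-1),
        "momentum": uniform(0.0, 0.99),
    }
    metric = "validation_error"

    # `transfer_learning_evaluations` is assumed to be a
    # Dict[str, TransferLearningTaskEvaluations] gathered from earlier tuning jobs.
    searcher = ZeroShotTransfer(
        config_space=config_space,
        metric=metric,
        transfer_learning_evaluations=transfer_learning_evaluations,
        mode="min",
    )

    scheduler = FIFOScheduler(
        config_space,
        searcher=searcher,
        metric=metric,
        mode="min",
    )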

get_config(**kwargs)[source]

Suggest a new configuration.

Note: Implementations should query _next_initial_config() first and return any initial configurations from there before suggesting new ones.

Parameters:

kwargs – Extra information may be passed from scheduler to searcher

Return type:

Optional[dict]

Returns:

New configuration. The searcher may return None if a new configuration cannot be suggested. In this case, the tuning will stop. This happens if the searcher never suggests the same config more than once and all configs in its (finite) search space have been exhausted.
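A brief sketch (an assumption, not from the original docs) of driving the searcher built above by hand; in normal use FIFOScheduler calls get_config() for each new trial, and any keyword arguments it forwards are version-dependent and omitted here.

    # Hedged sketch: request configurations directly from the searcher.
    # Since zero-shot ranking is computed from the offline metadata alone, no
    # results need to be reported back; stop once get_config() returns None,
    # i.e. the finite candidate list is exhausted.
    while True:
        config = searcher.get_config()
        if config is None:
            break  # no further configurations can be suggested
        print("Next configuration to evaluate:", config)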