syne_tune.optimizer.schedulers.searchers.searcher_base module

syne_tune.optimizer.schedulers.searchers.searcher_base.extract_random_seed(**kwargs)[source]
Return type:

(int, Dict[str, Any])

syne_tune.optimizer.schedulers.searchers.searcher_base.sample_random_configuration(hp_ranges, random_state, exclusion_list=None)[source]

Samples a configuration at random from the configuration space underlying hp_ranges.

Parameters:
  • hp_ranges (HyperparameterRanges) – Used for sampling configurations

  • random_state (RandomState) – Pseudo-random number generator used for sampling

  • exclusion_list (Optional[ExclusionList]) – Configurations not to be returned

Return type:

Optional[Dict[str, Any]]

Returns:

New configuration, or None if configuration space has been exhausted
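The sampling-with-exclusion logic can be sketched in plain Python. This is an illustrative stand-in, not the real implementation: here config_space is a dict mapping hyperparameter names to lists of allowed values (the actual HyperparameterRanges object is richer), and the exclusion list is a plain set.

```python
import numpy as np

def sample_random_configuration(config_space, random_state, exclusion_list=None,
                                num_tries=100):
    # Illustrative sketch: draw configurations at random, skipping those in
    # ``exclusion_list``; return None if no new configuration is found after
    # ``num_tries`` attempts (the search space is effectively exhausted).
    exclusion_list = exclusion_list or set()
    for _ in range(num_tries):
        config = tuple(
            (name, values[random_state.randint(len(values))])
            for name, values in config_space.items()
        )
        if config not in exclusion_list:
            return dict(config)
    return None
```

Passing the same RandomState to repeated calls keeps the draws reproducible across an experiment, which is the point of threading random_state through this function.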

class syne_tune.optimizer.schedulers.searchers.searcher_base.StochasticSearcher(config_space, metric, points_to_evaluate=None, **kwargs)[source]

Bases: BaseSearcher

Base class of searchers which use random decisions. Creates the random_state member, which must be used for all random draws.

Making proper use of this interface allows us to run experiments with control of random seeds, e.g. for paired comparisons or integration testing.

Additional arguments on top of parent class BaseSearcher:

Parameters:
  • random_seed_generator (RandomSeedGenerator, optional) – If given, random seed is drawn from there

  • random_seed (int, optional) – Used if random_seed_generator is not given.
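The precedence between the two seeding arguments can be sketched as follows. Both RandomSeedGenerator and create_random_state below are simplified, hypothetical stand-ins for illustration only; the real classes live in Syne Tune and have more machinery.

```python
import numpy as np

class RandomSeedGenerator:
    # Hypothetical minimal stand-in: hands out seeds derived from one master
    # seed, so all searchers in an experiment are reproducibly (but
    # differently) seeded.
    def __init__(self, master_seed):
        self._rs = np.random.RandomState(master_seed)

    def __call__(self):
        return self._rs.randint(0, 2**31)

def create_random_state(random_seed_generator=None, random_seed=None):
    # Mirrors the documented precedence: draw the seed from the generator if
    # one is given, otherwise fall back to ``random_seed`` (or an unseeded
    # state if neither is provided).
    if random_seed_generator is not None:
        seed = random_seed_generator()
    else:
        seed = random_seed
    return np.random.RandomState(seed)
```

Centralizing seed handling this way is what enables paired comparisons: two experiments built from generators with the same master seed make identical random decisions.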

get_state()[source]

Together with clone_from_state(), this is needed in order to store and re-create the mutable state of the searcher. The state returned here must be pickle-able.

Return type:

Dict[str, Any]

Returns:

Pickle-able mutable state of searcher
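A quick way to check the pickle-ability requirement is a round trip through pickle. The dict below is a made-up example of what such a state might contain, not the actual state layout of any searcher.

```python
import pickle

# The state returned by get_state() must survive pickling; a minimal
# round-trip check on a hypothetical state dict:
state = {"random_state": (1234, [5, 6, 7]), "excl_list": [{"lr": 0.1}]}
restored = pickle.loads(pickle.dumps(state))
assert restored == state
```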

set_random_state(random_state)[source]

class syne_tune.optimizer.schedulers.searchers.searcher_base.StochasticAndFilterDuplicatesSearcher(config_space, metric, points_to_evaluate=None, allow_duplicates=None, restrict_configurations=None, **kwargs)[source]

Bases: StochasticSearcher

Base class for searchers with the following properties:

  • Random decisions use common random_state

  • Maintains an exclusion list to filter out duplicates in get_config() if allow_duplicates == False. If this is True, duplicates are not filtered, and the exclusion list is used only to avoid configurations of failed trials.

  • If restrict_configurations is given, this is a list of configurations, and the searcher only suggests configurations from there. If allow_duplicates == False, entries are popped off this list once suggested. points_to_evaluate is filtered to only contain entries in this set.

In order to make use of these features:

  • Reject configurations in get_config() if should_not_suggest() returns True. If the configuration is drawn at random, use _get_random_config(), which incorporates this filtering

  • Implement _get_config() instead of get_config(). The latter adds the new config to the exclusion list if allow_duplicates == False

Note: Not all searchers which filter duplicates make use of this class.
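The get_config() / _get_config() contract described above can be sketched as follows. The method names match the documented interface, but the body is a simplified illustration (the config space here is just a list of configuration dicts, and _get_config() is a trivial scan a subclass would normally override).

```python
class DuplicateFilteringSearcher:
    # Illustrative sketch of the documented contract: subclasses implement
    # _get_config(); get_config() adds the suggestion to the exclusion list
    # when allow_duplicates == False.
    def __init__(self, config_space, allow_duplicates=False):
        self._config_space = config_space
        self._allow_duplicates = allow_duplicates
        self._excl_list = set()

    def _config_key(self, config):
        # Hashable key so configurations can be stored in a set
        return tuple(sorted(config.items()))

    def should_not_suggest(self, config):
        return self._config_key(config) in self._excl_list

    def _get_config(self, **kwargs):
        # Subclasses override this; here we just scan the (finite) space
        # for the first configuration not yet excluded
        for config in self._config_space:
            if not self.should_not_suggest(config):
                return config
        return None  # space exhausted

    def get_config(self, **kwargs):
        config = self._get_config(**kwargs)
        if config is not None and not self._allow_duplicates:
            self._excl_list.add(self._config_key(config))
        return config
```

Splitting the suggestion logic (_get_config) from the bookkeeping (get_config) is what lets every subclass inherit consistent duplicate filtering without reimplementing it.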

Additional arguments on top of parent class StochasticSearcher:

Parameters:
  • allow_duplicates (Optional[bool]) – See above. Defaults to False

  • restrict_configurations (Optional[List[Dict[str, Any]]]) – See above, optional

property allow_duplicates: bool

should_not_suggest(config)[source]

Parameters:

config (Dict[str, Any]) – Configuration

Return type:

bool

Returns:

True if get_config() should not suggest this configuration

get_config(**kwargs)[source]

Suggest a new configuration.

Note: Query _next_initial_config() for initial configs to return first.

Parameters:

kwargs – Extra information may be passed from scheduler to searcher

Return type:

Optional[Dict[str, Any]]

Returns:

New configuration. The searcher may return None if a new configuration cannot be suggested, in which case the tuning stops. This happens when the searcher never suggests the same config more than once and all configs in the (finite) search space have been exhausted.
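A caller of get_config() should treat None as the stop signal. The driver loop below is a hypothetical sketch of that behavior, not Syne Tune's actual tuning loop; run_tuning, evaluate, and max_trials are illustrative names.

```python
def run_tuning(searcher, evaluate, max_trials=100):
    # Illustrative driver loop: stop as soon as the searcher signals an
    # exhausted search space by returning None from get_config().
    results = []
    for _ in range(max_trials):
        config = searcher.get_config()
        if config is None:
            break  # search space exhausted; tuning stops
        results.append(evaluate(config))
    return results
```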

register_pending(trial_id, config=None, milestone=None)[source]

Signals to the searcher that the evaluation for the trial has started but has not yet finished, which allows model-based searchers to register this evaluation as pending.

Parameters:
  • trial_id (str) – ID of trial to be registered as pending evaluation

  • config (Optional[Dict[str, Any]]) – If trial_id has not been registered with the searcher, its configuration must be passed here. Ignored otherwise.

  • milestone (Optional[int]) – For multi-fidelity schedulers, this is the next rung level the evaluation will reach, so that the model registers (config, milestone) as pending.

evaluation_failed(trial_id)[source]

Called by scheduler if an evaluation job for a trial failed.

The searcher should react appropriately (e.g., remove pending evaluations for this trial, not suggest the configuration again).

Parameters:

trial_id (str) – ID of trial whose evaluation failed

get_state()[source]

Together with clone_from_state(), this is needed in order to store and re-create the mutable state of the searcher. The state returned here must be pickle-able.

Return type:

Dict[str, Any]

Returns:

Pickle-able mutable state of searcher