syne_tune.optimizer.schedulers.searchers.searcher_base module
- syne_tune.optimizer.schedulers.searchers.searcher_base.extract_random_seed(**kwargs)[source]
- Return type: (int, Dict[str, Any])
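A minimal usage sketch (not taken from this page; the consumed key and the exact behavior are assumptions): a searcher's constructor can use extract_random_seed to split the seed off from the remaining keyword arguments before forwarding them to its parent class.

```python
import numpy as np

from syne_tune.optimizer.schedulers.searchers.searcher_base import extract_random_seed

# Hypothetical kwargs, as a scheduler might pass them to a searcher
kwargs = {"random_seed": 31415, "mode": "min"}

# Assumed behavior: the seed-related entry is removed from kwargs, and the
# seed plus the remaining arguments are returned
random_seed, remaining_kwargs = extract_random_seed(**kwargs)
random_state = np.random.RandomState(random_seed)
```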
- syne_tune.optimizer.schedulers.searchers.searcher_base.sample_random_configuration(hp_ranges, random_state, exclusion_list=None)[source]
Samples a configuration from config_space at random.
- Parameters:
  - hp_ranges (HyperparameterRanges) – Used for sampling configurations
  - random_state (RandomState) – PRN generator
  - exclusion_list (Optional[ExclusionList]) – Configurations not to be returned
- Return type: Optional[Dict[str, Any]]
- Returns: New configuration, or None if configuration space has been exhausted
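A hedged usage sketch of sample_random_configuration. The make_hyperparameter_ranges helper and its import path are assumptions here; any factory that produces a HyperparameterRanges object for the config space would do.

```python
import numpy as np

from syne_tune.config_space import randint, uniform
from syne_tune.optimizer.schedulers.searchers.bayesopt.datatypes.hp_ranges_factory import (
    make_hyperparameter_ranges,
)
from syne_tune.optimizer.schedulers.searchers.searcher_base import (
    sample_random_configuration,
)

config_space = {"lr": uniform(1e-4, 1e-1), "batch_size": randint(16, 256)}
hp_ranges = make_hyperparameter_ranges(config_space)
random_state = np.random.RandomState(31415)

# Returns a new configuration, or None once the (finite) space is exhausted
# with respect to the exclusion list (no exclusion list is passed here)
config = sample_random_configuration(hp_ranges, random_state)
```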
- class syne_tune.optimizer.schedulers.searchers.searcher_base.StochasticSearcher(config_space, metric, points_to_evaluate=None, **kwargs)[source]
Bases: LegacyBaseSearcher
Base class of searchers which use random decisions. Creates the random_state member, which must be used for all random draws. Making proper use of this interface allows us to run experiments with control of random seeds, e.g. for paired comparisons or integration testing.
Additional arguments on top of parent class BaseSearcher:
- Parameters:
  - random_seed_generator (RandomSeedGenerator, optional) – If given, the random seed is drawn from there
  - random_seed (int, optional) – Used if random_seed_generator is not given
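A sketch of a StochasticSearcher subclass, illustrative only and not part of Syne Tune. It assumes the internal helper _next_initial_config() and the result hook _update() behave as described for the searcher base classes on this page, and that config-space domains accept a random_state argument in sample().

```python
from typing import Any, Dict, Optional

from syne_tune.config_space import Domain
from syne_tune.optimizer.schedulers.searchers.searcher_base import StochasticSearcher


class IllustrativeRandomSearcher(StochasticSearcher):
    def get_config(self, **kwargs) -> Optional[Dict[str, Any]]:
        config = self._next_initial_config()  # serve points_to_evaluate first
        if config is None:
            # All random draws go through the inherited random_state member,
            # never the global RNG, so runs are reproducible for a fixed seed
            config = {
                name: hp.sample(random_state=self.random_state)
                if isinstance(hp, Domain)
                else hp
                for name, hp in self.config_space.items()
            }
        return config

    def _update(self, trial_id: str, config: Dict[str, Any], result: Dict[str, Any]):
        pass  # a purely random searcher does not learn from results
```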
- class syne_tune.optimizer.schedulers.searchers.searcher_base.StochasticAndFilterDuplicatesSearcher(config_space, metric, points_to_evaluate=None, allow_duplicates=None, restrict_configurations=None, **kwargs)[source]
Bases: StochasticSearcher
Base class for searchers with the following properties:
- Random decisions use common random_state
- Maintains exclusion list to filter out duplicates in get_config() if allow_duplicates == False. If this is True, duplicates are not filtered, and the exclusion list is used only to avoid configurations of failed trials.
- If restrict_configurations is given, this is a list of configurations, and the searcher only suggests configurations from there. If allow_duplicates == False, entries are popped off this list once suggested. points_to_evaluate is filtered to only contain entries in this set.
In order to make use of these features (see the sketch after the parameter list below):
- Reject configurations in get_config() if should_not_suggest() returns True. If the configuration is drawn at random, use _get_random_config(), which incorporates this filtering
- Implement _get_config() instead of get_config(). The latter adds the new config to the exclusion list if allow_duplicates == False
Note: Not all searchers which filter duplicates make use of this class.
Additional arguments on top of parent class StochasticSearcher:
- Parameters:
  - allow_duplicates (Optional[bool]) – See above. Defaults to False
  - restrict_configurations (Optional[List[Dict[str, Any]]]) – See above, optional
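A sketch of how a subclass would use the features above: implement _get_config() and let the parent's get_config() maintain the exclusion list, while _get_random_config() already applies the should_not_suggest() filter. Illustrative only; the _update() hook is assumed from the parent searcher interface.

```python
from typing import Any, Dict, Optional

from syne_tune.optimizer.schedulers.searchers.searcher_base import (
    StochasticAndFilterDuplicatesSearcher,
)


class IllustrativeDedupSearcher(StochasticAndFilterDuplicatesSearcher):
    def _get_config(self, **kwargs) -> Optional[Dict[str, Any]]:
        # _get_random_config rejects configurations for which
        # should_not_suggest() returns True, and is expected to return None
        # once the space (or restrict_configurations) is exhausted
        return self._get_random_config()

    def _update(self, trial_id: str, config: Dict[str, Any], result: Dict[str, Any]):
        pass  # nothing to learn for a purely random searcher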
- property allow_duplicates: bool
- should_not_suggest(config)[source]
- Parameters: config (Dict[str, Any]) – Configuration
- Return type: bool
- Returns: get_config() should not suggest this configuration?
- get_config(**kwargs)[source]
Suggest a new configuration.
Note: Query _next_initial_config() for initial configs to return first.
- Parameters: kwargs – Extra information may be passed from scheduler to searcher
- Return type: Optional[Dict[str, Any]]
- Returns: New configuration. The searcher may return None if a new configuration cannot be suggested. In this case, the tuning will stop. This happens if searchers never suggest the same config more than once, and all configs in the (finite) search space are exhausted.
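A hedged driver sketch showing the None contract of get_config(), continuing the illustrative subclass above. In practice the scheduler makes this call when a new trial starts; the trial_id keyword passed here is an assumption about what the scheduler forwards.

```python
from syne_tune.config_space import choice

# With the default allow_duplicates == False, this finite space is expected
# to be exhausted after three suggestions, at which point get_config()
# returns None
config_space = {"batch_size": choice([16, 32, 64])}
searcher = IllustrativeDedupSearcher(
    config_space=config_space, metric="validation_error"
)

trial_id = 0
while True:
    config = searcher.get_config(trial_id=str(trial_id))
    if config is None:
        break  # space exhausted; tuning would stop here
    # ... launch a trial with `config` and later report its results ...
    trial_id += 1
```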
- register_pending(trial_id, config=None, milestone=None)[source]
Signals to searcher that evaluation for trial has started, but not yet finished, which allows model-based searchers to register this evaluation as pending.
- Parameters:
  - trial_id (str) – ID of trial to be registered as pending evaluation
  - config (Optional[Dict[str, Any]]) – If trial_id has not been registered with the searcher, its configuration must be passed here. Ignored otherwise.
  - milestone (Optional[int]) – For multi-fidelity schedulers, this is the next rung level the evaluation will attend, so that the model registers (config, milestone) as pending.
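A sketch of the pending-registration flow for a multi-fidelity setup, using the illustrative searcher from above; the concrete rung level and the way the scheduler obtains it are assumptions for illustration.

```python
# Scheduler-side flow (illustration): a new trial is suggested and then
# registered as pending at the next rung level it will report at
config = searcher.get_config(trial_id="0")
if config is not None:
    searcher.register_pending(trial_id="0", config=config, milestone=1)
```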