syne_tune.optimizer.schedulers.searchers.random_grid_searcher module
- class syne_tune.optimizer.schedulers.searchers.random_grid_searcher.RandomSearcher(config_space, metric, points_to_evaluate=None, debug_log=False, resource_attr=None, allow_duplicates=None, restrict_configurations=None, **kwargs)[source]
Bases: StochasticAndFilterDuplicatesSearcher
Searcher which randomly samples configurations to try next.
Additional arguments on top of parent class StochasticAndFilterDuplicatesSearcher.
- Parameters:
  - debug_log (Union[bool, DebugLogPrinter]) – If True, debug log printing is activated. Logs which configs are chosen when, and which metric values are obtained. Defaults to False
  - resource_attr (Optional[str]) – Optional. Key in result passed to _update() for the resource value (for multi-fidelity schedulers)
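The snippet below is a minimal sketch of constructing a RandomSearcher by hand; the search space, hyperparameter names, and the metric name "validation_error" are illustrative, not part of the API.

```python
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.schedulers.searchers import RandomSearcher

# Hypothetical search space and metric name, for illustration only
config_space = {
    "lr": loguniform(1e-4, 1e-1),    # float hyperparameter
    "batch_size": randint(16, 128),  # integer hyperparameter
}

searcher = RandomSearcher(
    config_space,
    metric="validation_error",
    points_to_evaluate=[{"lr": 1e-3, "batch_size": 32}],  # evaluated first
    debug_log=True,  # log which configs are chosen and which metrics come back
)
```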
- configure_scheduler(scheduler)[source]
Some searchers need to obtain information from the scheduler they are used with, in order to configure themselves. This method has to be called before the searcher can be used.
- Parameters:
  - scheduler (TrialScheduler) – Scheduler the searcher is used with.
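In normal use the scheduler takes care of this call for its own searcher; the sketch below shows an explicit call, assuming a FIFOScheduler and the hand-built searcher from the example above (all argument values are made up).

```python
from syne_tune.optimizer.schedulers import FIFOScheduler

# Illustrative scheduler the searcher will be used with
scheduler = FIFOScheduler(
    config_space,
    searcher="random",
    metric="validation_error",
    mode="min",
)

# Must be called before the searcher is used; for multi-fidelity schedulers
# this is where e.g. the resource attribute would be picked up
searcher.configure_scheduler(scheduler)
```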
- clone_from_state(state)[source]
Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.
Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.
- Parameters:
  - state (Dict[str, Any]) – See above
- Returns:
  New searcher object
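A hedged sketch of how get_state() and clone_from_state() can be used together to checkpoint and restore a searcher; `searcher` is assumed to be an already constructed searcher such as the RandomSearcher above.

```python
import pickle

state = searcher.get_state()        # pickle-able mutable state
blob = pickle.dumps(state)          # could be written to a checkpoint file

# Restore: combine the immutable part of ``searcher`` with the saved state
new_searcher = searcher.clone_from_state(pickle.loads(blob))
# ``searcher`` should not be used after this point; continue with ``new_searcher``
```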
- property debug_log
Some subclasses support writing a debug log, using DebugLogPrinter. See RandomSearcher for an example.
- Returns:
  debug_log object, or None (not supported)
- class syne_tune.optimizer.schedulers.searchers.random_grid_searcher.GridSearcher(config_space, metric, points_to_evaluate=None, num_samples=None, shuffle_config=True, allow_duplicates=False, **kwargs)[source]
Bases: StochasticSearcher
Searcher that samples configurations from an equally spaced grid over config_space.
It first evaluates configurations defined in points_to_evaluate and then continues with the remaining points from the grid.
Additional arguments on top of parent class StochasticSearcher.
- Parameters:
  - num_samples (Optional[Dict[str, int]]) – Dictionary, optional. Number of samples per hyperparameter. This is required for hyperparameters of type float, optional for integer hyperparameters, and will be ignored for other types (categorical, scalar). If left unspecified, a default value of DEFAULT_NSAMPLE will be used for float parameters, and the smaller of DEFAULT_NSAMPLE and the size of the integer range will be used for integer parameters.
  - shuffle_config (bool) – If True (default), the order of configurations suggested after those specified in points_to_evaluate is shuffled. Otherwise, the order follows the Cartesian product of the configurations.
  - allow_duplicates (bool) – If True, get_config() may return the same configuration more than once. Defaults to False
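The snippet below is a minimal sketch of constructing a GridSearcher; the search space and metric name are illustrative. Only the float hyperparameter needs an entry in num_samples.

```python
from syne_tune.config_space import choice, randint, uniform
from syne_tune.optimizer.schedulers.searchers import GridSearcher

# Hypothetical search space and metric name, for illustration only
config_space = {
    "dropout": uniform(0.0, 0.5),            # float: grid size set via num_samples
    "num_layers": randint(1, 4),             # int: at most DEFAULT_NSAMPLE values
    "activation": choice(["relu", "tanh"]),  # categorical: all values are used
}

searcher = GridSearcher(
    config_space,
    metric="validation_error",
    num_samples={"dropout": 5},  # 5 equally spaced values for the float parameter
    shuffle_config=True,         # suggest grid points in shuffled order
)
```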
- get_config(**kwargs)[source]
Select the next configuration from the grid.
This is done without replacement, so previously returned configs are not suggested again.
- Return type:
  Optional[dict]
- Returns:
  A new configuration that is valid, or None if no new config can be suggested. The returned configuration is a dictionary that maps hyperparameters to their values.
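As a usage sketch, continuing the hypothetical GridSearcher above: the grid can be exhausted by calling get_config() until it returns None. Passing a trial_id here mimics what a scheduler would do and is an assumption of this sketch.

```python
# Draw grid points without replacement until the grid is exhausted
configs = []
trial_id = 0
while True:
    config = searcher.get_config(trial_id=str(trial_id))
    if config is None:  # no new config can be suggested
        break
    configs.append(config)
    trial_id += 1

print(f"{len(configs)} distinct grid configurations")
```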
- get_state()[source]
Together with clone_from_state(), this is needed in order to store and re-create the mutable state of the searcher. The state returned here must be pickle-able.
- Return type:
  Dict[str, Any]
- Returns:
  Pickle-able mutable state of searcher
- clone_from_state(state)[source]
Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.
Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.
- Parameters:
  - state (Dict[str, Any]) – See above
- Returns:
  New searcher object