syne_tune.optimizer.schedulers.ray_scheduler module
- class syne_tune.optimizer.schedulers.ray_scheduler.RayTuneScheduler(config_space, ray_scheduler=None, ray_searcher=None, points_to_evaluate=None)[source]
Bases: TrialScheduler
Allow using Ray scheduler and searcher. Any searcher/scheduler should work, except those which need access to TrialRunner (e.g., PBT); this feature is not implemented in Syne Tune.
If ray_searcher is not given (defaults to random searcher), initial configurations to evaluate can be passed in points_to_evaluate. If ray_searcher is given, this argument is ignored (it needs to be passed to ray_searcher at construction). Note: use impute_points_to_evaluate() in order to preprocess points_to_evaluate specified by the user or the benchmark.
- Parameters:
config_space (Dict) – Configuration space
ray_scheduler – Ray scheduler, defaults to FIFO scheduler
ray_searcher (Optional[Searcher]) – Ray searcher, defaults to random search
points_to_evaluate (Optional[List[Dict]]) – See above
- RT_FIFOScheduler
alias of FIFOScheduler
- RT_Searcher
alias of Searcher
- class RandomSearch(config_space, points_to_evaluate, mode)[source]
Bases: Searcher
- suggest(trial_id)[source]
Queries the algorithm to retrieve the next set of parameters.
- Return type: Optional[Dict]
- Arguments:
trial_id: Trial ID used for subsequent notifications.
- Returns:
- dict | FINISHED | None: Configuration for a trial, if possible.
If FINISHED is returned, Tune will be notified that no more suggestions/configurations will be provided. If None is returned, Tune will skip the querying of the searcher for this step.
- on_trial_complete(trial_id, result=None, error=False)[source]
Notification for the completion of trial.
Typically, this method is used for notifying the underlying optimizer of the result.
- Args:
trial_id: A unique string ID for the trial.
result: Dictionary of metrics for current training progress. Note that the result dict may include NaNs or may not include the optimization metric. It is up to the subclass implementation to preprocess the result to avoid breaking the optimization process. Upon errors, this may also be None.
error: True if the training process raised an error.
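The suggest / on_trial_complete protocol above can be sketched with a minimal, self-contained random searcher. This is an illustrative toy, not the Ray Tune Searcher base class or Syne Tune's implementation; the class name, the list-of-choices config space, and the max_suggestions budget are all hypothetical.

```python
import random

FINISHED = "FINISHED"  # sentinel, mirroring the FINISHED return value described above


class MinimalRandomSearcher:
    """Toy searcher sketch (hypothetical, not a real Ray Tune/Syne Tune class)."""

    def __init__(self, config_space, points_to_evaluate=None, max_suggestions=100):
        # config_space: dict mapping parameter name -> list of choices (toy format)
        self._config_space = config_space
        self._points = list(points_to_evaluate or [])
        self._max_suggestions = max_suggestions
        self._num_suggested = 0
        self._results = {}  # trial_id -> final result dict, or None on error

    def suggest(self, trial_id):
        # Return FINISHED once the budget is exhausted (no more configurations),
        # otherwise a configuration dict for the new trial.
        if self._num_suggested >= self._max_suggestions:
            return FINISHED
        self._num_suggested += 1
        if self._points:
            # Initial configurations take precedence over random sampling
            return self._points.pop(0)
        return {
            name: random.choice(choices)
            for name, choices in self._config_space.items()
        }

    def on_trial_complete(self, trial_id, result=None, error=False):
        # As noted above, result may be None (e.g. upon errors); a real
        # searcher must guard against that before updating its model.
        self._results[trial_id] = None if error else result
```

A trial runner would call suggest to start trials and on_trial_complete when they finish; FINISHED signals that no more configurations will be provided.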
- on_trial_add(trial)[source]
Called when a new trial is added to the trial runner.
Additions are normally triggered by suggest.
- Parameters:
trial (Trial) – Trial to be added
- on_trial_error(trial)[source]
Called when a trial has failed.
- Parameters:
trial (Trial) – Trial for which error is reported
- on_trial_result(trial, result)[source]
Called on each intermediate result reported by a trial.
At this point, the trial scheduler can make a decision by returning one of SchedulerDecision.CONTINUE, SchedulerDecision.PAUSE, or SchedulerDecision.STOP. This will only be called when the trial is currently running.
- Parameters:
trial (Trial) – Trial for which results are reported
result (Dict) – Result dictionary
- Return type: str
- Returns:
Decision what to do with the trial
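To illustrate the decision protocol, here is a toy on_trial_result() implementing a simple median stopping rule. This is a sketch only: the class, its grace_period parameter, and the bare string constants stand in for Syne Tune's actual SchedulerDecision values and are not the library's implementation.

```python
import statistics

# Stand-ins for SchedulerDecision.CONTINUE / PAUSE / STOP (illustrative only)
CONTINUE, PAUSE, STOP = "CONTINUE", "PAUSE", "STOP"


class MedianStoppingSketch:
    """Toy scheduler sketch: stop a trial whose metric falls below the
    median of all values reported so far (assumes "max" mode)."""

    def __init__(self, metric="accuracy", grace_period=2):
        self._metric = metric
        self._grace_period = grace_period  # reports per trial before stopping is allowed
        self._history = []                 # metric values seen across all trials
        self._num_reports = {}             # trial_id -> number of results so far

    def on_trial_result(self, trial_id, result):
        # Called on each intermediate result; returns one decision string.
        value = result[self._metric]
        self._history.append(value)
        n = self._num_reports.get(trial_id, 0) + 1
        self._num_reports[trial_id] = n
        if n >= self._grace_period and value < statistics.median(self._history):
            return STOP
        return CONTINUE
```

The grace period prevents stopping a trial on its very first reports, before its learning curve is informative.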
- on_trial_complete(trial, result)[source]
Notification for the completion of trial.
Note that on_trial_result() is called with the same result beforehand. However, if the scheduler only uses one final report from each trial, it may ignore on_trial_result() and just use result here.
- Parameters:
trial (Trial) – Trial which is completing
result (Dict) – Result dictionary
- on_trial_remove(trial)[source]
Called to remove trial.
This is called when the trial is in PAUSED or PENDING state; otherwise, on_trial_complete() is called instead.
- Parameters:
trial (Trial) – Trial to be removed
- metric_names()[source]
- Return type: List[str]
- Returns:
List of metric names. The first one is the target metric optimized over, unless the scheduler is a genuine multi-objective metric (for example, for sampling the Pareto front)
- metric_mode()[source]
- Return type: str
- Returns:
“min” if target metric is minimized, otherwise “max”. Here, “min” should be the default. For a genuine multi-objective scheduler, a list of modes is returned
- static convert_config_space(config_space)[source]
Converts config_space from our type to the one of Ray Tune.
Note: randint(lower, upper) in Ray Tune has exclusive upper, while it is inclusive for us. On the other hand, lograndint(lower, upper) has inclusive upper in Ray Tune as well.
- Parameters:
config_space – Configuration space
- Returns:
config_space converted into Ray Tune type
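The randint bound mismatch noted above is the subtle part of this conversion: since Syne Tune's upper is inclusive and Ray Tune's is exclusive, a correct converter must pass upper + 1. The sketch below uses a toy tuple representation rather than the real Domain classes of either library; the function name and encoding are hypothetical.

```python
def convert_domain_sketch(kind, lower, upper):
    """Toy illustration of the bound adjustment convert_config_space() must
    apply (not Syne Tune's actual implementation)."""
    if kind == "randint":
        # Syne Tune's randint(lower, upper) includes upper; Ray Tune's
        # excludes it, so shift by one to preserve the sampled range.
        return ("ray.randint", lower, upper + 1)
    elif kind == "lograndint":
        # Inclusive upper in Ray Tune as well: forward the bounds unchanged.
        return ("ray.lograndint", lower, upper)
    raise ValueError(f"unsupported domain kind: {kind}")
```

Without the + 1 adjustment, the converted search space would silently drop the largest integer value from every randint dimension.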