syne_tune.tuning_status module

class syne_tune.tuning_status.MetricsStatistics[source]

Bases: object

Maintains simple running statistics (min/max/sum/count) of the metrics provided. Statistics are tracked for numeric values only; the types of the first metrics added determine which metrics are tracked.

add(metrics)[source]

Adds a dictionary of metric values to the running statistics.
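
As an illustration of the behavior described above, here is a minimal, hypothetical sketch of per-metric running statistics. The RunningStats class and its attribute names are assumptions for the example, not Syne Tune's actual implementation:

```python
from typing import Dict, Union

Number = Union[int, float]


class RunningStats:
    """Hypothetical sketch of per-metric running statistics
    (min/max/sum/count), mirroring the documented behavior."""

    def __init__(self) -> None:
        self.min: Dict[str, Number] = {}
        self.max: Dict[str, Number] = {}
        self.sum: Dict[str, Number] = {}
        self.count: Dict[str, int] = {}

    def add(self, metrics: Dict[str, object]) -> None:
        for name, value in metrics.items():
            # Statistics are tracked for numeric values only
            if not isinstance(value, (int, float)):
                continue
            if name not in self.count:
                # First value seen for this metric initializes all statistics
                self.min[name] = self.max[name] = self.sum[name] = value
                self.count[name] = 1
            else:
                self.min[name] = min(self.min[name], value)
                self.max[name] = max(self.max[name], value)
                self.sum[name] += value
                self.count[name] += 1
```

A non-numeric value (such as a string) is simply ignored, so only numeric metrics accumulate statistics.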
class syne_tune.tuning_status.TuningStatus(metric_names)[source]

Bases: object

Information about a tuning job, used to display progress or to decide whether to stop the tuning job.

Parameters:

metric_names (List[str]) – Names of metrics reported

update(trial_status_dict, new_results)[source]

Updates the tuning status given new statuses and results.

Parameters:
  • trial_status_dict (Dict[int, Tuple[Trial, str]]) – Dictionary mapping trial ID to Trial object and status

  • new_results (List[Tuple[int, dict]]) – New results, along with trial IDs
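
The shapes of these arguments can be illustrated with a small, hypothetical sketch. The Trial placeholder, the status strings, and the summarize helper below are assumptions for the example, not part of the Syne Tune API:

```python
from collections import Counter, defaultdict
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Trial:
    # Stand-in for syne_tune's Trial object
    trial_id: int


def summarize(
    trial_status_dict: Dict[int, Tuple[Trial, str]],
    new_results: List[Tuple[int, dict]],
):
    # Count how many trials are in each status
    status_counts = Counter(status for _, status in trial_status_dict.values())
    # Group the freshly reported metrics by trial ID
    results_by_trial = defaultdict(list)
    for trial_id, metrics in new_results:
        results_by_trial[trial_id].append(metrics)
    return status_counts, results_by_trial
```

For example, a mapping such as `{1: (Trial(1), "InProgress")}` together with results like `[(1, {"loss": 0.31})]` would be the kind of input this method receives on each update.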

mark_running_job_as_stopped()[source]

Marks all trials still running as stopped.

property num_trials_started
Returns:

Number of trials which have been started

property num_trials_completed
Returns:

Number of trials which have been completed

property num_trials_failed
Returns:

Number of trials which have failed

property num_trials_finished
Returns:

Number of trials that finished, i.e. that completed, were stopped or are stopping, or failed

property num_trials_running
Returns:

Number of trials currently running

property wallclock_time
Returns:

Wallclock time spent in the tuner

property user_time
Returns:

Total user time spent in the workers

property cost
Returns:

Estimated dollar cost spent while tuning

get_dataframe()[source]
Return type:

DataFrame

Returns:

Information about all trials as dataframe
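
Assuming one record per trial, such a dataframe can be sketched with plain pandas; the column names here are illustrative assumptions, not the columns Syne Tune actually emits:

```python
import pandas as pd

# Hypothetical per-trial records; column names are assumptions for the example
records = [
    {"trial_id": 0, "loss": 0.42, "status": "Completed"},
    {"trial_id": 1, "loss": 0.31, "status": "InProgress"},
]

# One row per trial, one column per field
df = pd.DataFrame(records)
```

This kind of tabular view makes it easy to sort or filter trials, e.g. `df.sort_values("loss")`.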

syne_tune.tuning_status.print_best_metric_found(tuning_status, metric_names, mode=None)[source]

Prints trial status summary and the best metric found.

Parameters:
  • tuning_status (TuningStatus) – Current tuning status

  • metric_names (List[str]) – Names of metrics; results are reported for the first metric in this list

  • mode (Optional[str]) – “min” or “max”

Return type:

Optional[Tuple[int, float]]

Returns:

Trial ID and value of the best metric found
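
The selection logic can be sketched as follows. The best_metric helper is hypothetical, and treating a missing mode as "min" is an assumption made for this example:

```python
from typing import Dict, Optional, Tuple


def best_metric(
    trial_values: Dict[int, float], mode: Optional[str] = None
) -> Tuple[int, float]:
    """Sketch: return the (trial_id, value) pair with the best value
    of a single metric, minimizing by default (an assumption)."""
    if mode is None:
        mode = "min"
    pick = min if mode == "min" else max
    # Compare trials by their metric value
    trial_id, value = pick(trial_values.items(), key=lambda kv: kv[1])
    return trial_id, value
```

With `mode="min"` the trial with the smallest value wins; with `mode="max"` the largest.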