syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity module

syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity.cap_size_tuning_job_state(state, max_size, mode, top_fraction, random_state=None)[source]

Returns a state identical to state, except that trials_evaluations is replaced by a subset, so that the total number of observed metric values is <= max_size.

Parameters:
  • state (TuningJobState) – Original state to filter down

  • max_size (int) – Maximum number of observed metric values in new state

  • mode (str) – “min” or “max”

  • top_fraction (float) – Fraction of max_size reserved for the best observations (see SubsampleSingleFidelityStateConverter)

  • random_state (Optional[RandomState]) – Used for random sampling. Defaults to numpy.random.

Return type:

TuningJobState

Returns:

New state meeting the max_size constraint. This is a copy of state, even if state already meets the constraint.
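
A minimal usage sketch, assuming state is an already populated TuningJobState (e.g., taken from a running tuning job); constructing one from scratch is not shown here, and the chosen parameter values are purely illustrative:

    import numpy as np
    from syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity import (
        cap_size_tuning_job_state,
    )

    def cap_state_example(state):
        # `state` is an existing TuningJobState
        capped = cap_size_tuning_job_state(
            state=state,
            max_size=50,        # keep at most 50 observed metric values
            mode="min",         # lower metric values are better
            top_fraction=0.2,   # illustrative: reserve 20% of slots for the best
            random_state=np.random.RandomState(31415),
        )
        # `capped` is always a new object, even if `state` already met max_size
        return capped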

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity.SubsampleSingleFidelityStateConverter(max_size, mode, top_fraction, random_state=None)[source]

Bases: StateForModelConverter

Converts a state by (possibly) down-sampling the observations, so that their total number is <= max_size. If len(state) > max_size, the subset is sampled as follows: max_size * top_fraction slots are filled with the best observations, and the remainder is sampled without replacement from the remaining observations. A plain NumPy sketch of this scheme is given after the parameter list below.

Parameters:
  • max_size (int) – Maximum number of observed metric values in new state

  • mode (str) – “min” or “max”

  • top_fraction (float) – Fraction of max_size reserved for the best observations (see above)

  • random_state (Optional[RandomState]) – Used for random sampling. Can also be set with set_random_state()
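
To make the sampling scheme concrete, here is an illustrative sketch in plain NumPy. The real converter operates on TuningJobState objects (via trials_evaluations), not on a flat array, and subsample_sketch is a hypothetical helper, not part of Syne Tune:

    import numpy as np

    def subsample_sketch(metric_values, max_size, mode="min",
                         top_fraction=0.2, random_state=None):
        # Returns indices of the observations to keep
        if random_state is None:
            random_state = np.random
        n = len(metric_values)
        if n <= max_size:
            return np.arange(n)  # already within the size limit
        order = np.argsort(metric_values)
        if mode == "max":
            order = order[::-1]  # best observations first
        num_top = int(max_size * top_fraction)
        top = order[:num_top]  # the best observations are always kept
        # Remainder is sampled without replacement from the rest
        rest = random_state.choice(
            order[num_top:], size=max_size - num_top, replace=False
        )
        return np.concatenate([top, rest])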

set_random_state(random_state)[source]

Some state converters use random sampling. For these, the random state has to be set before first usage.

Parameters:

random_state (RandomState) – Random state to be used internally
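
A short usage sketch, seeding the converter before its first use so that repeated runs subsample the same observations (parameter values are illustrative):

    import numpy as np
    from syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity import (
        SubsampleSingleFidelityStateConverter,
    )

    converter = SubsampleSingleFidelityStateConverter(
        max_size=50, mode="min", top_fraction=0.2
    )
    # Set the random state before the converter is first used
    converter.set_random_state(np.random.RandomState(31415))
    # The converter can then be passed wherever a StateForModelConverter
    # is expected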