syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity module
- syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity.cap_size_tuning_job_state(state, max_size, mode, top_fraction, random_state=None)[source]
  Returns a state which is identical to `state`, except that the `trials_evaluations` are replaced by a subset, so that the total number of metric values is `<= max_size`.
  - Parameters:
    - state (`TuningJobState`) – Original state to filter down
    - max_size (`int`) – Maximum number of observed metric values in new state
    - mode (`str`) – "min" or "max"
    - top_fraction (`float`) – See above
    - random_state (`Optional[RandomState]`) – Used for random sampling. Defaults to `numpy.random`.
  - Return type: `TuningJobState`
  - Returns: New state meeting the `max_size` constraint. This is a copy of `state`, even if `state` already meets the constraint.
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.subsample_state_single_fidelity.SubsampleSingleFidelityStateConverter(max_size, mode, top_fraction, random_state=None)[source]
  Bases: `StateForModelConverter`
  Converts a state by (possibly) downsampling the observations so that their total number is `<= max_size`. If `len(state) > max_size`, the subset is sampled as follows: `max_size * top_fraction` slots are filled with the best observations, and the remainder is sampled without replacement from the remaining observations.
  - Parameters:
    - max_size (`int`) – Maximum number of observed metric values in new state
    - mode (`str`) – "min" or "max"
    - top_fraction (`float`) – See above
    - random_state (`Optional[RandomState]`) – Used for random sampling. Can also be set with `set_random_state()`.