syne_tune.optimizer.schedulers.searchers.bayesopt.models.model_transformer module
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.model_transformer.StateForModelConverter[source]
Bases: object

Interface for state converters (optionally) used in ModelStateTransformer. These are applied to a state before it is passed to the model for fitting and predictions. The main use case is to filter down the data if model fitting scales super-linearly.
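To illustrate the intended use case, here is a purely hypothetical converter sketch. The `State` stand-in and `SubsampleStateConverter` are invented for this example (the real interface operates on TuningJobState); the point is only to show a converter that caps the amount of observed data so that super-linear model fitting stays tractable:

```python
# Illustrative sketch only: "State" is a stand-in for TuningJobState, and
# SubsampleStateConverter is an invented example, not part of Syne Tune.
import random
from dataclasses import dataclass, field
from typing import List


@dataclass
class State:
    # Stand-in for TuningJobState: only the observed trial evaluations
    trials_evaluations: List[dict] = field(default_factory=list)


class SubsampleStateConverter:
    """Keeps at most ``max_size`` observations, sampled at random."""

    def __init__(self, max_size: int, seed: int = 0):
        self.max_size = max_size
        self._rng = random.Random(seed)

    def __call__(self, state: State) -> State:
        evals = state.trials_evaluations
        if len(evals) <= self.max_size:
            return state
        # Return a filtered copy; the original state is left untouched
        return State(trials_evaluations=self._rng.sample(evals, self.max_size))


full = State(trials_evaluations=[{"trial_id": i} for i in range(100)])
small = SubsampleStateConverter(max_size=10)(full)
print(len(small.trials_evaluations))  # 10
```

The converter returns a new, smaller state rather than mutating the original, so the full history remains available to the transformer.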
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.model_transformer.ModelStateTransformer(estimator, init_state, skip_optimization=None, state_converter=None)[source]
Bases: object

This class maintains the TuningJobState object alongside an HPO experiment and manages the reaction to changes of this state. In particular, it provides a fitted surrogate model on demand, which encapsulates the GP posterior.

The state transformer is generic; it uses Estimator for anything specific to the model type. skip_optimization is a predicate depending on the state, which determines what is done at the next call of model(). If it evaluates to False, the model parameters are refit; otherwise, the current ones are kept unchanged (this is usually faster, but risks staleness).

We also track the observed data state.trials_evaluations. If this has not changed since the most recent model() call, we do not refit the model parameters. This is based on the assumption that model parameter fitting depends only on state.trials_evaluations (observed data), not on other fields (e.g., pending evaluations).

If given, state_converter maps the state to another one, which is then passed to the model for fitting and predictions. One important use case is filtering down data when model fitting scales super-linearly. Another is converting multi-fidelity setups so they can be used with single-fidelity models inside.

Note that estimator and skip_optimization can also be dictionaries mapping output names to models. In that case, the state is shared, but the models for each output metric are updated independently.

- Parameters:
  - estimator (Union[Estimator, Dict[str, Estimator]]) – Surrogate model(s)
  - init_state (TuningJobState) – Initial tuning job state
  - skip_optimization (Union[SkipOptimizationPredicate, Dict[str, SkipOptimizationPredicate], None]) – Skip optimization predicate (see above). Defaults to None (fitting is never skipped)
  - state_converter (Optional[StateForModelConverter]) – See above, optional
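The refit policy described above (skip refitting when the skip predicate says so, or when the observed data have not changed since the last call) can be sketched as follows. This is an illustrative mock, not the Syne Tune implementation; MockTransformer, its members, and the list-of-dicts data format are invented for this example:

```python
# Illustrative mock of the refit policy; not the Syne Tune implementation.
from typing import Callable, List, Optional


class MockTransformer:
    def __init__(
        self,
        skip_optimization: Optional[Callable[[List[dict]], bool]] = None,
    ):
        # Default None means fitting is never skipped, as documented above
        self._skip = skip_optimization or (lambda data: False)
        self._last_data: Optional[List[dict]] = None
        self.num_refits = 0  # counts how often parameters were refit

    def model(self, observed_data: List[dict]) -> None:
        data_changed = observed_data != self._last_data
        if data_changed and not self._skip(observed_data):
            self.num_refits += 1  # model parameters would be refit here
        self._last_data = list(observed_data)


tr = MockTransformer()
tr.model([{"trial": 0, "metric": 1.0}])  # new data: refit
tr.model([{"trial": 0, "metric": 1.0}])  # unchanged data: no refit
print(tr.num_refits)  # 1
```

Passing a predicate that returns True would suppress refitting even on new data, which trades model freshness for speed, exactly the staleness risk noted above.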
- property state: TuningJobState
- property use_single_model: bool
- property skip_optimization: SkipOptimizationPredicate | Dict[str, SkipOptimizationPredicate]
- fit(**kwargs)[source]
If skip_optimization is given, it overrides the self._skip_optimization predicate.
- append_trial(trial_id, config=None, resource=None)[source]
Appends new pending evaluation to the state.
- Parameters:
  - trial_id (str) – ID of trial
  - config (Optional[Dict[str, Union[int, float, str]]]) – Must be given if this trial does not yet feature in the state
  - resource (Optional[int]) – Must be given in the multi-fidelity case, to specify at which resource level the evaluation is pending
- drop_pending_evaluation(trial_id, resource=None)[source]
Drops a pending evaluation from the state. If it is not listed as pending, nothing is done.
- Parameters:
  - trial_id (str) – ID of trial
  - resource (Optional[int]) – Must be given in the multi-fidelity case, to specify at which resource level the evaluation is pending
- Return type:
bool
- remove_observed_case(trial_id, metric_name='target', key=None)[source]
Removes specific observation from the state.
- Parameters:
  - trial_id (str) – ID of trial
  - metric_name (str) – Name of internal metric
  - key (Optional[str]) – Must be given in the multi-fidelity case
- label_trial(data, config=None)[source]
Adds observed data for a trial. If the trial already has observations in the state, data.metrics are appended; otherwise, a new entry is appended. If new observations replace pending evaluations, the latter are removed.

config must be passed if the trial has not yet been registered in the state (this normally happens with the append_trial call). If it is already registered, config is ignored.
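The bookkeeping described for label_trial can be sketched with a minimal mock. MiniState and its fields are invented for this illustration (the real class is TuningJobState); the sketch only shows the two documented effects: metrics are appended, and matching pending evaluations are removed:

```python
# Illustrative mock of the label_trial bookkeeping; MiniState is an
# invented stand-in for TuningJobState, not part of Syne Tune.
from typing import Dict, List, Tuple


class MiniState:
    def __init__(self):
        self.observations: Dict[str, List[float]] = {}
        self.pending: List[Tuple[str, int]] = []  # (trial_id, resource)

    def append_trial(self, trial_id: str, resource: int) -> None:
        self.pending.append((trial_id, resource))

    def label_trial(self, trial_id: str, metrics: List[float], resource: int) -> None:
        # Append metrics if the trial has observations already, else create entry
        self.observations.setdefault(trial_id, []).extend(metrics)
        # New observations replace the matching pending evaluation
        self.pending = [p for p in self.pending if p != (trial_id, resource)]


s = MiniState()
s.append_trial("t0", resource=1)
s.label_trial("t0", metrics=[0.5], resource=1)
print(s.observations, s.pending)  # {'t0': [0.5]} []
```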
- filter_pending_evaluations(filter_pred)[source]
Filters state.pending_evaluations with filter_pred.
- Parameters:
  filter_pred (Callable[[PendingEvaluation], bool]) – Filtering predicate
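The filtering semantics are the usual ones: pending evaluations for which the predicate returns True are kept, the rest are dropped. A minimal sketch, using plain dicts as stand-ins for PendingEvaluation objects:

```python
# Sketch of filtering pending evaluations with a predicate; the dicts here
# are stand-ins for PendingEvaluation objects, invented for illustration.
pending = [
    {"trial_id": "t0", "resource": 1},
    {"trial_id": "t1", "resource": 2},
    {"trial_id": "t0", "resource": 3},
]

# Example predicate: drop all pending evaluations belonging to trial "t0"
def filter_pred(p: dict) -> bool:
    return p["trial_id"] != "t0"

pending = [p for p in pending if filter_pred(p)]
print(pending)  # [{'trial_id': 't1', 'resource': 2}]
```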