syne_tune.optimizer.schedulers.searchers.bayesopt.models.estimator module

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.estimator.Estimator[source]

Bases: object

Interface for surrogate models used in ModelStateTransformer.

In general, a surrogate model is probabilistic (or Bayesian), in that predictions are driven by a posterior distribution, represented in a posterior state of type Predictor. The model may also come with tunable (hyper)parameters, such as covariance function parameters for a Gaussian process model. These parameters can be accessed with get_params() and set_params().

get_params()[source]
Return type:

Dict[str, Any]

Returns:

Current tunable model parameters

set_params(param_dict)[source]
Parameters:

param_dict (Dict[str, Any]) – New model parameters
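The parameter accessors above can be sketched as follows. This is a hypothetical minimal implementation, not the actual Syne Tune base class; the class name and the parameter names are illustrative only.

```python
from typing import Any, Dict


class MyEstimator:
    """Sketch of the Estimator parameter interface (hypothetical example)."""

    def __init__(self):
        # Tunable (hyper)parameters, e.g. covariance function parameters
        # for a Gaussian process model. "noise_variance" is a made-up name.
        self._params: Dict[str, Any] = {"noise_variance": 1e-3}

    def get_params(self) -> Dict[str, Any]:
        # Return the current tunable model parameters
        return dict(self._params)

    def set_params(self, param_dict: Dict[str, Any]):
        # Overwrite the tunable model parameters
        self._params = dict(param_dict)
```

A caller such as ModelStateTransformer can read the parameters, modify them externally (e.g. after marginal likelihood optimization), and write them back.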

fit_from_state(state, update_params)[source]

Creates a Predictor object based on data in state. For a Bayesian model, this involves computing the posterior state, which is wrapped in the Predictor object.

If the model also has (hyper)parameters, these are learned iff update_params == True. Otherwise, these parameters are not changed; only the posterior state is computed. The idea is that, in general, model fitting is much more expensive than just creating the final posterior state (or predictor), so it can make sense to work with stale model parameters part of the time.

If your surrogate model is not Bayesian, or does not have hyperparameters, you can ignore the update_params argument.

Parameters:
  • state (TuningJobState) – Current data the model parameters are to be fit on, and from which the posterior state is to be computed

  • update_params (bool) – See above

Return type:

Predictor

Returns:

Predictor, wrapping the posterior state
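The update_params contract can be sketched as follows. This is a hypothetical stand-in (no real TuningJobState or Predictor types): expensive hyperparameter fitting runs only when update_params is True, while the cheap posterior computation runs on every call, possibly with stale parameters.

```python
class SketchEstimator:
    """Hypothetical sketch of the fit_from_state() contract."""

    def __init__(self):
        # "lengthscale" is an illustrative hyperparameter name
        self._params = {"lengthscale": 1.0}
        self.num_refits = 0  # counts expensive fits, for illustration

    def _learn_params(self, state):
        # Expensive step: hyperparameter optimization (stubbed out here)
        self.num_refits += 1

    def fit_from_state(self, state, update_params):
        if update_params:
            self._learn_params(state)
        # Cheap step: compute the posterior state from the data in `state`
        # with the current (possibly stale) parameters, and wrap it in a
        # predictor-like object (a plain dict in this sketch)
        return {"posterior_params": dict(self._params), "data": state}
```

Calling fit_from_state repeatedly with update_params=False refreshes the posterior on new data without re-running the expensive fit.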

property debug_log: DebugLogPrinter | None
configure_scheduler(scheduler)[source]

Called by configure_scheduler() of searchers which make use of an Estimator. Allows the estimator to depend on parameters of the scheduler.

Parameters:

scheduler – Scheduler object

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.estimator.TransformedData(features, targets, mean, std)[source]

Bases: object

features: ndarray
targets: ndarray
mean: float
std: float
syne_tune.optimizer.schedulers.searchers.bayesopt.models.estimator.transform_state_to_data(state, active_metric=None, normalize_targets=True, num_fantasy_samples=1)[source]

Transforms a TuningJobState object state to features and targets. The former are encoded vectors from state.hp_ranges. The latter are normalized to zero mean, unit variance if normalize_targets == True, in which case the original mean and stddev are also returned.

If state.pending_evaluations is not empty, it must contain entries of type FantasizedPendingEvaluation, which contain the fantasy samples. This is the case only for internal states.

Parameters:
  • state (TuningJobState) – TuningJobState to transform

  • active_metric (Optional[str]) – Name of target metric (optional)

  • normalize_targets (bool) – Normalize targets? Defaults to True

  • num_fantasy_samples (int) – Number of fantasy samples. Defaults to 1

Return type:

TransformedData

Returns:

Transformed data
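The target normalization performed when normalize_targets == True can be sketched as below. This is a standalone illustration, not the actual library code; the guard against constant targets is an assumption of this sketch.

```python
import numpy as np


def normalize_targets(targets: np.ndarray):
    """Shift/scale targets to zero mean, unit variance (illustrative sketch).

    Returns the normalized targets along with the original mean and stddev,
    so that predictions can later be mapped back to the original scale.
    """
    mean = float(np.mean(targets))
    std = float(np.std(targets))
    if std == 0.0:
        std = 1.0  # avoid division by zero for constant targets
    return (targets - mean) / std, mean, std
```

Keeping mean and std in TransformedData is what lets a predictor trained on normalized targets report predictions in the metric's original units.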