syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl module

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.EIAcquisitionFunction(predictor, active_metric=None, jitter=0.01, debug_collect_stats=False)[source]

Bases: MeanStdAcquisitionFunction

Minus expected improvement acquisition function (minus because the convention is to always minimize acquisition functions).

debug_stats_message()[source]
Return type:

str
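
A minimal sketch of the quantity this acquisition function computes, assuming Gaussian predictions: the negated expected improvement of predictive means and standard deviations with respect to the incumbent (best observed) value. The helper below is illustrative only and does not use the class's internal predictor API; `f_best` and `jitter` mirror the incumbent value and the constructor's jitter argument.

```python
import numpy as np
from scipy.stats import norm

def neg_expected_improvement(mean, std, f_best, jitter=0.01):
    """Minus EI of Gaussian predictions (mean, std) w.r.t. incumbent f_best."""
    std = np.maximum(std, 1e-12)        # guard against zero predictive variance
    u = (f_best - mean - jitter) / std  # standardized improvement
    ei = std * (u * norm.cdf(u) + norm.pdf(u))
    return -ei                          # negated, since acquisitions are minimized

mean = np.array([0.25, 0.30])
std = np.array([0.05, 0.20])
print(neg_expected_improvement(mean, std, f_best=0.28))
```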

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.LCBAcquisitionFunction(predictor, kappa, active_metric=None)[source]

Bases: MeanStdAcquisitionFunction

Lower confidence bound (LCB) acquisition function:

\[h(\mu, \sigma) = \mu - \kappa \cdot \sigma\]
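
A minimal sketch of this head function, assuming Gaussian predictions; `kappa` mirrors the constructor argument:

```python
import numpy as np

def lcb(mean, std, kappa):
    # h(mu, sigma) = mu - kappa * sigma: optimistic lower bound on the objective
    return mean - kappa * std

mean = np.array([0.25, 0.30])
std = np.array([0.05, 0.20])
print(lcb(mean, std, kappa=1.0))  # more uncertain candidates receive lower (better) scores
```
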
class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.EIpuAcquisitionFunction(predictor, active_metric=None, exponent_cost=1.0, jitter=0.01)[source]

Bases: MeanStdAcquisitionFunction

Minus cost-aware expected improvement acquisition function.

This is defined as

\[\mathrm{EIpu}(x) = \frac{\mathrm{EI}(x)}{\mathrm{cost}(x)^{\mathrm{exponent\_cost}}},\]

where \(\mathrm{EI}(x)\) is expected improvement, \(\mathrm{cost}(x)\) is the predictive mean of a cost model, and exponent_cost is an exponent in \((0, 1]\).

exponent_cost scales the influence of the cost term on the acquisition function. See also:

Lee et al.
Cost-aware Bayesian Optimization

Note: two metrics are expected in the model output: the main objective and the cost. The main objective needs to be indicated as active_metric when initializing EIpuAcquisitionFunction. The cost is automatically assumed to be the other metric.

Parameters:
  • predictor (Union[Predictor, Dict[str, Predictor]]) – Predictors for main objective and cost

  • active_metric (Optional[str]) – Name of main objective

  • exponent_cost (float) – Exponent for cost in denominator. Defaults to 1

  • jitter (float) – Jitter factor, must be positive. Defaults to 0.01
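
A minimal sketch of the combination defined above, assuming Gaussian predictions for the objective and a predictive cost mean for each candidate; `cost_mean`, `f_best`, and the helper below are illustrative stand-ins, not the class's internal API:

```python
import numpy as np
from scipy.stats import norm

def neg_eipu(mean, std, cost_mean, f_best, exponent_cost=1.0, jitter=0.01):
    """Minus EIpu: EI divided by cost(x) ** exponent_cost, negated for minimization."""
    std = np.maximum(std, 1e-12)
    u = (f_best - mean - jitter) / std
    ei = std * (u * norm.cdf(u) + norm.pdf(u))
    return -ei / np.power(cost_mean, exponent_cost)

mean = np.array([0.25, 0.30])
std = np.array([0.05, 0.20])
cost_mean = np.array([10.0, 120.0])  # e.g. predicted training time per candidate
print(neg_eipu(mean, std, cost_mean, f_best=0.28, exponent_cost=0.5))
```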

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.ConstraintCurrentBestProvider(current_best_list, num_samples_active)[source]

Bases: CurrentBestProvider

Here, current_best depends on two predictors: one for the active metric and one for the constraint metric.

class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.CEIAcquisitionFunction(predictor, active_metric=None, jitter=0.01)[source]

Bases: MeanStdAcquisitionFunction

Minus constrained expected improvement acquisition function (minus because the convention is to always minimize acquisition functions).

This is defined as

\[\mathrm{CEI}(x) = \mathrm{EI}(x) \cdot P(c(x) \leq 0),\]

where \(\mathrm{EI}(x)\) is the standard expected improvement with respect to the current feasible best, and \(P(c(x) \leq 0)\) is the probability that the hyperparameter configuration \(x\) satisfies the constraint modeled by \(c(x)\).

If there are no feasible hyperparameters yet, the current feasible best is undefined. Thus, CEI is reduced to the \(P(c(x) \leq 0)\) term until a feasible configuration is found.

Two metrics are expected in the model output: the main objective and the constraint metric. The main objective needs to be indicated as active_metric when initializing CEIAcquisitionFunction. The constraint is automatically assumed to be the other metric.

References on CEI:

Gardner et al.
Bayesian Optimization with Inequality Constraints
ICML 2014

and

Gelbart et al.
Bayesian Optimization with Unknown Constraints
UAI 2014

Parameters:
  • predictor (Union[Predictor, Dict[str, Predictor]]) – Predictors for main objective and constraint metric

  • active_metric (Optional[str]) – Name of main objective

  • jitter (float) – Jitter factor, must be positive. Defaults to 0.01
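
A minimal sketch of the combination described above, assuming Gaussian predictions for both the objective and the constraint metric (feasible when the constraint value is at most zero); the helper is illustrative only and does not use the class's internal predictor API:

```python
import numpy as np
from scipy.stats import norm

def neg_constrained_ei(mean, std, c_mean, c_std, f_best=None, jitter=0.01):
    """Minus CEI: EI w.r.t. the feasible incumbent times P(c(x) <= 0).

    If no feasible configuration has been observed yet (f_best is None),
    only the feasibility probability is used, as described above.
    """
    c_std = np.maximum(c_std, 1e-12)
    prob_feasible = norm.cdf(-c_mean / c_std)  # P(c(x) <= 0) under Gaussian prediction
    if f_best is None:
        return -prob_feasible
    std = np.maximum(std, 1e-12)
    u = (f_best - mean - jitter) / std
    ei = std * (u * norm.cdf(u) + norm.pdf(u))
    return -ei * prob_feasible

mean, std = np.array([0.25, 0.30]), np.array([0.05, 0.20])
c_mean, c_std = np.array([-0.1, 0.4]), np.array([0.2, 0.3])  # constraint predictions
print(neg_constrained_ei(mean, std, c_mean, c_std, f_best=0.28))
```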

syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc_impl.get_quantiles(acquisition_par, fmin, m, s)[source]

Quantiles of the Gaussian distribution, used to compute acquisition function values.

Parameters:
  • acquisition_par – parameter of the acquisition function

  • fmin – current minimum

  • m – vector of means

  • s – vector of standard deviations
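
A hedged sketch of what a helper like this typically computes (the pattern goes back to GPyOpt's get_quantiles): the standardized improvement together with the Gaussian pdf and cdf evaluated at it, which are the building blocks of the EI-type formulas above. The exact return convention of the Syne Tune function may differ:

```python
import numpy as np
from scipy.stats import norm

def get_quantiles_sketch(acquisition_par, fmin, m, s):
    """Standardized improvement u and Gaussian pdf/cdf at u (illustrative only)."""
    s = np.maximum(s, 1e-10)              # avoid division by zero
    u = (fmin - m - acquisition_par) / s  # standardized improvement
    return norm.pdf(u), norm.cdf(u), u

m = np.array([0.25, 0.30])
s = np.array([0.05, 0.20])
phi, Phi, u = get_quantiles_sketch(0.01, fmin=0.28, m=m, s=s)
ei = s * (u * Phi + phi)  # EI values assembled from these quantiles
print(ei)
```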