syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc module
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc.HeadWithGradient(hval, gradient)[source]
  Bases: object

  gradient maps each output model to a dict of head gradients, whose keys are those used by predict (e.g., mean, std).

  - hval: ndarray
  - gradient: Dict[str, Dict[str, ndarray]]
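To make the data layout concrete, here is a minimal illustrative sketch of the structure described above (the class name `HeadWithGradientSketch` and the output-model name `"target"` are assumptions for illustration, not the library's own code):

```python
from dataclasses import dataclass
from typing import Dict

import numpy as np


@dataclass
class HeadWithGradientSketch:
    """Illustrative stand-in for HeadWithGradient: the head value plus,
    per output model, head gradients keyed the same way as ``predict``
    output (e.g. "mean", "std")."""
    hval: np.ndarray
    gradient: Dict[str, Dict[str, np.ndarray]]


# Head value and its gradients w.r.t. mean and std for one output model
head = HeadWithGradientSketch(
    hval=np.array([0.25]),
    gradient={"target": {"mean": np.array([1.0]), "std": np.array([-0.5])}},
)
print(head.gradient["target"]["mean"])  # gradient of the head w.r.t. the predictive mean
```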
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc.CurrentBestProvider[source]
  Bases: object

  Helper class for MeanStdAcquisitionFunction. The current_best values required in compute_acq() and compute_acq_with_gradient() may depend on the MCMC sample index for each model (if none of the models use MCMC, this index is always (0, 0, ..., 0)).
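The index-tuple lookup described above can be sketched as follows. This is a hypothetical minimal provider (the class name `DictCurrentBestProvider` and its constructor are assumptions, not the library's API); it only illustrates how current_best values could be keyed by one MCMC sample position per model:

```python
from typing import Dict, Optional, Tuple

import numpy as np


class DictCurrentBestProvider:
    """Hypothetical sketch: maps an MCMC sample index tuple (one entry
    per output model) to the current-best value for that combination.
    When no model uses MCMC, the only key is (0, 0, ..., 0)."""

    def __init__(self, values: Dict[Tuple[int, ...], np.ndarray]):
        self._values = values

    def __call__(self, positions: Tuple[int, ...]) -> Optional[np.ndarray]:
        # Return None if no current_best is stored for this sample index
        return self._values.get(positions)


# Two output models, neither using MCMC: single entry at index (0, 0)
provider = DictCurrentBestProvider({(0, 0): np.array([0.42])})
print(provider((0, 0)))
```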
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc.NoneCurrentBestProvider[source]
  Bases: CurrentBestProvider
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc.ActiveMetricCurrentBestProvider(active_metric_current_best)[source]
  Bases: CurrentBestProvider

  Default implementation in which current_best depends on the active metric only.
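A sketch of this default behaviour, assuming the interpretation above: since current_best depends only on the active metric, the MCMC sample index is ignored and every query returns the same value (class name `ActiveMetricOnlyProvider` is hypothetical, for illustration only):

```python
from typing import Tuple

import numpy as np


class ActiveMetricOnlyProvider:
    """Hypothetical sketch of the default implementation: current_best
    is determined by the active metric alone, so the MCMC sample index
    passed by the caller does not affect the result."""

    def __init__(self, active_metric_current_best: np.ndarray):
        self._current_best = active_metric_current_best.reshape((1, -1))

    def __call__(self, positions: Tuple[int, ...]) -> np.ndarray:
        # Same value regardless of the MCMC sample index
        return self._current_best


provider = ActiveMetricOnlyProvider(np.array([0.1]))
print(provider((0,)))
print(provider((3, 1)))  # identical: the index does not matter
```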
- class syne_tune.optimizer.schedulers.searchers.bayesopt.models.meanstd_acqfunc.MeanStdAcquisitionFunction(predictor, active_metric=None)[source]
  Bases: AcquisitionFunction

  Base class for standard acquisition functions which depend on predictive mean and stddev. Subclasses have to implement the head and its derivatives w.r.t. mean and std:

  \[f(x, \mathrm{model}) = h(\mathrm{mean}, \mathrm{std}, \mathrm{model.current\_best}())\]

  If model is a Predictor, then active_metric is ignored. If model is a dict mapping output names to models, then active_metric must be given.

  Note that acquisition functions will always be minimized!
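As one concrete instance of such a head h(mean, std, current_best), here is a sketch of negative expected improvement under a metric that is minimized. The negation matches the convention above that acquisition functions are always minimized; the function name and formula are a standard textbook EI, not code from this module:

```python
import numpy as np
from scipy.stats import norm


def neg_expected_improvement(mean, std, current_best):
    """Illustrative head h(mean, std, current_best): negative expected
    improvement over current_best, assuming the metric is minimized.
    Minimizing this head maximizes expected improvement."""
    z = (current_best - mean) / std
    ei = std * (z * norm.cdf(z) + norm.pdf(z))
    return -ei


# Larger predictive std at equal mean gives larger EI, hence a more
# negative head value
h1 = neg_expected_improvement(np.array([0.0]), np.array([0.5]), current_best=0.0)
h2 = neg_expected_improvement(np.array([0.0]), np.array([1.0]), current_best=0.0)
print(h1, h2)
```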
- compute_acq(inputs, predictor=None)[source]
  Note: If inputs has shape (d,), it is taken to be (1, d)
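The shape convention noted above can be sketched as a small helper (the function name `canonicalize_inputs` is an assumption for illustration; only the documented convention itself comes from the source):

```python
import numpy as np


def canonicalize_inputs(inputs: np.ndarray) -> np.ndarray:
    """Sketch of the documented shape convention: a single input of
    shape (d,) is treated as a batch of one input, shape (1, d)."""
    if inputs.ndim == 1:
        inputs = inputs.reshape((1, -1))
    return inputs


print(canonicalize_inputs(np.array([0.1, 0.2, 0.3])).shape)  # (1, 3)
```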