syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.exponential_decay module
- class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.exponential_decay.ExponentialDecayResourcesKernelFunction(kernel_x, mean_x, encoding_type='logarithm', alpha_init=1.0, mean_lam_init=0.5, gamma_init=0.5, delta_fixed_value=None, delta_init=0.5, max_metric_value=1.0, **kwargs)[source]
Bases: KernelFunction
Variant of the kernel function for modeling exponentially decaying learning curves, proposed in:

Swersky, K., Snoek, J., & Adams, R. P. (2014). Freeze-Thaw Bayesian Optimization.

The argument in that paper actually justifies using a non-zero mean function (see ExponentialDecayResourcesMeanFunction) and centralizing the kernel proposed there. This is done here. Details in:

Tiao, Klein, Archambeau, Seeger (2020). Model-based Asynchronous Hyperparameter Optimization.

We implement a new family of kernel functions, of which the additive Freeze-Thaw kernel is one instance (delta == 0). The kernel has parameters alpha, mean_lam, gamma > 0, and 0 <= delta <= 1. Note that beta = alpha / mean_lam is used in the Freeze-Thaw paper (the Gamma distribution over lambda is parameterized differently there). The additive Freeze-Thaw kernel is obtained for delta == 0 (use delta_fixed_value = 0).

This class is configured with a kernel and a mean function over inputs x (dimension d) and represents a kernel (and mean function) over inputs (x, r) (dimension d + 1), where the resource attribute r >= 0 comes last.
- forward(X1, X2, **kwargs)[source]
Overrides to implement forward computation using NDArray. Only accepts positional arguments.
- Parameters:
*args (list of NDArray) – Input tensors
- diagonal(X)[source]
- Parameters:
X – Input data, shape (n, d)
- Returns:
Diagonal of \(k(X, X)\), shape (n,)
- diagonal_depends_on_X()[source]
For stationary kernels, the diagonal does not depend on X.
- Returns:
Does diagonal() depend on X?
- param_encoding_pairs()[source]
Returns list of tuples (param_internal, encoding) over all Gluon parameters maintained here.
- Returns:
List [(param_internal, encoding)]
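For the delta == 0 instance (the additive Freeze-Thaw kernel), the resource part of the kernel has a closed form in the Freeze-Thaw paper: integrating exp(-lam * (r + r')) against a Gamma(alpha, beta) prior on the decay rate lam gives kappa(r, r') = beta^alpha / (r + r' + beta)^alpha, with beta = alpha / mean_lam as noted above. A minimal sketch in plain Python, with defaults mirroring alpha_init and mean_lam_init; the standalone function below is illustrative only and is not part of the syne_tune API:

```python
def resource_kernel(r1: float, r2: float, alpha: float = 1.0, mean_lam: float = 0.5) -> float:
    """Resource part of the additive Freeze-Thaw kernel (delta == 0 case).

    kappa(r, r') = beta^alpha / (r + r' + beta)^alpha, with beta = alpha / mean_lam:
    the expectation of exp(-lam * (r + r')) under a Gamma(alpha, beta) prior on lam.
    """
    beta = alpha / mean_lam  # the Freeze-Thaw paper parameterizes by beta directly
    return (beta / (r1 + r2 + beta)) ** alpha


# kappa(0, 0) == 1, and the kernel decays as resources grow, mirroring an
# exponentially decaying learning curve. Note the diagonal kappa(r, r) varies
# with r, so this kernel is not stationary in the resource dimension.
print(resource_kernel(0.0, 0.0))  # 1.0
print(resource_kernel(1.0, 1.0))  # 0.5 (alpha=1, beta=2)
```

How delta > 0 combines this resource part with the kernel over x is internal to the implementation; per the docstring above, delta_fixed_value = 0 recovers exactly the additive case.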
- class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.exponential_decay.ExponentialDecayResourcesMeanFunction(kernel, **kwargs)[source]
Bases: MeanFunction
- forward(X)[source]
Overrides to implement forward computation using NDArray. Only accepts positional arguments.
- Parameters:
*args (list of NDArray) – Input tensors