syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.freeze_thaw module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.freeze_thaw.FreezeThawKernelFunction(kernel_x, mean_x, encoding_type='logarithm', alpha_init=1.0, mean_lam_init=0.5, gamma_init=0.5, max_metric_value=1.0, **kwargs)[source]

Bases: KernelFunction

Variant of the kernel function for modeling exponentially decaying learning curves, proposed in:

Swersky, K., Snoek, J., & Adams, R. P. (2014). Freeze-Thaw Bayesian Optimization. arXiv:1406.3896 [cs, stat]. Retrieved from http://arxiv.org/abs/1406.3896

The argument in that paper actually justifies using a non-zero mean function (see ExponentialDecayResourcesMeanFunction) and centralizing the kernel proposed there. This is done here.

As in the Freeze-Thaw paper, learning curves for different configs are conditionally independent.

This class is configured with a kernel and a mean function over inputs x (dimension d) and represents a kernel (and mean function) over inputs (x, r) (dimension d + 1), where the resource attribute r >= 0 is last.

Note: This kernel is mainly intended for debugging. Its conditional independence assumptions allow for faster inference, as implemented in GaussProcExpDecayPosteriorState.
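For intuition, the resource part of such a kernel follows the exponential-decay form from the Freeze-Thaw paper, k(t, t') = β^α / (t + t' + β)^α, obtained by integrating exponential decays e^{-λ(t + t')} against a Gamma(α, β) prior on the decay rate λ. A minimal pure-Python sketch (parameter names here are illustrative; the actual class wires alpha and mean_lam through its own encodings):

```python
import math


def freeze_thaw_resource_kernel(t, t_prime, alpha=1.0, beta=0.5):
    """Exponential-decay covariance over resource levels (epochs):

        k(t, t') = beta**alpha / (t + t' + beta)**alpha

    This is the mixture-of-exponential-decays kernel from Swersky et al.
    (2014); ``alpha`` and ``beta`` are Gamma-prior parameters on the
    decay rate and are only loosely analogous to ``alpha_init`` and
    ``mean_lam_init`` in the class above.
    """
    return beta ** alpha / (t + t_prime + beta) ** alpha
```

Note that k(0, 0) = 1 and the covariance decays monotonically as the combined resource t + t' grows, which is what makes the kernel suitable for modeling curves that flatten out over epochs.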

forward(X1, X2, **kwargs)[source]

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

Parameters:

*args – Input tensors (list of NDArray)

diagonal(X)[source]
Parameters:

X – Input data, shape (n, d)

Returns:

Diagonal of k(X, X), shape (n,)
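Conceptually, diagonal(X) returns the n diagonal entries of k(X, X) without materializing the full n-by-n kernel matrix. A toy sketch using a hypothetical scalar squared-exponential kernel (illustrative only, not syne_tune's implementation):

```python
import math


def rbf(x1, x2, lengthscale=1.0):
    # Toy squared-exponential kernel on scalar inputs (illustrative only).
    return math.exp(-0.5 * ((x1 - x2) / lengthscale) ** 2)


def diagonal(X, kernel=rbf):
    # Equivalent to taking the diagonal of the full kernel matrix
    # [[kernel(xi, xj) for xj in X] for xi in X], but in O(n) work.
    return [kernel(x, x) for x in X]
```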

diagonal_depends_on_X()[source]

For stationary kernels, the diagonal does not depend on X.

Returns:

Does diagonal() depend on X?

param_encoding_pairs()[source]
Returns a list of tuples (param_internal, encoding) over all Gluon parameters maintained here.

Returns:

List [(param_internal, encoding)]
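An encoding typically maps a constrained hyperparameter to an unconstrained internal value for the optimizer; with encoding_type='logarithm', a positive parameter would be stored as its logarithm. A hypothetical sketch of such an encoding (LogEncoding is an illustrative stand-in, not syne_tune's encoding class):

```python
import math


class LogEncoding:
    # Hypothetical stand-in for an encoding object: a positive
    # hyperparameter p is represented internally as log(p), so
    # unconstrained gradient updates on the internal value always
    # decode back to p > 0.
    def encode(self, value):
        # external (positive) -> internal (unconstrained)
        return math.log(value)

    def decode(self, internal):
        # internal (unconstrained) -> external (positive)
        return math.exp(internal)
```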

mean_function(X)[source]
get_params()[source]

Parameter keys are alpha, mean_lam, gamma, delta (only if not fixed to delta_fixed_value), as well as those of self.kernel_x (prefix ‘kernelx_’) and of self.mean_x (prefix ‘meanx_’).

Return type:

Dict[str, Any]

set_params(param_dict)[source]
Parameters:

param_dict (Dict[str, Any]) – Dictionary with new hyperparameter values

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.kernel.freeze_thaw.FreezeThawMeanFunction(kernel, **kwargs)[source]

Bases: MeanFunction

forward(X)[source]

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

Parameters:

*args – Input tensors (list of NDArray)

param_encoding_pairs()[source]
Returns a list of tuples (param_internal, encoding) over all Gluon parameters maintained here.

Returns:

List [(param_internal, encoding)]

get_params()[source]
Return type:

Dict[str, Any]

Returns:

Dictionary with hyperparameter values

set_params(param_dict)[source]
Parameters:

param_dict (Dict[str, Any]) – Dictionary with new hyperparameter values
