syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.learncurve.likelihood module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.learncurve.likelihood.GaussAdditiveMarginalLikelihood(kernel, res_model, mean=None, initial_noise_variance=None, encoding_type=None, **kwargs)[source]

Bases: MarginalLikelihood

Marginal likelihood of the joint learning curve model, where each curve is modelled as the sum of a Gaussian process over x (for the value at r_max) and a Gaussian model over r.

The latter res_model is either an ISSM or another Gaussian process with exponential decay covariance function.
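As a minimal sketch of this additive structure (plain NumPy, with toy kernel parameterizations that are assumptions, not Syne Tune's actual ones): modelling each curve as f(x, r) = g(x) + h(r) with independent g and h means the joint covariance splits into the sum of a kernel over x and a kernel over r.

```python
import numpy as np

def k_x(x1, x2, length=1.0):
    # Toy squared-exponential kernel over configurations x
    # (stands in for the `kernel` argument of the class).
    return np.exp(-0.5 * ((x1 - x2) / length) ** 2)

def k_r(r1, r2, alpha=1.0):
    # Toy covariance over resource levels r that decays as r grows,
    # standing in for `res_model` (exponential-decay GP variant).
    return np.exp(-alpha * (r1 + r2))

def k_additive(x1, r1, x2, r2):
    # Covariance of f(x, r) = g(x) + h(r) with independent g, h:
    # the sum of the two kernels.
    return k_x(x1, x2) + k_r(r1, r2)
```

In this toy parameterization the residual term shrinks as r and r' grow, so observations near full resource are governed mainly by the kernel over x, matching the "value at r_max" interpretation above.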

Parameters:
  • kernel – Covariance function of the Gaussian process over x
  • res_model – Gaussian model over r: an ISSM, or a Gaussian process with exponential decay covariance function
  • mean – Mean function of the Gaussian process over x (optional)
  • initial_noise_variance – Initial value of the noise variance parameter (optional)
  • encoding_type – Encoding used for the parameters (optional)
get_posterior_state(data)[source]
Return type:

PosteriorState

forward(data)[source]

Overrides to implement forward computation using NDArray. Only accepts positional arguments.

Parameters:
  • *args (list of NDArray) – Input tensors.

param_encoding_pairs()[source]

Return a list of tuples with the Gluon parameters of the likelihood and their respective encodings.

Return type:

List[tuple]
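As a hedged sketch of the (parameter, encoding) pairing (the encoding class and its `get` method here are made-up stand-ins, not Syne Tune's API): each tuple pairs a raw internal parameter with an encoding that maps it to the constrained value used by the model, e.g. an exponential map enforcing positivity.

```python
import math

class ExpEncoding:
    # Toy positivity encoding: internal value u maps to exp(u).
    # (Assumption: Syne Tune's actual encodings differ in detail.)
    def get(self, internal):
        return math.exp(internal)

def decode_params(param_encoding_pairs):
    # Each tuple pairs a raw (internal) parameter with its encoding;
    # decoding yields the constrained value consumed by the model.
    return [enc.get(p) for p, enc in param_encoding_pairs]

pairs = [(0.0, ExpEncoding()), (math.log(2.0), ExpEncoding())]
decoded = decode_params(pairs)  # approximately [1.0, 2.0]
```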

get_noise_variance(as_ndarray=False)[source]

Return the current noise variance, as an ndarray if as_ndarray is True, otherwise as a scalar.
get_params()[source]
Return type:

Dict[str, Any]

set_params(param_dict)[source]
data_precomputations(data, overwrite=False)[source]

Some models require precomputations based on data. Precomputed variables are appended to data. This is done only if not already included in data, unless overwrite is True.

Parameters:
  • data (Dict[str, Any]) –

  • overwrite (bool) –
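The caching contract just described can be sketched as follows (a minimal illustration; the key name and the precomputation itself are made up, not what the library stores):

```python
def data_precomputations(data, overwrite=False):
    # Append precomputed variables to `data` only if not already
    # present, unless overwrite=True forces a recomputation.
    # (Key "prefix_sums" and the computation are hypothetical.)
    if overwrite or "prefix_sums" not in data:
        acc, out = 0.0, []
        for y in data["targets"]:
            acc += y
            out.append(acc)
        data["prefix_sums"] = out

data = {"targets": [1.0, 2.0, 3.0]}
data_precomputations(data)                   # computes and stores
data["targets"] = [10.0]
data_precomputations(data)                   # already present: skipped
data_precomputations(data, overwrite=True)   # forced recomputation
```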

on_fit_start(data)[source]

Called at the beginning of fit.

Parameters:

data (Dict[str, Any]) – Argument passed to fit