syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gp_model module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gp_model.GaussianProcessModel(random_seed=None)[source]

Bases: object

Base class for Gaussian-linear models that support parameter fitting and prediction.

property random_state: RandomState
property states: List[PosteriorState] | None
Returns:

Current posterior states (one per MCMC sample; just a single state if model parameters are optimized)

fit(data)[source]

Adjust model parameters based on training data data. Can be done via optimization or MCMC sampling. The posterior states are computed at the end as well.

Parameters:

data (Dict[str, Any]) – Training data

recompute_states(data)[source]

Recomputes posterior states for current model parameters.

Parameters:

data (Dict[str, Any]) – Training data

predict(features_test)[source]

Compute the posterior mean(s) and variance(s) for the points in features_test. If the posterior state is based on m target vectors, the posterior means form an (n, m) matrix.

Parameters:

features_test (ndarray) – Data matrix X_test of size (n, d) (type np.ndarray) for which n predictions are made

Returns:

posterior_means, posterior_variances
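To illustrate the shape conventions of predict, the following is a minimal NumPy sketch of exact GP posterior prediction. It is not the Syne Tune implementation: the kernel, the helper names (rbf_kernel, gp_predict), and the fixed noise variance are all assumptions made for illustration; with m fantasized target columns the posterior means come out as an (n, m) matrix, while the variances are shared across columns.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Hypothetical squared-exponential kernel between row sets a (n, d), b (p, d)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    # Exact GP posterior: means (n, m) for m target columns, variances (n,)
    k_tt = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    k_st = rbf_kernel(x_test, x_train)
    k_inv = np.linalg.solve(k_tt, np.eye(len(x_train)))
    mean = k_st @ k_inv @ y_train
    # Prior variance at a test point is 1.0 for this kernel
    var = 1.0 - np.einsum("ij,jk,ik->i", k_st, k_inv, k_st)
    return mean, var

rng = np.random.RandomState(0)
x_train = rng.rand(8, 2)
y_train = rng.rand(8, 3)            # m = 3 fantasized target vectors
x_test = rng.rand(5, 2)
means, variances = gp_predict(x_train, y_train, x_test)
print(means.shape, variances.shape)  # means are (n, m) = (5, 3)
```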

multiple_targets()[source]
Returns:

True if the posterior state is based on multiple (fantasized) target vectors, otherwise False

sample_marginals(features_test, num_samples=1)[source]

Draws marginal samples from the predictive distribution at n test points. Samples from all posterior states are concatenated along the last axis; let n_states = len(self._states).

If the posterior state is based on m > 1 target vectors, a (n, m, num_samples * n_states) tensor is returned, for m == 1 we return a (n, num_samples * n_states) matrix.

Parameters:
  • features_test (ndarray) – Test input points, shape (n, d)

  • num_samples (int) – Number of samples

Returns:

Samples with shape (n, num_samples * n_states) or (n, m, num_samples * n_states) if m > 1
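The concatenation over posterior states can be sketched as follows. This is a NumPy illustration of the shape contract only, not the library's code; the function name and the list-of-arrays inputs are assumptions. Each state contributes num_samples independent Gaussian draws per test point, and the per-state blocks are stacked along the last axis.

```python
import numpy as np

def sample_marginals(means_per_state, vars_per_state, num_samples, rng):
    # means_per_state: list of (n, m) arrays, one per posterior state
    # vars_per_state: list of (n,) arrays of marginal variances
    chunks = []
    for mean, var in zip(means_per_state, vars_per_state):
        std = np.sqrt(var)[:, None, None]                         # (n, 1, 1)
        eps = rng.randn(mean.shape[0], mean.shape[1], num_samples)
        chunks.append(mean[:, :, None] + std * eps)               # (n, m, num_samples)
    samples = np.concatenate(chunks, axis=-1)  # concat over states: last axis
    if samples.shape[1] == 1:
        samples = samples[:, 0, :]             # m == 1: drop the middle axis
    return samples

rng = np.random.RandomState(1)
means = [rng.rand(4, 1) for _ in range(3)]     # n_states = 3, n = 4, m = 1
variances = [np.full(4, 0.1) for _ in range(3)]
samples = sample_marginals(means, variances, num_samples=5, rng=rng)
print(samples.shape)  # (n, num_samples * n_states) = (4, 15)
```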

sample_joint(features_test, num_samples=1)[source]

Draws joint samples from the predictive distribution at n test points. This scales cubically with n. The posterior state must be based on a single target vector (m > 1 is not supported).

Parameters:
  • features_test (ndarray) – Test input points, shape (n, d)

  • num_samples (int) – Number of samples

Returns:

Samples, shape (n, num_samples)
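The cubic scaling in n comes from factorizing the full n-by-n posterior covariance. A minimal NumPy sketch of joint sampling via a Cholesky factor (names and jitter value are illustrative, not the library's internals):

```python
import numpy as np

def sample_joint(mean, cov, num_samples, rng):
    # mean: (n,) posterior mean; cov: (n, n) posterior covariance
    # Cholesky factorization costs O(n^3), hence the cubic scaling in n
    chol = np.linalg.cholesky(cov + 1e-9 * np.eye(len(mean)))
    eps = rng.randn(len(mean), num_samples)
    return mean[:, None] + chol @ eps          # shape (n, num_samples)

rng = np.random.RandomState(2)
n = 6
a = rng.rand(n, n)
cov = a @ a.T + np.eye(n)                      # symmetric positive definite
samples = sample_joint(np.zeros(n), cov, num_samples=10_000, rng=rng)
print(samples.shape)  # (n, num_samples) = (6, 10000)
```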

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gp_model.GaussianProcessOptimizeModel(optimization_config=None, random_seed=None, fit_reset_params=True)[source]

Bases: GaussianProcessModel

Base class for models where parameters are fit by maximizing the marginal likelihood.

property states: List[PosteriorState] | None
Returns:

Current posterior states (one per MCMC sample; just a single state if model parameters are optimized)

property likelihood: MarginalLikelihood
fit(data)[source]

Fit the model parameters by optimizing the marginal likelihood, and set posterior states.

We catch exceptions during the optimization restarts. If any restarts fail, log messages are written. If all restarts fail, the current parameters are not changed.

Parameters:

data (Dict[str, Any]) – Input data
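The restart behavior described above, catching failures per restart and keeping the current parameters if every restart fails, can be sketched generically. This is not the actual marginal-likelihood optimizer; the function names, the toy objective, and the random-proposal scheme are all assumptions used to illustrate the control flow only.

```python
import numpy as np

def fit_with_restarts(objective, initial_params, propose, num_restarts, rng):
    # Try several optimization restarts; exceptions in a restart are caught
    # and logged. If all restarts fail, initial_params is returned unchanged.
    best_params, best_value = initial_params, None
    for i in range(num_restarts):
        try:
            params = initial_params if i == 0 else propose(rng)
            value = objective(params)
            if not np.isfinite(value):
                raise ArithmeticError("non-finite objective value")
            if best_value is None or value < best_value:
                best_params, best_value = params, value
        except Exception as exc:
            print(f"Restart {i} failed: {exc}")   # stand-in for log messages
    return best_params

def neg_log_likelihood(p):
    # Hypothetical stand-in for the marginal-likelihood objective;
    # negative parameters simulate a failed optimization run
    if p < 0:
        raise ValueError("optimizer diverged")
    return (p - 3.0) ** 2

rng = np.random.RandomState(3)
best = fit_with_restarts(neg_log_likelihood, 10.0, lambda r: r.uniform(-10, 10), 20, rng)
```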

recompute_states(data)[source]

Recomputes posterior states for current model parameters.

Parameters:

data (Dict[str, Any]) – Training data

get_params()[source]
Return type:

Dict[str, Any]

set_params(param_dict)[source]
reset_params()[source]

Reset hyperparameters to their initial values (or resample them).