syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gpr_mcmc module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gpr_mcmc.GPRegressionMCMC(build_kernel, mcmc_config=MCMCConfig(n_samples=300, n_burnin=250, n_thinning=5), random_seed=None)[source]

Bases: GaussianProcessModel
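The MCMCConfig defaults in the signature above (n_samples=300, n_burnin=250, n_thinning=5) follow the usual burn-in/thinning scheme. As a rough sketch (assuming the standard bookkeeping, since the internals of MCMCConfig are not shown here), the number of retained posterior samples would be:

```python
def retained_samples(n_samples: int, n_burnin: int, n_thinning: int) -> int:
    # Discard the first n_burnin draws, then keep every n_thinning-th
    # of the remaining ones (assumption: standard burn-in/thinning logic)
    return (n_samples - n_burnin) // n_thinning

# With the defaults above: (300 - 250) // 5 = 10 retained samples
print(retained_samples(300, 250, 5))
```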

property states: List[GaussProcPosteriorState] | None
Returns:

Current posterior states (one per MCMC sample; just a single state if model parameters are optimized)

property number_samples: int
Returns:

Number of posterior states held by the model (one per MCMC sample)

fit(data)[source]

Adjust model parameters based on the training data passed as data. This can be done via optimization or via MCMC sampling. The posterior states are computed at the end as well.

Parameters:

data (Dict[str, Any]) – Training data
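A sketch of what such a training-data dict might look like. The key names "features" and "targets" are an assumption based on common usage in this module, not confirmed by the signature above:

```python
import numpy as np

# Hypothetical training-data dict for fit(); key names are assumptions
data = {
    "features": np.random.rand(20, 3),  # 20 evaluated configs, 3 hyperparameters
    "targets": np.random.rand(20, 1),   # one observed metric value per config
}
print(data["features"].shape, data["targets"].shape)
```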

recompute_states(data)[source]

Supports fantasizing, in that the targets in data can be a matrix. In that case, ycols = targets.shape[1] must be a multiple of self.number_samples.

Parameters:

data (Dict[str, Any]) – Training data
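The divisibility constraint above can be sketched as follows (number_samples stands in for self.number_samples; the even split of fantasy columns across posterior states is an assumption, not confirmed by the docstring):

```python
import numpy as np

number_samples = 10                # stand-in for self.number_samples
targets = np.random.rand(20, 30)   # matrix targets: 30 fantasy columns

ycols = targets.shape[1]
# The fantasizing constraint: column count must divide evenly
assert ycols % number_samples == 0, (
    f"targets has {ycols} columns, not a multiple of {number_samples}"
)
# Presumably each posterior state then handles ycols // number_samples columns
print(ycols // number_samples)
```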