syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.constants module
- class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.constants.OptimizationConfig(lbfgs_tol, lbfgs_maxiter, verbose, n_starts)[source]
  Bases: object
  - lbfgs_tol: float
  - lbfgs_maxiter: int
  - verbose: bool
  - n_starts: int
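As an illustration, a minimal dataclass mirroring the documented fields can be sketched as follows. This is not the actual Syne Tune source, and the per-field comments are assumptions inferred from the field names (the `lbfgs_*` fields presumably parameterize an L-BFGS optimizer run):

```python
from dataclasses import dataclass


@dataclass
class OptimizationConfig:
    # Sketch of the documented fields; meanings inferred from the names.
    lbfgs_tol: float     # assumed: convergence tolerance for L-BFGS
    lbfgs_maxiter: int   # assumed: maximum number of L-BFGS iterations
    verbose: bool        # assumed: whether to log optimizer progress
    n_starts: int        # assumed: number of random restarts


# Constructing an instance with the four documented fields:
config = OptimizationConfig(
    lbfgs_tol=1e-6, lbfgs_maxiter=500, verbose=False, n_starts=5
)
print(config.n_starts)
```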
- class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.constants.MCMCConfig(n_samples, n_burnin, n_thinning)[source]
  Bases: object
  n_samples is the total number of samples drawn. The first n_burnin of these are dropped (burn-in), and every n_thinning-th of the rest is returned. This means we return (n_samples - n_burnin) // n_thinning samples.
  - n_samples: int
  - n_burnin: int
  - n_thinning: int
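The burn-in and thinning arithmetic above can be checked with a short sketch. Only the returned sample count is stated in the documentation; the slicing scheme below is one plausible indexing (an assumption) that yields exactly that count:

```python
def burnin_and_thin(samples, n_burnin, n_thinning):
    """Drop the first n_burnin samples, then keep every n_thinning-th of
    the remainder. Assumed indexing: it is chosen so that the result has
    exactly (len(samples) - n_burnin) // n_thinning entries, matching the
    count documented for MCMCConfig."""
    return samples[n_burnin + n_thinning - 1 :: n_thinning]


samples = list(range(100))  # stand-in for 100 MCMC draws
kept = burnin_and_thin(samples, n_burnin=10, n_thinning=3)
# (100 - 10) // 3 == 30 samples are returned
print(len(kept))
```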