syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.Distribution[source]

Bases: object

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries, i.e. the joint density is the product of the entry-wise densities.
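
A minimal usage sketch of this interface, using the Gamma subclass documented below. The array values are arbitrary, and the comparison at the end only illustrates the i.i.d. convention stated above; it is an assumption about the returned value, not a guarantee from this page:

    import numpy as np

    from syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution import (
        Gamma,
    )

    # Hypothetical prior: Gamma with mean 0.5 and shape alpha = 2, evaluated at a vector x.
    prior = Gamma(mean=0.5, alpha=2.0)
    x = np.array([0.1, 0.5, 1.0])

    nld_vector = prior.negative_log_density(x)
    # Under the i.i.d. convention, the joint density is the product over entries,
    # so the vector value should match the sum of the entry-wise values
    # (assuming the method returns the joint negative log density).
    nld_sum = sum(float(prior.negative_log_density(xi)) for xi in x)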

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.Gamma(mean, alpha)[source]

Bases: Distribution

Gamma(mean, alpha):

p(x) = C(alpha, beta) * x^(alpha - 1) * exp(-beta * x), where beta = alpha / mean and C(alpha, beta) = beta^alpha / Gamma(alpha)

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries.
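
Spelling out the documented density: with beta = alpha / mean, the negative log density of a single value x is -alpha * log(beta) + log Gamma(alpha) - (alpha - 1) * log(x) + beta * x. A short sketch computing this formula and cross-checking it against scipy.stats.gamma (the scipy check and the numeric values are illustrative only, not part of Syne Tune):

    import numpy as np
    from scipy.special import gammaln
    from scipy.stats import gamma as scipy_gamma

    mean, alpha = 0.5, 2.0
    beta = alpha / mean  # rate parameter, as in the docstring
    x = 0.3

    # Negative log density of p(x) = beta^alpha / Gamma(alpha) * x^(alpha - 1) * exp(-beta * x)
    nld = -alpha * np.log(beta) + gammaln(alpha) - (alpha - 1.0) * np.log(x) + beta * x

    # scipy uses a shape/scale parameterization with scale = 1 / beta
    assert np.isclose(nld, -scipy_gamma(a=alpha, scale=1.0 / beta).logpdf(x))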

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.Uniform(lower, upper)[source]

Bases: Distribution

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries.
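
Under the textbook definition p(x) = 1 / (upper - lower) on [lower, upper], the negative log density is the constant log(upper - lower) for each entry, independent of x; since the limits are ignored, no bounds check on x is implied. The sketch below derives that constant from the definition and is not a statement about the exact return value of the implementation:

    import numpy as np

    lower, upper = 0.01, 10.0
    # Per-entry negative log density of a uniform density on [lower, upper]
    nld_per_entry = np.log(upper - lower)
    # For a vector treated i.i.d., the joint value would be the sum over entries
    x = np.array([0.5, 2.0, 7.5])
    nld_vector = x.size * nld_per_entry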

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.Normal(mean, sigma)[source]

Bases: Distribution

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries.
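
For reference, the standard Gaussian negative log density at a single value x is 0.5 * ((x - mean) / sigma)^2 + log(sigma) + 0.5 * log(2 * pi). A short sketch computing it, cross-checked against scipy.stats.norm rather than the library itself (values are arbitrary):

    import numpy as np
    from scipy.stats import norm

    mean, sigma, x = 0.0, 1.0, 0.7
    nld = 0.5 * ((x - mean) / sigma) ** 2 + np.log(sigma) + 0.5 * np.log(2.0 * np.pi)
    assert np.isclose(nld, -norm(loc=mean, scale=sigma).logpdf(x))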

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.LogNormal(mean, sigma)[source]

Bases: Distribution

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries.
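
Assuming that mean and sigma parameterize the underlying normal distribution of log(x) (an assumption; this page does not spell out the parameterization), the negative log density at x > 0 is log(x) + log(sigma) + 0.5 * log(2 * pi) + 0.5 * ((log(x) - mean) / sigma)^2. A sketch of that formula, cross-checked against scipy.stats.lognorm under the same assumption:

    import numpy as np
    from scipy.stats import lognorm

    mean, sigma, x = 0.0, 0.5, 1.3
    # Assumed parameterization: log(x) ~ Normal(mean, sigma)
    nld = (
        np.log(x)
        + np.log(sigma)
        + 0.5 * np.log(2.0 * np.pi)
        + 0.5 * ((np.log(x) - mean) / sigma) ** 2
    )
    # scipy's lognorm with s = sigma and scale = exp(mean) uses the same parameterization
    assert np.isclose(nld, -lognorm(s=sigma, scale=np.exp(mean)).logpdf(x))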

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution.Horseshoe(s)[source]

Bases: Distribution

negative_log_density(x)[source]

Negative log density of x. Lower and upper limits are ignored. If x is not a scalar, the distribution is i.i.d. over all entries.
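
A minimal usage sketch for the Horseshoe prior. The scale s and the evaluation points are arbitrary; the horseshoe density has no simple closed form, so no formula is reproduced here:

    import numpy as np

    from syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.distribution import (
        Horseshoe,
    )

    # Hypothetical values: a horseshoe prior with scale s, evaluated on positive entries.
    prior = Horseshoe(s=0.1)
    x = np.array([0.05, 0.2, 1.0])
    print(prior.negative_log_density(x))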