syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers module

class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.ConstantPositiveVector(param_name, encoding, size_cols, **kwargs)[source]

Bases: Block

Represents a constant vector whose positive entry value is represented as a Gluon parameter, to be used in the context of the wrapper classes in gluon_blocks.py. Shape, dtype, and context are determined from the features argument:

  • If features.shape = (n, d):
      – shape = (d, 1) if size_cols = True (number of columns of features)
      – shape = (n, 1) if size_cols = False (number of rows of features)

  • dtype = features.dtype, ctx = features.ctx

Encoding and internal Gluon parameter: The positive scalar parameter is encoded via encoding (see ScalarEncodingBase). The internal Gluon parameter (before encoding) has the name param_name + "_internal".

forward(features, param_internal)[source]

Returns constant positive vector

If features.shape = (n, d), the shape of the vector returned is (d, 1) if size_cols = True, (n, 1) otherwise.

Parameters:
  • features – Matrix for shape, dtype, ctx

  • param_internal – Unwrapped parameter

Returns:

Constant positive vector
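A minimal NumPy sketch of the shape logic described above. This is a hypothetical stand-in, not the Syne Tune implementation: the helper name and the `encode` argument (here defaulting to `exp`, mimicking a log encoding) are illustrative assumptions.

```python
import numpy as np

def constant_positive_vector(features, param_internal, size_cols, encode=np.exp):
    # Hypothetical sketch of ConstantPositiveVector.forward's shape logic.
    # `encode` maps the internal parameter to a positive value.
    n, d = features.shape
    rows = d if size_cols else n
    value = encode(param_internal)  # positive scalar
    return np.full((rows, 1), value, dtype=features.dtype)

features = np.zeros((5, 3))
vec_cols = constant_positive_vector(features, 0.0, size_cols=True)   # shape (3, 1)
vec_rows = constant_positive_vector(features, 0.0, size_cols=False)  # shape (5, 1)
```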

set(val)[source]
get()[source]
get_box_constraints_internal()[source]
log_parameters()[source]
get_parameters()[source]
switch_updating(flag)[source]

Is the underlying parameter updated during learning?

By default, the parameter takes part in learning (its grad_req attribute is ‘write’). For flag == False, the attribute is flipped to ‘null’, and the parameter remains constant during learning.

Parameters:

flag – Update parameter during learning?

has_regularizer()[source]
eval_regularizer(features)[source]
class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.PositiveScalarEncoding(lower, constr_upper=None, init_val=None, regularizer=None, dimension=1)[source]

Bases: ScalarEncodingBase

Provides an encoding for a positive scalar or vector: param > lower. Here, param is represented as a gluon.Parameter with shape (dimension,), where dimension is 1 by default.

The encoding is given as:

param = softrelu(param_internal) + lower, softrelu(x) = log(1 + exp(x))

If constr_upper is used, the constraint

param_internal < dec(constr_upper)

can be enforced by an optimizer. Since the decoding map dec is increasing, this translates to param < constr_upper. Note: While lower is enforced by the encoding, the upper bound is not; it has to be enforced by an optimizer.

get(param_internal)[source]
decode(val, name)[source]
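The softrelu encoding and its inverse can be sketched in plain NumPy. This is a standalone illustration of the formula above, not Syne Tune's code; the function names are assumptions, and `np.logaddexp`/`np.expm1` are used for numerical stability.

```python
import numpy as np

def softrelu(x):
    # softrelu(x) = log(1 + exp(x)), computed stably as logaddexp(0, x)
    return np.logaddexp(0.0, x)

def encode(param_internal, lower):
    # param = softrelu(param_internal) + lower, hence param > lower always
    return softrelu(param_internal) + lower

def decode(param, lower):
    # Inverse of the encoding: param_internal = log(exp(param - lower) - 1)
    return np.log(np.expm1(np.asarray(param) - lower))

lower = 1e-3
internal = np.array([-5.0, 0.0, 5.0])
param = encode(internal, lower)        # every entry strictly above `lower`
roundtrip = decode(param, lower)       # recovers `internal`
```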
class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.IdentityScalarEncoding(constr_lower=None, constr_upper=None, init_val=None, regularizer=None, dimension=1)[source]

Bases: ScalarEncodingBase

Identity encoding for scalar and vector:

param = param_internal

This does not ensure that param is positive! Use this only if positivity is otherwise guaranteed.

get(param_internal)[source]
decode(val, name)[source]
class syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.LogarithmScalarEncoding(constr_lower=None, constr_upper=None, init_val=None, regularizer=None, dimension=1)[source]

Bases: ScalarEncodingBase

Logarithmic encoding for scalar and vector:

param = exp(param_internal), param_internal = log(param)

get(param_internal)[source]
decode(val, name)[source]
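The logarithmic encoding is a simple exp/log pair; a self-contained NumPy sketch (illustrative names, not the library's code):

```python
import numpy as np

def log_encoding_get(param_internal):
    # param = exp(param_internal): always strictly positive
    return np.exp(param_internal)

def log_encoding_decode(param):
    # param_internal = log(param): inverse mapping
    return np.log(param)

internal = np.array([-2.0, 0.0, 3.0])
param = log_encoding_get(internal)     # positive values
recovered = log_encoding_decode(param) # equals `internal`
```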
syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.unwrap_parameter(param_internal, some_arg=None)[source]
syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.encode_unwrap_parameter(param_internal, encoding, some_arg=None)[source]
syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.param_to_pretty_string(gluon_param, encoding)[source]

Take a gluon parameter and transform it into a string amenable to plotting. If need be, the gluon parameter is appropriately encoded (e.g., log-exp transform).

Parameters:
  • gluon_param (Parameter) – gluon parameter

  • encoding (ScalarEncodingBase) – object in charge of encoding/decoding the gluon_param

Return type:

str
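A hedged sketch of the decode-then-format behavior described above. Everything here is an assumption for illustration: the helper name, the default decode (exp, as for a log encoding), and the output format are not taken from Syne Tune.

```python
import numpy as np

def param_to_pretty_string_sketch(name, internal_values, decode=np.exp):
    # Hypothetical helper: decode the internal representation (assumed
    # log-encoded, so decode = exp) and render a short string for logging.
    decoded = np.atleast_1d(decode(np.asarray(internal_values)))
    vals = ", ".join(f"{v:.4f}" for v in decoded)
    return f"{name}: [{vals}]"

line = param_to_pretty_string_sketch("noise_variance", [0.0])
print(line)
```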

syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.register_parameter(params, name, encoding, shape=(1, ), dtype=<class 'numpy.float64'>)[source]
syne_tune.optimizer.schedulers.searchers.bayesopt.gpautograd.gluon_blocks_helpers.create_encoding(encoding_name, init_val, constr_lower, constr_upper, dimension, prior)[source]