syne_tune.optimizer.schedulers.searchers.utils package

class syne_tune.optimizer.schedulers.searchers.utils.HyperparameterRanges(config_space, name_last_pos=None, value_for_last_pos=None, active_config_space=None, prefix_keys=None)[source]

Bases: object

Wraps a configuration space and provides services around the encoding of hyperparameters (mapping configurations to [0, 1] vectors and vice versa). A usage sketch is given after the parameter list below.

If name_last_pos is given, the hyperparameter of that name is assigned the final position in the vector returned by to_ndarray(). This can be used to single out the (time) resource for a GP model, where that component has to come last.

If, in addition to name_last_pos, value_for_last_pos is also given, some methods behave differently:

  • random_config() samples a config as normal, but then overwrites the name_last_pos component with value_for_last_pos

  • get_ndarray_bounds() works as normal, but returns the bound (a, a) for the name_last_pos component, where a is the internal value corresponding to value_for_last_pos

The use case is HPO with a resource attribute. This attribute should be fixed when optimizing the acquisition function, but can take different values in the evaluation data (coming from all previous searches).

If active_config_space is given, it contains a subset of the non-constant hyperparameters in config_space, and the range of each entry is a subset of the range of the corresponding config_space entry. These active ranges affect the choice of new configs (by sampling). While the internal encoding is based on the original ranges, search (e.g., optimization of the surrogate model) is restricted to the active ranges. This option is required to implement transfer tuning, where the ranges to be searched over may be narrower than the ranges needed to represent data from past tuning jobs.

Parameters:
  • config_space (Dict[str, Any]) – Configuration space. Constant hyperparameters are filtered out here

  • name_last_pos (Optional[str]) – See above, optional

  • value_for_last_pos – See above, optional

  • active_config_space (Optional[dict]) – See above, optional

  • prefix_keys (Optional[List[str]]) – If given, these keys into config_space come first in the internal ordering, which determines the internal encoding. Optional
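
A minimal usage sketch (not part of the reference above; the configuration space, hyperparameter names, and values are illustrative): create a HyperparameterRanges via make_hyperparameter_ranges() (documented below), then map a configuration to its [0, 1] encoding and back.

    from syne_tune.config_space import choice, loguniform, randint
    from syne_tune.optimizer.schedulers.searchers.utils import (
        make_hyperparameter_ranges,
    )

    config_space = {
        "lr": loguniform(1e-5, 1e-1),          # log-scaled float
        "batch_size": randint(16, 256),        # integer
        "optimizer": choice(["sgd", "adam"]),  # categorical
        "momentum": 0.9,                       # constant: filtered out of the encoding
    }
    hp_ranges = make_hyperparameter_ranges(config_space)

    config = {"lr": 1e-3, "batch_size": 64, "optimizer": "adam", "momentum": 0.9}
    enc = hp_ranges.to_ndarray(config)   # entries in [0, 1], size hp_ranges.ndarray_size
    print(hp_ranges.from_ndarray(enc))   # decodes back, up to rounding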

property internal_keys: List[str]
property config_space_for_sampling: Dict[str, Any]
to_ndarray(config)[source]

Map configuration to [0, 1] encoded vector

Parameters:

config (Dict[str, Union[int, float, str]]) – Configuration to encode

Return type:

ndarray

Returns:

Encoded vector

to_ndarray_matrix(configs)[source]

Map configurations to [0, 1] encoded matrix

Parameters:

configs (Iterable[Dict[str, Union[int, float, str]]]) – Configurations to encode

Return type:

ndarray

Returns:

Matrix of encoded vectors (rows)
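
Sketch (reusing hp_ranges and config from the sketch above): several configurations can be encoded at once into a matrix.

    configs = [
        config,
        {"lr": 5e-4, "batch_size": 128, "optimizer": "sgd", "momentum": 0.9},
    ]
    X = hp_ranges.to_ndarray_matrix(configs)
    # X has shape (len(configs), hp_ranges.ndarray_size); row i encodes configs[i]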

property ndarray_size: int
Returns:

Dimensionality of encoded vector returned by to_ndarray

from_ndarray(enc_config)[source]

Maps encoded vector back to configuration (can involve rounding)

The encoded vector enc_config need not be in the image of to_ndarray(). In fact, any [0, 1]-valued vector of dimensionality ndarray_size is allowed.

Parameters:

enc_config (ndarray) – Encoded vector

Return type:

Dict[str, Union[int, float, str]]

Returns:

Configuration corresponding to encoded vector
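
Sketch (reusing hp_ranges from the first sketch above): since any [0, 1]-valued vector of dimensionality ndarray_size is accepted, a random vector decodes to a valid configuration, with rounding applied to integer and categorical components.

    import numpy as np

    enc = np.random.rand(hp_ranges.ndarray_size)
    print(hp_ranges.from_ndarray(enc))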

property encoded_ranges: Dict[str, Tuple[int, int]]

Encoded ranges are [0, 1] or closed subintervals thereof, in case active_config_space is used.

Returns:

Ranges of hyperparameters in the encoded ndarray representation

is_attribute_fixed()[source]
Returns:

Is the attribute at the last position fixed to value_for_last_pos?

random_config(random_state)[source]

Draws a random configuration

Parameters:

random_state (RandomState) – Random state

Return type:

Dict[str, Union[int, float, str]]

Returns:

Random configuration

random_configs(random_state, num_configs)[source]

Draws random configurations

Parameters:
  • random_state – Random state

  • num_configs (int) – Number of configurations to sample

Return type:

List[Dict[str, Union[int, float, str]]]

Returns:

Random configurations
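
Sketch (reusing hp_ranges from the first sketch above): both methods take a numpy RandomState, which makes sampling reproducible.

    import numpy as np

    random_state = np.random.RandomState(31415)
    one = hp_ranges.random_config(random_state)                    # single config dict
    many = hp_ranges.random_configs(random_state, num_configs=10)  # list of 10 config dicts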

get_ndarray_bounds()[source]
Return type:

List[Tuple[float, float]]

Returns:

List of (lower, upper) bounds for each dimension in encoded vector representation.
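
Sketch of the fixed last-position use case described in the class docstring (the hyperparameter name "epoch" and its range are illustrative): if value_for_last_pos is given, the bound for the final component degenerates to (a, a), so an optimizer of the acquisition function keeps the resource attribute fixed.

    from syne_tune.config_space import loguniform, randint
    from syne_tune.optimizer.schedulers.searchers.utils import (
        make_hyperparameter_ranges,
    )

    config_space = {"lr": loguniform(1e-5, 1e-1), "epoch": randint(1, 81)}
    hp_ranges_fixed = make_hyperparameter_ranges(
        config_space, name_last_pos="epoch", value_for_last_pos=81
    )
    bounds = hp_ranges_fixed.get_ndarray_bounds()
    print(bounds[-1])   # degenerate interval (a, a) for the "epoch" component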

filter_for_last_pos_value(configs)[source]

If is_attribute_fixed() returns True, configs is filtered by removing entries whose name_last_pos attribute value differs from value_for_last_pos. Otherwise, it is returned unchanged.

Parameters:

configs (List[Dict[str, Union[int, float, str]]]) – List of configs to be filtered

Return type:

List[Dict[str, Union[int, float, str]]]

Returns:

Filtered list of configs

config_to_tuple(config, keys=None, skip_last=False)[source]
Parameters:
  • config (Dict[str, Union[int, float, str]]) – Configuration

  • keys (Optional[List[str]]) – Overrides _internal_keys

  • skip_last (bool) – If True and name_last_pos is used, the corresponding attribute is skipped, so that config and tuple are non-extended

Return type:

Tuple[Union[str, int, float], ...]

Returns:

Tuple representation

tuple_to_config(config_tpl, keys=None, skip_last=False)[source]

Reverse of config_to_tuple().

Parameters:
  • config_tpl (Tuple[Union[str, int, float], ...]) – Tuple representation

  • keys (Optional[List[str]]) – Overrides _internal_keys

  • skip_last (bool) – If True and name_last_pos is used, the corresponding attribute is skipped, so that config and tuple are non-extended

Return type:

Dict[str, Union[int, float, str]]

Returns:

Configuration corresponding to config_tpl
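
Sketch (reusing hp_ranges and config from the first sketch above): the tuple follows the internal key ordering, with constant hyperparameters filtered out.

    tpl = hp_ranges.config_to_tuple(config)
    restored = hp_ranges.tuple_to_config(tpl)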

config_to_match_string(config, keys=None, skip_last=False)[source]

Maps a configuration to a match string, used to compare configurations for approximate equality. Two configurations are considered different if their match strings differ.

Parameters:
  • config (Dict[str, Union[int, float, str]]) – Configuration

  • keys (Optional[List[str]]) – Overrides _internal_keys

  • skip_last (bool) – If True and name_last_pos is used, the corresponding attribute is skipped, so that config and match string are non-extended

Return type:

str

Returns:

Match string
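
Sketch (reusing hp_ranges from the first sketch above): match strings can serve to deduplicate configurations. Whether two nearby configurations collapse to the same match string depends on the internal rounding, so this is approximate equality only.

    c1 = {"lr": 1e-3, "batch_size": 64, "optimizer": "adam"}
    c2 = {"lr": 1.0000001e-3, "batch_size": 64, "optimizer": "adam"}
    seen = {hp_ranges.config_to_match_string(c1)}
    is_duplicate = hp_ranges.config_to_match_string(c2) in seen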

syne_tune.optimizer.schedulers.searchers.utils.make_hyperparameter_ranges(config_space, name_last_pos=None, value_for_last_pos=None, active_config_space=None, prefix_keys=None)[source]

Default method to create HyperparameterRanges from config_space

Parameters:

Same as for the HyperparameterRanges constructor above

Return type:

HyperparameterRanges

Returns:

New object

class syne_tune.optimizer.schedulers.searchers.utils.HyperparameterRangesImpl(config_space, name_last_pos=None, value_for_last_pos=None, active_config_space=None, prefix_keys=None)[source]

Bases: HyperparameterRanges

Basic implementation of HyperparameterRanges.

Parameters:

Same as for the HyperparameterRanges constructor above

property ndarray_size: int
Returns:

Dimensionality of encoded vector returned by to_ndarray

to_ndarray(config)[source]

Map configuration to [0, 1] encoded vector

Parameters:

config (Dict[str, Union[int, float, str]]) – Configuration to encode

Return type:

ndarray

Returns:

Encoded vector

from_ndarray(enc_config)[source]

Maps encoded vector back to configuration (can involve rounding)

The encoded vector enc_config need not be in the image of to_ndarray(). In fact, any [0, 1]-valued vector of dimensionality ndarray_size is allowed.

Parameters:

enc_config (ndarray) – Encoded vector

Return type:

Dict[str, Union[int, float, str]]

Returns:

Configuration corresponding to encoded vector

property encoded_ranges: Dict[str, Tuple[int, int]]

Encoded ranges are [0, 1] or closed subintervals thereof, in case active_config_space is used.

Returns:

Ranges of hyperparameters in the encoded ndarray representation

get_ndarray_bounds()[source]
Return type:

List[Tuple[float, float]]

Returns:

List of (lower, upper) bounds for each dimension in encoded vector representation.

class syne_tune.optimizer.schedulers.searchers.utils.LinearScaling[source]

Bases: Scaling

to_internal(value)[source]
Return type:

float

from_internal(value)[source]
Return type:

float

class syne_tune.optimizer.schedulers.searchers.utils.LogScaling[source]

Bases: Scaling

to_internal(value)[source]
Return type:

float

from_internal(value)[source]
Return type:

float

class syne_tune.optimizer.schedulers.searchers.utils.ReverseLogScaling[source]

Bases: Scaling

to_internal(value)[source]
Return type:

float

from_internal(value)[source]
Return type:

float

syne_tune.optimizer.schedulers.searchers.utils.get_scaling(hp_range)[source]
Return type:

Scaling
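
Sketch (the dispatch rule is an assumption: log-scaled domains such as loguniform are expected to map to LogScaling, plain uniform domains to LinearScaling): a Scaling converts between native and internal values, and get_scaling() picks the scaling matching a given domain.

    from syne_tune.config_space import loguniform, uniform
    from syne_tune.optimizer.schedulers.searchers.utils import get_scaling

    log_scaling = get_scaling(loguniform(1e-5, 1e-1))
    lin_scaling = get_scaling(uniform(0.0, 1.0))
    x = 1e-3
    y = log_scaling.to_internal(x)       # internal representation of x
    print(log_scaling.from_internal(y))  # back to (approximately) 1e-3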

Submodules