syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl module
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRange(name)[source]
Bases:
object
- property name: str
- syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.scale_from_zero_one(value, lower_bound, upper_bound, scaling, lower_internal, upper_internal)[source]
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeContinuous(name, lower_bound, upper_bound, scaling, active_lower_bound=None, active_upper_bound=None)[source]
Bases:
HyperparameterRange
Real valued hyperparameter. If active_lower_bound and/or active_upper_bound are given, the feasible interval for values of new configs is reduced, but data can still contain configs with values in [lower_bound, upper_bound], and internal encoding is done w.r.t. this original range.
- Parameters:
  - name (str) – Name of hyperparameter
  - lower_bound (float) – Lower bound (included)
  - upper_bound (float) – Upper bound (included)
  - scaling (Scaling) – Determines internal representation, whereby parameter = scaling(internal)
  - active_lower_bound (Optional[float]) – See above
  - active_upper_bound (Optional[float]) – See above
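The relation parameter = scaling(internal) can be made concrete with a small sketch. The following is an illustrative plain-Python version of the encode/decode round trip for a continuous hyperparameter, covering linear and log scaling only; it is not the syne_tune implementation, and the function names are hypothetical.

```python
import math

# Illustrative sketch (not the syne_tune code): a continuous hyperparameter
# is mapped into [0, 1] via its scaling function, so that
# parameter = scaling(internal) holds over [lower, upper].

def to_zero_one(value: float, lower: float, upper: float, log: bool = False) -> float:
    """Map a hyperparameter value to its [0, 1] internal encoding."""
    f = math.log if log else (lambda x: x)
    lo, hi = f(lower), f(upper)
    return (f(value) - lo) / (hi - lo)

def from_zero_one(internal: float, lower: float, upper: float, log: bool = False) -> float:
    """Inverse map: [0, 1] internal encoding back to a hyperparameter value."""
    if log:
        lo, hi = math.log(lower), math.log(upper)
        return math.exp(lo + internal * (hi - lo))
    return lower + internal * (upper - lower)

# A learning rate in [1e-5, 1e-1] with log scaling: 1e-3 sits at the midpoint
# of the encoded range, which is why log scaling is natural here.
enc = to_zero_one(1e-3, 1e-5, 1e-1, log=True)   # 0.5
val = from_zero_one(enc, 1e-5, 1e-1, log=True)  # back to ~1e-3
```

With an active range such as active_lower_bound=1e-4, only the encoding of new values would be restricted; the [0, 1] map above would still be defined w.r.t. the full [lower, upper] interval.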
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeInteger(name, lower_bound, upper_bound, scaling, active_lower_bound=None, active_upper_bound=None)[source]
Bases:
HyperparameterRange
Integer valued hyperparameter. Both bounds are included in the valid values. Under the hood generates a continuous range from lower_bound - 0.5 to upper_bound + 0.5. See docs for continuous hyperparameter for more information.
- Parameters:
  - name (str) – Name of hyperparameter
  - lower_bound (int) – Lower bound (integer, included)
  - upper_bound (int) – Upper bound (integer, included)
  - scaling (Scaling) – Determines internal representation, whereby parameter = scaling(internal)
  - active_lower_bound (Optional[int]) – See above
  - active_upper_bound (Optional[int]) – See above
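The half-unit widening trick can be sketched in a few lines. This is an illustrative re-implementation of the idea (linear scaling only), not the library code, and the helper names are hypothetical.

```python
# Illustrative sketch: an integer range over [lower, upper] reuses the
# continuous machinery on the widened interval [lower - 0.5, upper + 0.5];
# decoding rounds back to the nearest integer. The widening gives every
# integer an equally wide slice of the encoded [0, 1] range.

def int_to_zero_one(value: int, lower: int, upper: int) -> float:
    lo, hi = lower - 0.5, upper + 0.5
    return (value - lo) / (hi - lo)

def int_from_zero_one(internal: float, lower: int, upper: int) -> int:
    lo, hi = lower - 0.5, upper + 0.5
    # Round, then clip to the valid bounds (internal may sit exactly on an edge)
    return min(upper, max(lower, round(lo + internal * (hi - lo))))

# batch_size in {1, ..., 8}: each integer owns a slice of width 1/8 in [0, 1]
assert int_from_zero_one(int_to_zero_one(3, 1, 8), 1, 8) == 3
```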
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeFiniteRange(name, lower_bound, upper_bound, size, scaling, cast_int=False)[source]
Bases:
HyperparameterRange
Finite range numerical hyperparameter, see FiniteRange. Internally, we use an int with linear scaling.
Note: Different to HyperparameterRangeContinuous, we require that lower_bound < upper_bound and size >= 2.
- Parameters:
  - name (str) – Name of hyperparameter
  - lower_bound (float) – Lower bound (included)
  - upper_bound (float) – Upper bound (included)
  - size (int) – Number of values in range
  - scaling (Scaling) – Determines internal representation, whereby parameter = scaling(internal)
  - cast_int (bool) – If True, values are cast to int
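Under the stated constraints (lower_bound < upper_bound, size >= 2) and assuming linear scaling, the value grid of such a range can be sketched as follows. This is illustrative only, not the FiniteRange implementation, and the helper name is hypothetical.

```python
# Illustrative sketch: a finite range with `size` values behaves like an
# integer index in {0, ..., size - 1}; index i maps to one of `size`
# equally spaced values between lower_bound and upper_bound (linear scaling).

def finite_range_values(lower: float, upper: float, size: int, cast_int: bool = False):
    """Hypothetical helper: enumerate the `size` values of the range."""
    assert size >= 2 and lower < upper
    step = (upper - lower) / (size - 1)
    values = [lower + i * step for i in range(size)]
    # With cast_int=True, values are cast to int, mirroring the constructor flag
    return [int(round(v)) for v in values] if cast_int else values

# A grid of 10 values between 0.1 and 1.0, spaced by 0.1
grid = finite_range_values(0.1, 1.0, 10)
```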
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeCategorical(name, choices)[source]
Bases:
HyperparameterRange
Base class for categorical hyperparameters.
- Parameters:
  - name (str) – Name of hyperparameter
  - choices (Tuple[Any, ...]) – Values parameter can take
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeCategoricalNonBinary(name, choices, active_choices=None)[source]
Bases:
HyperparameterRangeCategorical
Can take on a discrete set of values. We use one-hot encoding internally. If the value range has size 2, it is more efficient to use HyperparameterRangeCategoricalBinary.
- Parameters:
  - name (str) – Name of hyperparameter
  - choices (Tuple[Any, ...]) – Values parameter can take
  - active_choices (Optional[Tuple[Any, ...]]) – If given, must be a nonempty subset of choices
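One-hot encoding for a categorical with k choices uses k dimensions of the encoded vector. A minimal plain-Python sketch of the idea (not the library code; function names are hypothetical):

```python
# Illustrative sketch: each choice gets its own coordinate; decoding picks
# the choice whose coordinate is largest (argmax), which also tolerates
# arbitrary [0, 1] valued vectors that are not exact one-hot encodings.

def cat_to_onehot(value, choices):
    return [1.0 if c == value else 0.0 for c in choices]

def cat_from_onehot(enc, choices):
    return choices[max(range(len(choices)), key=lambda i: enc[i])]

choices = ("adam", "sgd", "rmsprop")
assert cat_to_onehot("sgd", choices) == [0.0, 1.0, 0.0]
assert cat_from_onehot([0.2, 0.1, 0.7], choices) == "rmsprop"
```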
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeCategoricalBinary(name, choices, active_choices=None)[source]
Bases:
HyperparameterRangeCategorical
Here, the value range must be of size 2. The internal encoding is a single int, so 1 instead of 2 dimensions.
- Parameters:
  - name (str) – Name of hyperparameter
  - choices (Tuple[Any, ...]) – Values parameter can take (must be size 2)
  - active_choices (Optional[Tuple[Any, ...]]) – If given, must be a nonempty subset of choices
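The saving over one-hot encoding is easy to see in a sketch: with exactly two choices, a single scalar suffices. Illustrative only, not the library code; helper names are hypothetical.

```python
# Illustrative sketch: 0.0 encodes the first choice, 1.0 the second, so one
# dimension replaces the two a one-hot encoding would need. Decoding
# thresholds at 0.5, which handles arbitrary [0, 1] values as well.

def bin_to_scalar(value, choices):
    assert len(choices) == 2
    return float(choices.index(value))

def bin_from_scalar(enc, choices):
    return choices[1] if enc >= 0.5 else choices[0]

assert bin_to_scalar(True, (False, True)) == 1.0
assert bin_from_scalar(0.3, (False, True)) is False
```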
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeOrdinalEqual(name, choices, active_choices=None)[source]
Bases:
HyperparameterRangeCategorical
Ordinal hyperparameter, equal distance encoding. See also Ordinal.
- Parameters:
  - name (str) – Name of hyperparameter
  - choices (Tuple[Any, ...]) – Values parameter can take
  - active_choices (Optional[Tuple[Any, ...]]) – If given, must be a nonempty contiguous subsequence of choices
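Equal distance encoding treats the k ordered choices like an integer index in a single dimension, so order information is kept while avoiding the k dimensions of one-hot. A plain-Python sketch of the idea (illustrative, not the library code; names hypothetical):

```python
# Illustrative sketch: choice i is encoded at the centre of the i-th of k
# equal slices of [0, 1]; decoding maps any [0, 1] value back to the slice
# it falls into, so neighbouring encodings decode to neighbouring choices.

def ord_to_zero_one(value, choices):
    k = len(choices)
    return (choices.index(value) + 0.5) / k

def ord_from_zero_one(internal, choices):
    k = len(choices)
    idx = min(k - 1, int(internal * k))
    return choices[idx]

sizes = ("small", "medium", "large")
assert ord_from_zero_one(ord_to_zero_one("medium", sizes), sizes) == "medium"
```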
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangeOrdinalNearestNeighbor(name, choices, log_scale=False, active_choices=None)[source]
Bases:
HyperparameterRangeCategorical
Ordinal hyperparameter, nearest neighbour encoding. See also OrdinalNearestNeighbor.
- Parameters:
  - name (str) – Name of hyperparameter
  - choices (Tuple[Any, ...]) – Values parameter can take (numerical values, strictly increasing, size >= 2)
  - log_scale (bool) – If True, nearest neighbour is done in log space (choices must be positive)
  - active_choices (Optional[Tuple[Any, ...]]) – If given, must be a nonempty contiguous subsequence of choices
- property log_scale: bool
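The nearest neighbour idea can be sketched in plain Python: decoding snaps an arbitrary value to the closest of the numerical choices, with distance optionally measured in log space. Illustrative only, not the library code; the helper name is hypothetical.

```python
import math

# Illustrative sketch: snap a value to the nearest choice. With
# log_scale=True, distances are measured between logarithms, so the choice
# of scale can change which neighbour wins.

def snap_to_choice(value, choices, log_scale=False):
    f = math.log if log_scale else (lambda x: x)
    return min(choices, key=lambda c: abs(f(c) - f(value)))

# Batch sizes 16..256: 96 is equidistant from 64 and 128 on the linear
# scale (min keeps the first), but closer to 128 in log space.
assert snap_to_choice(96.0, (16, 32, 64, 128, 256)) == 64
assert snap_to_choice(96.0, (16, 32, 64, 128, 256), log_scale=True) == 128
```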
- class syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.HyperparameterRangesImpl(config_space, name_last_pos=None, value_for_last_pos=None, active_config_space=None, prefix_keys=None)[source]
Bases:
HyperparameterRanges
Basic implementation of HyperparameterRanges.
- Parameters:
  - config_space (Dict[str, Any]) – Configuration space
  - name_last_pos (Optional[str]) – See HyperparameterRanges, optional
  - value_for_last_pos – See HyperparameterRanges, optional
  - active_config_space (Optional[Dict[str, Any]]) – See HyperparameterRanges, optional
  - prefix_keys (Optional[List[str]]) – See HyperparameterRanges, optional
- property ndarray_size: int
- Returns:
Dimensionality of encoded vector returned by to_ndarray
- to_ndarray(config)[source]
Map configuration to [0, 1] encoded vector.
- Parameters:
  - config (Dict[str, Union[int, float, str]]) – Configuration to encode
- Return type:
  ndarray
- Returns:
  Encoded vector
- from_ndarray(enc_config)[source]
Maps encoded vector back to configuration (can involve rounding). The encoded vector enc_config need not be in the image of to_ndarray. In fact, any [0, 1] valued vector of dimensionality ndarray_size is allowed.
- Parameters:
  - enc_config (ndarray) – Encoded vector
- Return type:
  Dict[str, Union[int, float, str]]
- Returns:
  Configuration corresponding to encoded vector
- property encoded_ranges: Dict[str, Tuple[int, int]]
Encoded ranges are [0, 1] or closed subintervals thereof, in case active_config_space is used.
- Returns:
  Ranges of hyperparameters in the encoded ndarray representation
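How the per-hyperparameter encodings assemble into one flat vector can be sketched with a hypothetical toy space (a log-scaled learning rate plus a three-way categorical). This is illustrative plain Python, not the HyperparameterRangesImpl code; the names and the space are made up.

```python
import math

# Illustrative sketch: each hyperparameter contributes a contiguous slice of
# the encoded vector, and an encoded_ranges-style dict records the
# (start, end) columns of each slice.

def encode_config(config):
    # Hypothetical toy space: lr is continuous with log scaling in
    # [1e-4, 1e-1]; optimizer is categorical with 3 choices (one-hot).
    lo, hi = math.log(1e-4), math.log(1e-1)
    lr_enc = (math.log(config["lr"]) - lo) / (hi - lo)
    choices = ("adam", "sgd", "rmsprop")
    onehot = [1.0 if c == config["optimizer"] else 0.0 for c in choices]
    return [lr_enc] + onehot

encoded_ranges = {"lr": (0, 1), "optimizer": (1, 4)}  # slice per parameter
vec = encode_config({"lr": 1e-2, "optimizer": "sgd"})
# vec has length 4 (the ndarray_size): 1 dim for lr + 3 dims for the one-hot
```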
- syne_tune.optimizer.schedulers.searchers.utils.hp_ranges_impl.decode_extended_features(features_ext, resource_attr_range)[source]
Given a matrix of features from extended configs, corresponding to ExtendedConfiguration, split into feature matrix from normal configs and resource values.
- Parameters:
  - features_ext (ndarray) – Matrix of features from extended configs
  - resource_attr_range (Tuple[int, int]) – (r_min, r_max)
- Return type:
  (ndarray, ndarray)
- Returns:
  (features, resources)
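A NumPy sketch can make the split concrete. The layout here is an assumption, not taken from the source: suppose the resource value is appended as the last, [0, 1]-normalized column of the extended feature matrix (the library's exact normalization may differ), and the function name is hypothetical.

```python
import numpy as np

# Illustrative sketch: separate the assumed resource column from the plain
# config features, and map it back to the integer range (r_min, r_max).

def split_extended(features_ext: np.ndarray, resource_attr_range):
    r_min, r_max = resource_attr_range
    features = features_ext[:, :-1]   # plain config features
    res_enc = features_ext[:, -1]     # encoded resource column (assumed last)
    resources = np.rint(r_min + res_enc * (r_max - r_min)).astype(int)
    return features, resources

X_ext = np.array([[0.2, 0.7, 0.0], [0.5, 0.1, 1.0]])
X, r = split_extended(X_ext, (1, 81))
# X keeps the first two columns; r recovers resources 1 and 81
```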