syne_tune.optimizer.schedulers.searchers.regularized_evolution module
- class syne_tune.optimizer.schedulers.searchers.regularized_evolution.PopulationElement(score=0, config=None, results=None)[source]
Bases: object
- score: float = 0
- config: Dict[str, Any] = None
- results: List[float] = None
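A minimal sketch of what this container looks like, assuming it is a plain dataclass with the field defaults shown above (the stand-in below mirrors the documented fields; it is not the syne_tune definition itself):

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Optional

# Illustrative stand-in mirroring PopulationElement's documented fields:
# an evaluated configuration together with its score and per-step results.
@dataclass
class PopulationElement:
    score: float = 0
    config: Optional[Dict[str, Any]] = None
    results: Optional[List[float]] = None

# A population member holding one evaluated configuration.
elem = PopulationElement(score=0.85, config={"lr": 0.01, "batch_size": 64})
```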
- syne_tune.optimizer.schedulers.searchers.regularized_evolution.mutate_config(config, config_space, rng, num_try=1000)[source]
- Return type: Dict[str, Any]
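The mutation step can be sketched as resampling a single hyperparameter of the parent configuration. This is an illustrative sketch, not syne_tune's implementation: the config space is simplified here to a dict of candidate-value lists, and the `num_try` retry logic of the real function is omitted:

```python
import random
from typing import Any, Dict, List

def mutate_config(
    config: Dict[str, Any],
    config_space: Dict[str, List[Any]],
    rng: random.Random,
) -> Dict[str, Any]:
    """Toy mutation: pick one hyperparameter and resample it uniformly.

    Sketch only; ``config_space`` is assumed to map names to finite
    candidate lists, unlike syne_tune's domain objects.
    """
    child = dict(config)
    name = rng.choice(sorted(config_space))
    child[name] = rng.choice(config_space[name])
    return child

rng = random.Random(0)
space = {"lr": [0.001, 0.01, 0.1], "batch_size": [32, 64, 128]}
parent = {"lr": 0.01, "batch_size": 64}
child = mutate_config(parent, space, rng)
```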
- syne_tune.optimizer.schedulers.searchers.regularized_evolution.sample_random_config(config_space, rng)[source]
- Return type: Dict[str, Any]
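Sampling a random configuration amounts to drawing each hyperparameter independently from its domain. A hedged sketch, again assuming a simplified config space of candidate-value lists:

```python
import random
from typing import Any, Dict, List

def sample_random_config(
    config_space: Dict[str, List[Any]], rng: random.Random
) -> Dict[str, Any]:
    # Draw every hyperparameter independently and uniformly at random
    # from its (here: finite, list-valued) domain.
    return {name: rng.choice(values) for name, values in config_space.items()}

rng = random.Random(42)
space = {"lr": [0.001, 0.01, 0.1], "batch_size": [32, 64, 128]}
config = sample_random_config(space, rng)
```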
- class syne_tune.optimizer.schedulers.searchers.regularized_evolution.RegularizedEvolution(config_space, points_to_evaluate=None, population_size=100, sample_size=10, random_seed=None)[source]
Bases: SingleObjectiveBaseSearcher
Implements the regularized evolution algorithm. The original implementation only considers categorical hyperparameters. For integer and float parameters, we sample a new value uniformly at random. Reference:
Real, E., Aggarwal, A., Huang, Y., and Le, Q. V. Regularized Evolution for Image Classifier Architecture Search. In Proceedings of the Conference on Artificial Intelligence (AAAI'19).
The code is based on the original regularized evolution open-source implementation: https://colab.research.google.com/github/google-research/google-research/blob/master/evolution/regularized_evolution_algorithm/regularized_evolution.ipynb
Additional arguments on top of parent class StochasticSearcher:
- Parameters:
population_size (int) – Size of the population, defaults to 100
sample_size (int) – Size of the candidate set from which the parent for a mutation is chosen, defaults to 10
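The loop this class implements can be sketched as follows: keep a population bounded at `population_size` (the oldest member is aged out on every insertion, which is the "regularization"), pick a parent by tournament over `sample_size` random members, and mutate it. This is a hedged, self-contained sketch under simplified assumptions (finite list-valued config space, toy objective, minimization), not the syne_tune implementation:

```python
import random
from collections import deque
from typing import Any, Callable, Dict, List

def evolve(
    objective: Callable[[Dict[str, Any]], float],
    config_space: Dict[str, List[Any]],
    population_size: int = 100,
    sample_size: int = 10,
    num_iterations: int = 200,
    seed: int = 0,
) -> Dict[str, Any]:
    """Regularized-evolution sketch: minimize ``objective`` over configs."""
    rng = random.Random(seed)
    # Bounded deque: appending when full evicts the OLDEST member,
    # which is the aging ("regularization") of regularized evolution.
    population: deque = deque(maxlen=population_size)

    def sample_config() -> Dict[str, Any]:
        return {k: rng.choice(v) for k, v in config_space.items()}

    def mutate(config: Dict[str, Any]) -> Dict[str, Any]:
        child = dict(config)
        key = rng.choice(sorted(config_space))
        child[key] = rng.choice(config_space[key])
        return child

    # Seed the population with random configurations.
    for _ in range(population_size):
        config = sample_config()
        population.append((objective(config), config))

    best_score, best_config = min(population, key=lambda pair: pair[0])
    for _ in range(num_iterations):
        # Tournament selection: best of a random sample becomes the parent.
        candidates = rng.sample(list(population), sample_size)
        _, parent = min(candidates, key=lambda pair: pair[0])
        child = mutate(parent)
        score = objective(child)
        population.append((score, child))  # evicts the oldest member
        if score < best_score:
            best_score, best_config = score, child
    return best_config

# Toy minimization problem (an assumption for illustration):
# prefer lr near 0.01 and batch_size near 64.
space = {"lr": [0.001, 0.01, 0.1, 1.0], "batch_size": [16, 32, 64, 128]}
best = evolve(
    lambda c: abs(c["lr"] - 0.01) + abs(c["batch_size"] - 64) / 64, space
)
```

Note the design choice: selection pressure comes only from the tournament, while removal is purely by age, so even strong members eventually die out, which keeps the search from collapsing onto one lineage.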
- suggest(**kwargs)[source]
Suggest a new configuration.
Note: Query _next_points_to_evaluate() for initial configs to return first.
- Parameters:
kwargs – Extra information may be passed from scheduler to searcher
- Return type: Optional[dict]
- Returns:
New configuration. The searcher may return None if a new configuration cannot be suggested. In this case, the tuning will stop. This happens if searchers never suggest the same config more than once, and all configs in the (finite) search space are exhausted.
- on_trial_complete(trial_id, config, metric)[source]
Inform searcher about result.
The scheduler passes every result. If update == True, the searcher should update its surrogate model (if any); otherwise result is an intermediate result not modelled. The default implementation calls _update() if update == True. It can be overwritten by searchers which also react to intermediate results.
- Parameters:
trial_id (int) – See on_trial_result()
config (Dict[str, Any]) – See on_trial_result()
metric (float) – See on_trial_result()