syne_tune.optimizer.schedulers.searchers.regularized_evolution module

class syne_tune.optimizer.schedulers.searchers.regularized_evolution.PopulationElement(result=None, score=0, config=None)[source]

Bases: object

result: Dict[str, Any] = None
score: int = 0
config: Dict[str, Any] = None
class syne_tune.optimizer.schedulers.searchers.regularized_evolution.RegularizedEvolution(config_space, metric, points_to_evaluate=None, population_size=100, sample_size=10, **kwargs)[source]

Bases: StochasticSearcher

Implements the regularized evolution algorithm. The original implementation considers categorical hyperparameters only; for integer and float parameters, mutation samples a new value uniformly at random. Reference:

Real, E., Aggarwal, A., Huang, Y., and Le, Q. V.
Regularized Evolution for Image Classifier Architecture Search.
In Proceedings of the Conference on Artificial Intelligence (AAAI’19)

The code is based on the original regularized evolution open-source implementation: https://colab.research.google.com/github/google-research/google-research/blob/master/evolution/regularized_evolution_algorithm/regularized_evolution.ipynb

Additional arguments on top of parent class StochasticSearcher:

Parameters:
  • mode – Mode for the given metric, either “min” or “max”; defaults to “min”

  • population_size (int) – Size of the population, defaults to 100

  • sample_size (int) – Size of the candidate set to obtain a parent for the mutation, defaults to 10
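The evolution loop maintained by this searcher can be sketched in plain Python. This is a self-contained illustration, not the Syne Tune implementation; the names `mutate` and `evolve`, and the representation of numeric domains as `(low, high)` tuples, are assumptions of this sketch:

```python
import random
from collections import deque


def mutate(config, config_space, rng):
    """Return a copy of config with one hyperparameter resampled.

    Categorical parameters (given as lists) pick a new category;
    numeric (low, high) ranges are resampled uniformly at random,
    mirroring the behavior described above for int/float parameters.
    """
    child = dict(config)
    name = rng.choice(sorted(config_space))
    domain = config_space[name]
    if isinstance(domain, list):  # categorical
        child[name] = rng.choice(domain)
    else:  # numeric (low, high) range
        low, high = domain
        if isinstance(low, int) and isinstance(high, int):
            child[name] = rng.randint(low, high)
        else:
            child[name] = rng.uniform(low, high)
    return child


def evolve(config_space, score_fn, num_iterations=200,
           population_size=100, sample_size=10, seed=0):
    """Regularized evolution: the *oldest* member dies, not the worst."""
    rng = random.Random(seed)

    def sample_config():
        return {
            k: (rng.choice(v) if isinstance(v, list) else rng.uniform(*v))
            for k, v in config_space.items()
        }

    population = deque(maxlen=population_size)  # append evicts the oldest
    best = None
    for _ in range(num_iterations):
        if len(population) < population_size:
            config = sample_config()  # warm-up: random initial population
        else:
            # Tournament selection: best of a random sample is the parent
            candidates = rng.sample(list(population), sample_size)
            parent = min(candidates, key=lambda entry: entry[1])  # "min" mode
            config = mutate(parent[0], config_space, rng)
        score = score_fn(config)
        population.append((config, score))
        if best is None or score < best[1]:
            best = (config, score)
    return best
```

For example, `evolve({"x": (-1.0, 1.0)}, lambda c: c["x"] ** 2)` drives `x` toward 0. The key design choice, removing the oldest member rather than the worst, is the “regularization”: it forces good traits to survive repeated re-evaluation instead of letting one lucky score persist forever.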

get_config(**kwargs)[source]

Suggest a new configuration.

Note: Query _next_initial_config() for initial configs to return first.

Parameters:

kwargs – Extra information may be passed from scheduler to searcher

Return type:

Optional[dict]

Returns:

New configuration. The searcher may return None if a new configuration cannot be suggested; in this case, the tuning stops. This happens if the searcher never suggests the same config more than once and all configs in a finite search space have been exhausted.
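The None-on-exhaustion contract can be illustrated with a minimal searcher over a finite categorical space. This is a standalone sketch; `ExhaustiveSearcher` is hypothetical and not part of Syne Tune:

```python
from itertools import product
from typing import Any, Dict, List, Optional


class ExhaustiveSearcher:
    """Suggests each config of a finite categorical space exactly once."""

    def __init__(self, config_space: Dict[str, List[Any]]):
        keys = sorted(config_space)
        # Enumerate the full cross-product of categorical values up front
        self._configs = [
            dict(zip(keys, values))
            for values in product(*(config_space[k] for k in keys))
        ]

    def get_config(self, **kwargs) -> Optional[Dict[str, Any]]:
        # Returning None once the finite space is exhausted signals
        # the tuner to stop, as described in the contract above.
        return self._configs.pop(0) if self._configs else None
```

A scheduler driving this searcher would call `get_config()` until it receives None, then terminate the tuning loop.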

configure_scheduler(scheduler)[source]

Some searchers need to obtain information from the scheduler they are used with, in order to configure themselves. This method has to be called before the searcher can be used.

Parameters:

scheduler (TrialScheduler) – Scheduler the searcher is used with.

clone_from_state(state)[source]

Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.

Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.

Parameters:

state (Dict[str, Any]) – See above

Returns:

New searcher object
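The get_state() / clone_from_state() protocol described above follows a common pattern: mutable, pickle-able state (population, RNG state) goes into the state dict, while immutable configuration is copied from self. A minimal sketch, assuming a searcher that holds a population list and a `random.Random` instance (the class and its fields are illustrative, not Syne Tune's actual attributes):

```python
import copy
import random
from typing import Any, Dict


class EvolutionSearcherSketch:
    """Illustrative searcher with a mutable population and RNG."""

    def __init__(self, config_space: Dict[str, Any], population_size: int = 100):
        self.config_space = config_space        # immutable part, kept on self
        self.population_size = population_size  # immutable part, kept on self
        self.population = []                    # mutable state
        self.random_state = random.Random(0)    # mutable state

    def get_state(self) -> Dict[str, Any]:
        # Only the pickle-able, mutable state is serialized
        return {
            "population": copy.deepcopy(self.population),
            "random_state": self.random_state.getstate(),
        }

    def clone_from_state(self, state: Dict[str, Any]) -> "EvolutionSearcherSketch":
        # Immutable parts come from self, mutable parts from `state`
        clone = EvolutionSearcherSketch(self.config_space, self.population_size)
        clone.population = state["population"]
        clone.random_state.setstate(state["random_state"])
        return clone
```

After `clone_from_state`, the clone resumes exactly where the serialized searcher left off (same population, same RNG stream), and the original `self` is no longer used.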