syne_tune.optimizer.schedulers.multiobjective.multi_objective_regularized_evolution module

class syne_tune.optimizer.schedulers.multiobjective.multi_objective_regularized_evolution.MultiObjectiveRegularizedEvolution(config_space, points_to_evaluate=None, population_size=100, sample_size=10, multiobjective_priority=None, random_seed=None)[source]

Bases: BaseSearcher

Adapts the regularized evolution algorithm by Real et al. to the multi-objective setting. Elements in the population are scored via a multi-objective priority, which defaults to non-dominated sorting. Parents are sampled from the population based on this score.

Additional arguments on top of parent class BaseSearcher:

Parameters:
  • population_size (int) – Size of the population, defaults to 100

  • sample_size (int) – Size of the candidate set from which a parent is drawn for mutation, defaults to 10

  • multiobjective_priority – Multi-objective priority used to score elements of the population; if not given, non-dominated sorting is used
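
A minimal construction sketch. The hyperparameter names, ranges, and seed below are purely illustrative, and the config space helpers are assumed to come from syne_tune.config_space:

    from syne_tune.config_space import choice, randint, uniform
    from syne_tune.optimizer.schedulers.multiobjective.multi_objective_regularized_evolution import (
        MultiObjectiveRegularizedEvolution,
    )

    # Illustrative search space; any Syne Tune config space works here
    config_space = {
        "learning_rate": uniform(1e-4, 1e-1),
        "num_layers": randint(1, 8),
        "activation": choice(["relu", "tanh"]),
    }

    searcher = MultiObjectiveRegularizedEvolution(
        config_space=config_space,
        population_size=100,  # size of the evolving population (default)
        sample_size=10,  # candidates drawn when selecting a parent for mutation (default)
        random_seed=31415927,
    )

In practice the searcher is driven by a multi-objective scheduler, which calls suggest() and on_trial_complete() as documented below.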

suggest(**kwargs)[source]

Suggest a new configuration.

Note: Implementations should query _next_points_to_evaluate() for initial configs to return first.

Parameters:

kwargs – Extra information may be passed from scheduler to searcher

Return type:

Optional[dict]

Returns:

New configuration. The searcher may return None if a new configuration cannot be suggested; in this case, the tuning will stop. This happens if the searcher never suggests the same config more than once and all configs in the (finite) search space have been exhausted.
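
A standalone sketch of the suggest loop, bypassing the scheduler purely to illustrate the interface; the search space and the initial configuration passed via points_to_evaluate are made up:

    from syne_tune.config_space import randint, uniform
    from syne_tune.optimizer.schedulers.multiobjective.multi_objective_regularized_evolution import (
        MultiObjectiveRegularizedEvolution,
    )

    config_space = {"learning_rate": uniform(1e-4, 1e-1), "num_layers": randint(1, 8)}
    searcher = MultiObjectiveRegularizedEvolution(
        config_space=config_space,
        points_to_evaluate=[{"learning_rate": 0.01, "num_layers": 2}],  # returned first
    )

    for trial_id in range(5):
        config = searcher.suggest()
        if config is None:
            # A finite search space has been exhausted; tuning would stop here
            break
        print(trial_id, config)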

on_trial_complete(trial_id, config, metrics)[source]

Inform searcher about result

The scheduler passes every result. If update == True, the searcher should update its surrogate model (if any); otherwise the result is an intermediate one that is not modelled.

The default implementation calls _update() if update == True. It can be overwritten by searchers that also react to intermediate results.

Parameters:
  • trial_id (int) – See on_trial_result()

  • config (Dict[str, Any]) – See on_trial_result()

  • metrics (List[float]) – See on_trial_result()
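
A sketch of reporting two objectives back to the searcher. The metric values are hypothetical and are passed as a plain list of floats, in the order defined by the scheduler:

    from syne_tune.config_space import randint, uniform
    from syne_tune.optimizer.schedulers.multiobjective.multi_objective_regularized_evolution import (
        MultiObjectiveRegularizedEvolution,
    )

    config_space = {"learning_rate": uniform(1e-4, 1e-1), "num_layers": randint(1, 8)}
    searcher = MultiObjectiveRegularizedEvolution(config_space=config_space)

    config = searcher.suggest()
    validation_error, latency = 0.23, 0.8  # hypothetical results for this trial
    searcher.on_trial_complete(trial_id=0, config=config, metrics=[validation_error, latency])

Each completed trial adds a scored element to the population, from which future parents are sampled and mutated.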