syne_tune.optimizer.schedulers.searchers.cost_aware package

class syne_tune.optimizer.schedulers.searchers.cost_aware.CostAwareGPFIFOSearcher(config_space, metric, points_to_evaluate=None, **kwargs)[source]

Bases: MultiModelGPFIFOSearcher

Gaussian process-based cost-aware hyperparameter optimization (to be used with FIFOScheduler). The searcher requires a cost metric, which is given by cost_attr.

Implements two different variants. If resource_attr is given, cost values are read from each report and cost is modeled as \(c(x, r)\), the cost model being given by kwargs["cost_model"].

If resource_attr is not given, cost values are read only at the end (just like the primary metric) and cost is modeled as \(c(x)\), using a default GP surrogate model.

Note: Whether resource_attr is given determines which of the two variants is used. If resource_attr is given, cost_model must be given as well.

Additional arguments on top of parent class GPFIFOSearcher (a usage sketch follows the parameter list):

Parameters:
  • cost_attr (str) – Mandatory. Name of cost attribute in data obtained from reporter (e.g., elapsed training time). Depending on whether resource_attr is given, cost values are read from each report or only at the end.

  • resource_attr (str, optional) – Name of resource attribute in reports. If this is given, cost values are read from each report and cost is modeled as \(c(x, r)\), the cost model being given by cost_model. If not given, cost values are read only at the end (just like the primary metric) and cost is modeled as \(c(x)\), using a default GP surrogate model.

  • cost_model (CostModel, optional) – Needed if resource_attr is given, model for \(c(x, r)\). Ignored if resource_attr is not given, since \(c(x)\) is represented by a default GP surrogate model.
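
A minimal usage sketch follows. It assumes the searcher is selected through the factory name "bayesopt_cost" on FIFOScheduler, and that the training script reports a metric "accuracy" and a cost "elapsed_time"; these names and the config space are placeholders rather than part of this API.

    from syne_tune.config_space import loguniform, randint
    from syne_tune.optimizer.schedulers import FIFOScheduler

    # Placeholder config space; replace with the hyperparameters of your script
    config_space = {
        "learning_rate": loguniform(1e-4, 1e-1),
        "num_layers": randint(1, 8),
    }

    scheduler = FIFOScheduler(
        config_space,
        searcher="bayesopt_cost",  # assumed factory name for CostAwareGPFIFOSearcher
        metric="accuracy",         # primary metric reported by the training script
        mode="max",
        search_options={
            # Cost metric reported by the training script. Since no
            # "resource_attr" is given here, cost is modeled as c(x)
            # by a default GP surrogate model.
            "cost_attr": "elapsed_time",
        },
    )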

clone_from_state(state)[source]

Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.

Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.

Parameters:

state – See above

Returns:

New searcher object
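
As a sketch of how the two methods fit together (assuming searcher is a configured instance, for example scheduler.searcher, and that the state returned by get_state() is pickle-able):

    import pickle

    # Persist the mutable part of the searcher state ...
    blob = pickle.dumps(searcher.get_state())

    # ... and later re-create an equivalent searcher from it. Per the
    # contract above, `searcher` must not be used after this call.
    restored_searcher = searcher.clone_from_state(pickle.loads(blob))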

class syne_tune.optimizer.schedulers.searchers.cost_aware.CostAwareGPMultiFidelitySearcher(config_space, metric, points_to_evaluate=None, **kwargs)[source]

Bases: MultiModelGPMultiFidelitySearcher

Gaussian process-based cost-aware multi-fidelity hyperparameter optimization (to be used with HyperbandScheduler). The searcher requires a cost metric, which is given by cost_attr.

The acquisition function used here is the same as in GPMultiFidelitySearcher, but expected improvement (EI) is replaced by EIpu (see EIpuAcquisitionFunction).

Cost values are read from each report and cost is modeled as \(c(x, r)\), the cost model being given by kwargs["cost_model"].

Additional arguments on top of parent class GPMultiFidelitySearcher (a usage sketch follows the parameter list):

Parameters:
  • cost_attr (str) – Mandatory. Name of cost attribute in data obtained from reporter (e.g., elapsed training time). Cost values are read from each report.

  • resource_attr (str) – Name of resource attribute in reports. Cost values are read from each report and cost is modeled as \(c(x, r)\), the cost model being given by cost_model.

  • cost_model (CostModel, optional) – Model for \(c(x, r)\).
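
A minimal sketch of wiring this searcher into HyperbandScheduler follows. The factory name "bayesopt_cost", the metric and attribute names, and the config space are assumptions; the cost model is a placeholder that must be replaced by a concrete CostModel implementation.

    from syne_tune.config_space import loguniform, randint
    from syne_tune.optimizer.schedulers import HyperbandScheduler

    # Placeholder config space; replace with the hyperparameters of your script
    config_space = {
        "learning_rate": loguniform(1e-4, 1e-1),
        "num_layers": randint(1, 8),
        "epochs": 27,  # maximum resource level (placeholder)
    }

    my_cost_model = ...  # placeholder: a CostModel implementation for c(x, r)

    scheduler = HyperbandScheduler(
        config_space,
        searcher="bayesopt_cost",  # assumed factory name for CostAwareGPMultiFidelitySearcher
        metric="accuracy",
        mode="max",
        resource_attr="epoch",       # resource level reported at each step
        max_resource_attr="epochs",
        search_options={
            "cost_attr": "elapsed_time",  # cost metric reported at each resource level
            "resource_attr": "epoch",
            "cost_model": my_cost_model,  # model for c(x, r)
        },
    )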

clone_from_state(state)[source]

Together with get_state(), this is needed in order to store and re-create the mutable state of the searcher.

Given state as returned by get_state(), this method combines the non-pickle-able part of the immutable state from self with state and returns the corresponding searcher clone. Afterwards, self is not used anymore.

Parameters:

state – See above

Returns:

New searcher object

Submodules