Installation

To install Syne Tune via pip, simply run:

pip install 'syne-tune[basic]'

For development, you need to install Syne Tune from source:

git clone https://github.com/awslabs/syne-tune.git
cd syne-tune
python3 -m venv st_venv
. st_venv/bin/activate
pip install --upgrade pip
pip install -e '.[basic,dev]'

This installs Syne Tune into the virtual environment st_venv. Remember to activate this environment before working with Syne Tune. We also recommend rebuilding the virtual environment from scratch every now and then, in particular when you pull a new release, since dependencies may have changed.

See our change log to check what has changed in the latest version.

In the examples above, Syne Tune is installed with the tag basic, which collects a reasonable number of dependencies. If you want to install all dependencies, replace basic with extra. You can further refine this selection by using partial dependencies.
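For example, installing with all dependencies mirrors the pip command above:

pip install 'syne-tune[extra]'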

What Is Hyperparameter Optimization?

Here is an introduction to hyperparameter optimization in the context of deep learning, which uses Syne Tune for some examples.

First Example

To enable tuning, you have to report metrics from a training script so that they can be communicated to Syne Tune. This can be accomplished by simply calling report(epoch=epoch, loss=loss), as shown in this example:
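A minimal sketch of such a script is shown below. It is not the original example: the file name train_height.py, the command-line hyperparameters width and height, and the dummy loss are illustrative placeholders; report is a syne_tune.Reporter callback.

# train_height.py -- illustrative training script (file name, hyperparameters
# width/height, and the dummy loss are placeholders, not part of Syne Tune)
import logging
import time
from argparse import ArgumentParser

from syne_tune import Reporter

if __name__ == "__main__":
    logging.getLogger().setLevel(logging.INFO)
    parser = ArgumentParser()
    parser.add_argument("--epochs", type=int, required=True)
    parser.add_argument("--width", type=float, required=True)
    parser.add_argument("--height", type=float, required=True)
    args, _ = parser.parse_known_args()

    report = Reporter()
    for step in range(args.epochs):
        time.sleep(0.1)  # stands in for one epoch of real training
        loss = 1.0 / (0.1 + args.width * step / 100) + args.height * 0.1
        # Feed the current epoch and loss back to Syne Tune
        report(epoch=step + 1, loss=loss)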

Once you have annotated your training script in this way, you can launch a tuning experiment as follows:
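The following is a hedged sketch of such a launcher rather than the original code: it wires the illustrative script above into a LocalBackend, an ASHA scheduler from syne_tune.optimizer.baselines, and a Tuner with a wall-clock stopping criterion. Exact constructor arguments may vary slightly between Syne Tune versions.

# launch_height.py -- illustrative launcher for the training script sketched above
from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import randint
from syne_tune.optimizer.baselines import ASHA

# Hyperparameter search space; "epochs" is passed through as a constant
config_space = {
    "epochs": 100,
    "width": randint(1, 20),
    "height": randint(1, 20),
}

entry_point = "train_height.py"  # path to the annotated training script

tuner = Tuner(
    trial_backend=LocalBackend(entry_point=entry_point),
    scheduler=ASHA(
        config_space,
        metric="loss",
        mode="min",
        resource_attr="epoch",
        max_resource_attr="epochs",
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=30),
    n_workers=4,  # number of trials evaluated in parallel
)
tuner.run()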

This example runs ASHA asynchronously with n_workers=4 parallel workers for max_wallclock_time=30 seconds on the local machine it is called on (trial_backend=LocalBackend(entry_point=entry_point)).

Supported HPO Methods

The following hyperparameter optimization (HPO) methods are available in Syne Tune:

| Method | Reference | Searcher | Asynchronous? | Multi-fidelity? | Transfer? |
|---|---|---|---|---|---|
| Grid Search | | deterministic | yes | no | no |
| Random Search | Bergstra, et al. (2011) | random | yes | no | no |
| Bayesian Optimization | Snoek, et al. (2012) | model-based | yes | no | no |
| BORE | Tiao, et al. (2021) | model-based | yes | no | no |
| MedianStoppingRule | Golovin, et al. (2017) | any | yes | yes | no |
| SyncHyperband | Li, et al. (2018) | random | no | yes | no |
| SyncBOHB | Falkner, et al. (2018) | model-based | no | yes | no |
| SyncMOBSTER | Klein, et al. (2020) | model-based | no | yes | no |
| ASHA | Li, et al. (2019) | random | yes | yes | no |
| BOHB | Falkner, et al. (2018) | model-based | yes | yes | no |
| MOBSTER | Klein, et al. (2020) | model-based | yes | yes | no |
| DEHB | Awad, et al. (2021) | evolutionary | no | yes | no |
| HyperTune | Li, et al. (2022) | model-based | yes | yes | no |
| DyHPO * | Wistuba, et al. (2022) | model-based | yes | yes | no |
| ASHABORE | Tiao, et al. (2021) | model-based | yes | yes | no |
| PASHA | Bohdal, et al. (2022) | random | yes | yes | no |
| REA | Real, et al. (2019) | evolutionary | yes | no | no |
| KDE | Falkner, et al. (2018) | model-based | yes | no | no |
| PopulationBasedTraining | Jaderberg, et al. (2017) | evolutionary | no | yes | no |
| ZeroShotTransfer | Wistuba, et al. (2015) | deterministic | yes | no | yes |
| ASHA-CTS (ASHACTS) | Salinas, et al. (2021) | random | yes | yes | yes |
| RUSH (RUSHScheduler) | Zappella, et al. (2021) | random | yes | yes | yes |
| BoundingBox | Perrone, et al. (2019) | any | yes | yes | yes |

*: We implement the model-based scheduling logic of DyHPO, but use the same Gaussian process surrogate models as MOBSTER and HyperTune. The original source code for the paper is here.

The searchers fall into four broad categories: deterministic, random, evolutionary, and model-based. The random searchers sample candidate hyperparameter configurations uniformly at random, while the model-based searchers sample them non-uniformly at random, according to a model (e.g., Gaussian process, density ratio estimator) and an acquisition function. The evolutionary searchers make use of an evolutionary algorithm.

Syne Tune also supports BoTorch searchers, see BoTorch.
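As a rough illustration of the shared interface (a sketch that reuses the config_space and metric names from the first example above; exact constructor arguments may vary between Syne Tune versions), single-fidelity and multi-fidelity baselines from the table are typically instantiated as follows:

# Sketch: most baselines follow the same constructor pattern
from syne_tune.config_space import randint
from syne_tune.optimizer.baselines import BayesianOptimization, MOBSTER

config_space = {"epochs": 100, "width": randint(1, 20), "height": randint(1, 20)}

# Single-fidelity, model-based (Gaussian process surrogate)
bo_scheduler = BayesianOptimization(config_space, metric="loss", mode="min")

# Multi-fidelity, model-based: same pattern plus resource attributes
mobster_scheduler = MOBSTER(
    config_space,
    metric="loss",
    mode="min",
    resource_attr="epoch",
    max_resource_attr="epochs",
)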

Supported Multi-objective Optimization Methods

| Method | Reference | Searcher | Asynchronous? | Multi-fidelity? | Transfer? |
|---|---|---|---|---|---|
| ConstrainedBayesianOptimization | Gardner, et al. (2014) | model-based | yes | no | no |
| MOASHA | Schmucker, et al. (2021) | random | yes | yes | no |
| NSGA2 | Deb, et al. (2002) | evolutionary | no | no | no |
| MORandomScalarizationBayesOpt | Paria, et al. (2018) | model-based | yes | no | no |
| MOLinearScalarizationBayesOpt | | model-based | yes | no | no |

The HPO methods listed above can also be used in a multi-objective setting, either by scalarization (LinearScalarizationPriority) or by non-dominated sorting (NonDominatedPriority).

Security

See CONTRIBUTING for more information.

Citing Syne Tune

If you use Syne Tune in a scientific publication, please cite the following paper:

Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research

@inproceedings{
    salinas2022syne,
    title = {{Syne Tune}: A Library for Large Scale Hyperparameter Tuning and Reproducible Research},
    author = {David Salinas and Matthias Seeger and Aaron Klein and Valerio Perrone and Martin Wistuba and Cedric Archambeau},
    booktitle = {International Conference on Automated Machine Learning, AutoML 2022},
    year = {2022},
    url = {https://proceedings.mlr.press/v188/salinas22a.html}
}

License

This project is licensed under the Apache-2.0 License.