Installation
To install Syne Tune via pip, simply run:

```bash
pip install 'syne-tune[basic]'
```
For development, you need to install Syne Tune from source:

```bash
git clone https://github.com/awslabs/syne-tune.git
cd syne-tune
python3 -m venv st_venv
. st_venv/bin/activate
pip install --upgrade pip
pip install -e '.[basic,dev]'
```
This installs Syne Tune in a virtual environment `st_venv`. Remember to activate this environment before working with Syne Tune. We also recommend rebuilding the virtual environment from scratch now and then, in particular when you pull a new release, as dependencies may have changed.
See our change log to check what has changed in the latest version.
In the examples above, Syne Tune is installed with the tag `basic`, which collects a reasonable number of dependencies. If you want to install all dependencies, replace `basic` with `extra`. You can further refine this selection by using partial dependencies.
What Is Hyperparameter Optimization?
Here is an introduction to hyperparameter optimization in the context of deep learning, which uses Syne Tune for some examples.
First Example
To enable tuning, you have to report metrics from a training script so that they can be communicated later to Syne Tune. This can be accomplished by simply calling `report(epoch=epoch, loss=loss)`, as shown in this example:
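The sketch below shows what such an annotated script can look like. The hyperparameters (`lr`, `batch_size`) and the dummy loss computation are illustrative placeholders for your own training loop; only the `Reporter` call is the part Syne Tune needs.

```python
# train.py -- minimal sketch of a training script annotated for Syne Tune
from argparse import ArgumentParser

from syne_tune import Reporter

if __name__ == "__main__":
    parser = ArgumentParser()
    parser.add_argument("--epochs", type=int, default=100)
    parser.add_argument("--lr", type=float, default=1e-3)
    parser.add_argument("--batch_size", type=int, default=32)
    args, _ = parser.parse_known_args()

    report = Reporter()
    for epoch in range(1, args.epochs + 1):
        # ... train for one epoch, then compute the validation loss ...
        loss = 1.0 / (epoch * args.lr * args.batch_size)  # placeholder value
        # Report the current epoch and metric back to Syne Tune
        report(epoch=epoch, loss=loss)
```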
Once you have annotated your training script in this way, you can launch a tuning experiment as follows:
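A minimal sketch of such a launcher script, assuming the annotated training script above is stored as `train.py` (the search space is illustrative and should match the arguments your script accepts):

```python
# launch_tuning.py -- tune the script above with ASHA on the local machine
from syne_tune import StoppingCriterion, Tuner
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import ASHA

entry_point = "train.py"  # the annotated training script from above

# Hyperparameter search space to consider (illustrative)
config_space = {
    "epochs": 100,  # maximum number of epochs per trial
    "lr": loguniform(1e-5, 1e-1),
    "batch_size": randint(16, 128),
}

tuner = Tuner(
    trial_backend=LocalBackend(entry_point=entry_point),
    scheduler=ASHA(
        config_space,
        metric="loss",
        mode="min",
        resource_attr="epoch",
        max_resource_attr="epochs",
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=30),
    n_workers=4,  # number of trials evaluated in parallel
)
tuner.run()
```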
This example runs ASHA with four asynchronously parallel workers (`n_workers=4`) for `max_wallclock_time=30` seconds on the local machine it is called on (`trial_backend=LocalBackend(entry_point=entry_point)`).
Supported HPO Methods
The following hyperparameter optimization (HPO) methods are available in Syne Tune:
| Method | Searcher | Asynchronous? | Multi-fidelity? | Transfer? |
|---|---|---|---|---|
| Grid Search | deterministic | yes | no | no |
| Random Search | random | yes | no | no |
| Bayesian Optimization | model-based | yes | no | no |
| BORE | model-based | yes | no | no |
| MedianStoppingRule | any | yes | yes | no |
| SyncHyperband | random | no | yes | no |
| SyncBOHB | model-based | no | yes | no |
| SyncMOBSTER | model-based | no | yes | no |
| ASHA | random | yes | yes | no |
| BOHB | model-based | yes | yes | no |
| MOBSTER | model-based | yes | yes | no |
| DEHB | evolutionary | no | yes | no |
| HyperTune | model-based | yes | yes | no |
| DyHPO * | model-based | yes | yes | no |
| ASHABORE | model-based | yes | yes | no |
| PASHA | random | yes | yes | no |
| REA | evolutionary | yes | no | no |
| KDE | model-based | yes | no | no |
| PBT | evolutionary | no | yes | no |
| ZeroShotTransfer | deterministic | yes | no | yes |
| ASHA-CTS | random | yes | yes | yes |
| RUSH | random | yes | yes | yes |
| TransferBO | any | yes | yes | yes |
*: We implement the model-based scheduling logic of DyHPO, but use the same Gaussian process surrogate models as MOBSTER and HyperTune. The original source code for the paper is here.
The searchers fall into four broad categories: deterministic, random, evolutionary, and model-based. The random searchers sample candidate hyperparameter configurations uniformly at random, while the model-based searchers sample them non-uniformly at random, according to a model (e.g., Gaussian process, density ratio estimator) and an acquisition function. The evolutionary searchers make use of an evolutionary algorithm.
Syne Tune also supports BoTorch searchers; see BoTorch.
Supported Multi-objective Optimization Methods
| Method | Searcher | Asynchronous? | Multi-fidelity? | Transfer? |
|---|---|---|---|---|
| Constrained Bayesian Optimization | model-based | yes | no | no |
| MOASHA | random | yes | yes | no |
| NSGA-2 | evolutionary | no | no | no |
| Multi Objective Multi Surrogate (MSMOS) | model-based | yes | no | no |
| MSMOS with random scalarization | model-based | yes | no | no |
The HPO methods listed above can also be used in a multi-objective setting by scalarization (`LinearScalarizationPriority`) or non-dominated sorting (`NonDominatedPriority`).
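As a conceptual illustration (this is not the Syne Tune API), linear scalarization reduces several objectives to a single score that any single-objective HPO method can then rank; the objective names and weights below are hypothetical:

```python
# Conceptual sketch of linear scalarization (not the Syne Tune API):
# combine several objectives into one score with fixed weights.
weights = {"validation_error": 0.7, "latency": 0.3}  # hypothetical objectives

def scalarize(metrics: dict) -> float:
    # Lower is better for both objectives in this sketch
    return sum(weights[name] * value for name, value in metrics.items())

print(scalarize({"validation_error": 0.08, "latency": 12.5}))
```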
Security
See CONTRIBUTING for more information.
Citing Syne Tune
If you use Syne Tune in a scientific publication, please cite the following paper:
Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research
```bibtex
@inproceedings{salinas2022syne,
    title = {{Syne Tune}: A Library for Large Scale Hyperparameter Tuning and Reproducible Research},
    author = {David Salinas and Matthias Seeger and Aaron Klein and Valerio Perrone and Martin Wistuba and Cedric Archambeau},
    booktitle = {International Conference on Automated Machine Learning, AutoML 2022},
    year = {2022},
    url = {https://proceedings.mlr.press/v188/salinas22a.html}
}
```
License
This project is licensed under the Apache-2.0 License.