This is a very interesting paper which presents SHERPA, a Python library for hyperparameter tuning of machine learning models. With SHERPA, researchers can easily optimize hyperparameters using a variety of powerful, interchangeable algorithms.
- Hyperparameter optimization for machine learning researchers.
- It can be used with any Python machine learning library, such as Keras, TensorFlow, PyTorch, or scikit-learn.
- A choice of hyperparameter optimization algorithms, such as Bayesian optimization via GPyOpt, Asynchronous Successive Halving (ASHA), and Population Based Training, each accompanied by an example notebook.
- Parallel computation that can be tailored to the user's needs.
- A live dashboard for the exploratory analysis of results.
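Under the hood, hyperparameter tuners like SHERPA follow a suggest/observe loop: the chosen algorithm proposes a trial configuration, the user's training code evaluates it and reports the objective back, and the best setting is tracked across trials. The sketch below illustrates that pattern in plain Python with random search over a learning rate; the log-uniform range, trial count, and toy objective are illustrative assumptions, not SHERPA's actual API.

```python
import random

def objective(lr):
    # Toy stand-in for a validation loss, minimized near lr = 0.01.
    # In practice this would train and evaluate a real model.
    return (lr - 0.01) ** 2

def random_search(num_trials=20, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(num_trials):
        # "Suggest" step: the algorithm proposes a hyperparameter setting,
        # here sampled log-uniformly over [1e-4, 1e-1].
        lr = 10 ** rng.uniform(-4, -1)
        # "Observe" step: evaluate the setting and record the objective.
        loss = objective(lr)
        if best is None or loss < best[1]:
            best = (lr, loss)
    return best

best_lr, best_loss = random_search()
print(best_lr, best_loss)
```

The more advanced algorithms listed above (Bayesian optimization, ASHA, Population Based Training) replace the random "suggest" step with smarter proposals or early stopping, while the overall loop stays the same.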
Download Paper: https://arxiv.org/pdf/2005.04048.pdf