Economical Hyperparameter Optimization with Blended Search Strategy
- Chi Wang,
- Qingyun Wu,
- Silu Huang,
- Amin Saied
The Ninth International Conference on Learning Representations (ICLR 2021)
We study the problem of searching for good hyperparameter configurations at low cost in a large search space where evaluation cost and model quality vary across configurations. We propose a blended search strategy that combines the strengths of global and local search and prioritizes them on the fly, with the goal of minimizing the total cost spent in finding good configurations. Our approach demonstrates robust performance for tuning both tree-based models and deep neural networks on a large AutoML benchmark, as well as superior performance in model quality, time, and resource consumption on a production transformer-based NLP model fine-tuning task.
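As a rough illustration of how this cost-aware search can be driven in practice, the sketch below uses FLAML's `tune.run` API, which implements the blended (BlendSearch) strategy. The objective function, the two hyperparameters, and the `low_cost_partial_config` starting point are illustrative assumptions, not details taken from the paper.

```python
from flaml import tune


def evaluate(config):
    """Hypothetical objective: train a model with `config` and report validation loss.

    Larger n_estimators is assumed to be more expensive to evaluate,
    so the searcher should start from the cheap end of the space.
    """
    n_estimators = config["n_estimators"]
    learning_rate = config["learning_rate"]
    # Stand-in for a real training run; replace with actual training/validation.
    val_loss = 1.0 / n_estimators + abs(learning_rate - 0.1)
    return {"val_loss": val_loss}


analysis = tune.run(
    evaluate,
    config={
        "n_estimators": tune.lograndint(lower=4, upper=1000),
        "learning_rate": tune.loguniform(lower=1e-3, upper=1.0),
    },
    metric="val_loss",
    mode="min",
    # Hint to the searcher: the cheapest region of the space to start from.
    low_cost_partial_config={"n_estimators": 4},
    time_budget_s=60,  # total wall-clock search budget in seconds
    num_samples=-1,    # no cap on trials; stop when the time budget is spent
)
print(analysis.best_config, analysis.best_result)
```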
Paper and Publication Downloads
FLAML: A Fast Library for AutoML and Tuning
December 15, 2020
FLAML is a Python library designed to automatically produce accurate machine learning models with low computational cost. It frees users from selecting learners and hyperparameters for each learner. FLAML is powered by a new, cost-effective hyperparameter optimization and learner selection method invented by Microsoft Research.
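As a minimal sketch of the intended workflow (the dataset and budget below are placeholders, not part of FLAML's documentation), the `AutoML` class can be pointed at a training set and a time budget and left to choose the learner and its hyperparameters:

```python
from flaml import AutoML
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Placeholder dataset; any tabular classification data works the same way.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = AutoML()
automl.fit(
    X_train,
    y_train,
    task="classification",  # FLAML selects the learner and its hyperparameters
    time_budget=60,         # total search budget in seconds
    metric="accuracy",
)
print(automl.best_estimator, automl.best_config)
print("test accuracy:", (automl.predict(X_test) == y_test).mean())
```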