Frugal Optimization for Cost-related Hyperparameters
- Qingyun Wu,
- Chi Wang,
- Silu Huang
The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)
The increasing demand for democratizing machine learning algorithms calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters that can cause large variation in the training cost. This effect is largely ignored by existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a new cost-frugal HPO solution. The core of our solution is a simple but new randomized direct-search method, for which we prove a convergence rate of \(O(\frac{\sqrt{d}}{\sqrt{K}})\) and an \(O(d\epsilon^{-2})\)-approximation guarantee on the total cost. We provide strong empirical results in comparison with state-of-the-art HPO methods on large AutoML benchmarks.
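To make the idea concrete, here is a minimal sketch of a randomized direct-search loop of the kind the abstract refers to: at each iteration it probes a random direction and its opposite, moves on improvement, and shrinks the step size otherwise. This is an illustrative simplification, not the paper's exact method or its cost-control mechanism; all function names are hypothetical.

```python
import math
import random

def random_unit_vector(d, rng):
    """Sample a direction uniformly from the unit sphere in R^d."""
    v = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def randomized_direct_search(f, x0, step=1.0, shrink=0.5, iters=200, seed=0):
    """Minimize f by probing a random direction u and its opposite -u each
    iteration; accept a move that improves f, otherwise shrink the step."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        u = random_unit_vector(len(x), rng)
        improved = False
        for sign in (1.0, -1.0):
            cand = [xi + sign * step * ui for xi, ui in zip(x, u)]
            fc = f(cand)
            if fc < fx:
                x, fx, improved = cand, fc, True
                break
        if not improved:
            step *= shrink  # no progress along +/- u: reduce the search radius
    return x, fx
```

On a smooth objective such as a quadratic, the loop walks toward the minimizer in large steps while far away and automatically refines the step size as it closes in, which is the behavior the convergence analysis formalizes.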
Publication Downloads
FLAML: A Fast Library for AutoML and Tuning
December 15, 2020
FLAML is a Python library designed to automatically produce accurate machine learning models with low computational cost. It frees users from selecting learners and hyperparameters for each learner. FLAML is powered by a new, cost-effective hyperparameter optimization and learner selection method invented by Microsoft Research.