Meta-hyperband: Hyperparameter Optimization with Meta-Learning and Coarse-to-Fine

Authors: S. Payrosangari, A. Sadeghi, D. Graux, J. Lehmann
Venue: IDEAL 2020: Intelligent Data Engineering and Automated Learning
Year: 2020

Citation information

S. Payrosangari, A. Sadeghi, D. Graux, J. Lehmann,
"Meta-hyperband: Hyperparameter Optimization with Meta-Learning and Coarse-to-Fine,"
IDEAL 2020: Intelligent Data Engineering and Automated Learning,
2020, pp. 335–347,
https://link.springer.com/chapter/10.1007%2F978-3-030-62365-4_32

Hyperparameter optimization is one of the main pillars of machine learning algorithms. In this paper, we introduce Meta-Hyperband: a Hyperband-based algorithm that improves hyperparameter optimization by adding levels of exploitation. Unlike the Hyperband method, which is a pure-exploration bandit-based approach to hyperparameter optimization, our meta approach strikes a trade-off between exploration and exploitation by combining Hyperband with meta-learning and Coarse-to-Fine modules. We analyze the performance of Meta-Hyperband on various datasets to tune the hyperparameters of CNNs and SVMs. The experiments indicate that in many cases Meta-Hyperband discovers higher-quality hyperparameter configurations than Hyperband while using a similar amount of resources. In particular, we discovered a CNN configuration for classifying the CIFAR10 dataset that achieves 3% higher accuracy than the configuration found by Hyperband, and is also 0.3% more accurate than the best reported configuration from the Bayesian optimization approach. Additionally, we release a publicly available pool of historically well-performing configurations on several datasets for CNNs and SVMs to ease the adoption of Meta-Hyperband.
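To make the abstract's ideas concrete, the following is a minimal sketch (not the paper's implementation) of the two ingredients it describes: a Hyperband-style successive-halving bracket that allocates growing budgets to surviving configurations, and a hypothetical seeding step that mixes random exploration samples with configurations drawn from a pool of historically good settings, approximating the meta-learning component. The function names, the `eta` halving rate, and the `exploit_frac` parameter are illustrative assumptions, not the paper's API.

```python
import random


def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """One Hyperband-style bracket (sketch): score every configuration at the
    current budget, keep only the top 1/eta, multiply the budget by eta,
    and repeat until a single configuration survives."""
    budget = min_budget
    while len(configs) > 1:
        # evaluate(config, budget) should return a score where higher is better
        scored = sorted(configs, key=lambda c: evaluate(c, budget), reverse=True)
        keep = max(1, len(configs) // eta)
        configs = scored[:keep]
        budget *= eta
    return configs[0]


def seed_configurations(sample_config, history_pool, n, exploit_frac=0.5):
    """Hypothetical meta-learning seeding step: fill part of the initial
    population from a pool of historically well-performing configurations
    (exploitation) and the rest by random sampling (exploration)."""
    n_exploit = min(int(n * exploit_frac), len(history_pool))
    seeds = random.sample(history_pool, n_exploit)
    seeds += [sample_config() for _ in range(n - len(seeds))]
    return seeds


if __name__ == "__main__":
    # Toy example: a 1-D "hyperparameter" whose true optimum is 0.7;
    # the budget argument is ignored by this toy objective.
    pool = [0.65, 0.72, 0.8]                      # stand-in history pool
    candidates = seed_configurations(random.random, pool, n=9)
    best = successive_halving(candidates, lambda c, b: -(c - 0.7) ** 2)
    print(best)
```

In a real run, `evaluate` would train the CNN or SVM for `budget` epochs (or on a `budget`-sized data subset) and return validation accuracy; the toy objective here only illustrates the control flow.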