Meta-Hyperband: Hyperparameter Optimization with Meta-Learning and Coarse-to-Fine

Hyperparameter optimization is one of the main pillars of machine learning algorithms. In this paper, we introduce Meta-Hyperband: a Hyperband-based algorithm that improves hyperparameter optimization by adding levels of exploitation. Unlike the Hyperband method, which is a purely exploratory bandit-based approach to hyperparameter optimization, our meta approach trades off exploration and exploitation by combining Hyperband with meta-learning and Coarse-to-Fine modules. We analyze the performance of Meta-Hyperband on various datasets to tune the hyperparameters of CNNs and SVMs. The experiments indicate that in many cases Meta-Hyperband discovers hyperparameter configurations of higher quality than Hyperband while using similar amounts of resources. In particular, we discovered a CNN configuration for classifying the CIFAR10 dataset which has 3% higher performance than the configuration found by Hyperband, and is also 0.3% more accurate than the best reported configuration of the Bayesian optimization approach. Additionally, we release a publicly available pool of historically well-performing configurations on several datasets for CNNs and SVMs to ease the adoption of Meta-Hyperband.
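
The abstract above only sketches the idea at a high level; the paper's exact procedure is not reproduced here. The snippet below is a minimal, hypothetical Python sketch of one plausible reading of it: a Hyperband-style successive-halving bracket in which part of the initial configurations are drawn at random (exploration) and part are perturbed copies of historically well-performing configurations from a pool (meta-learning plus Coarse-to-Fine exploitation). All function names, the `exploit_ratio` parameter, and the toy objective are assumptions for illustration, not the authors' implementation.

```python
import random

def sample_random_config():
    # Exploration: draw a configuration uniformly from the search space (assumed space).
    return {
        "lr": 10 ** random.uniform(-4, -1),
        "batch_size": random.choice([32, 64, 128, 256]),
        "weight_decay": 10 ** random.uniform(-6, -3),
    }

def perturb_config(config, scale=0.2):
    # Coarse-to-Fine step (assumed): sample in a small neighbourhood of a good configuration.
    new = dict(config)
    new["lr"] = config["lr"] * 10 ** random.uniform(-scale, scale)
    new["weight_decay"] = config["weight_decay"] * 10 ** random.uniform(-scale, scale)
    return new

def evaluate(config, budget):
    # Placeholder objective: stands in for training a CNN/SVM for `budget` epochs
    # and returning a validation loss. Replace with real training code.
    return (abs(config["lr"] - 0.01) + config["weight_decay"]) / budget

def meta_successive_halving(pool, n=27, max_budget=81, eta=3, exploit_ratio=0.3):
    # One Hyperband-style bracket, warm-started from a pool of historically
    # well-performing configurations (meta-learning) and refined around them.
    n_exploit = int(n * exploit_ratio)
    configs = [perturb_config(random.choice(pool)) for _ in range(n_exploit)]
    configs += [sample_random_config() for _ in range(n - n_exploit)]

    budget = max_budget // (eta ** 3)  # start each configuration on a small budget
    while configs:
        scores = sorted((evaluate(c, budget), c) for c in configs)
        keep = max(len(configs) // eta, 1)  # successive halving: keep the top 1/eta
        configs = [c for _, c in scores[:keep]]
        if budget >= max_budget:
            break
        budget *= eta
    return configs[0]

if __name__ == "__main__":
    historical_pool = [{"lr": 0.01, "batch_size": 128, "weight_decay": 1e-4}]
    print("best configuration:", meta_successive_halving(historical_pool))
```

In this sketch, setting `exploit_ratio=0` recovers plain successive halving as used inside Hyperband, while larger values bias the bracket toward the historical pool, which is the exploration/exploitation trade-off the abstract refers to.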

  • Published in:
    IDEAL 2020: International Conference on Intelligent Data Engineering and Automated Learning (IDEAL)
  • Type:
    Inproceedings
  • Authors:
    S. Payrosangari, A. Sadeghi, D. Graux, J. Lehmann
  • Year:
    2020

Citation information

S. Payrosangari, A. Sadeghi, D. Graux, J. Lehmann: Meta-Hyperband: Hyperparameter Optimization with Meta-Learning and Coarse-to-Fine. In: International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2020), 2020. https://doi.org/10.1007/978-3-030-62365-4_32