HyperUCB: Hyperparameter Optimization Using Contextual Bandits

Setting the optimal hyperparameters of a learning algorithm is a crucial task. Common approaches such as a grid search over the hyperparameter space or random sampling of hyperparameters require many configurations to be evaluated in order to perform well. Hence, they either yield suboptimal hyperparameter configurations or are expensive in terms of computational resources. As a remedy, Hyperband, an exploratory bandit-based algorithm, introduces an early-stopping strategy to quickly provide competitive configurations within a given resource budget, and it often outperforms Bayesian optimization approaches. However, Hyperband keeps sampling configurations i.i.d. for assessment without taking previous evaluations into account. We propose HyperUCB, a UCB extension of Hyperband which assesses the sampled configurations and only evaluates the promising ones. We compare our approach against Hyperband on MNIST data and show that we perform better in most cases.
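
The following is a minimal sketch of the general idea described in the abstract: score i.i.d.-sampled configurations with a linear UCB (LinUCB-style) model and only pass the highest-scoring ones on for actual training. The class `LinUCBScorer`, its parameters, and the synthetic reward are illustrative assumptions for exposition, not the paper's implementation.

    import numpy as np

    class LinUCBScorer:
        """Linear UCB scorer over hyperparameter configuration vectors.

        Keeps a ridge-regression estimate of validation performance plus an
        exploration bonus derived from the inverse design matrix (LinUCB-style).
        """

        def __init__(self, dim, alpha=1.0):
            self.alpha = alpha        # exploration strength (assumed value)
            self.A = np.eye(dim)      # regularized design matrix
            self.b = np.zeros(dim)    # accumulated response vector

        def ucb(self, x):
            """Upper confidence bound for a configuration encoded as vector x."""
            A_inv = np.linalg.inv(self.A)
            theta = A_inv @ self.b    # current estimate of the reward model
            return theta @ x + self.alpha * np.sqrt(x @ A_inv @ x)

        def update(self, x, reward):
            """Incorporate the observed validation score of an evaluated config."""
            self.A += np.outer(x, x)
            self.b += reward * x

    def select_promising(scorer, configs, k):
        """Keep only the k configurations with the highest UCB scores."""
        scores = [scorer.ucb(x) for x in configs]
        top = np.argsort(scores)[-k:]
        return [configs[i] for i in top]

    # Toy usage: sample random 3-d configurations, keep the top 4 by UCB,
    # "evaluate" them with a synthetic reward, and update the bandit model.
    rng = np.random.default_rng(0)
    scorer = LinUCBScorer(dim=3, alpha=1.0)
    configs = [rng.uniform(size=3) for _ in range(16)]
    for x in select_promising(scorer, configs, k=4):
        reward = -np.sum((x - 0.5) ** 2)   # stand-in for validation accuracy
        scorer.update(x, reward)

In a Hyperband-style loop, the filtering step would be applied to each batch of sampled configurations before allocating training resources, so that previous evaluations inform which samples are worth running at all.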

  • Published in:
    Machine Learning and Knowledge Discovery in Databases, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2019)
  • Type:
    Inproceedings
  • Authors:
    M. Tavakol, S. Mair, K. Morik
  • Year:
    2019

Citation information

M. Tavakol, S. Mair, K. Morik: HyperUCB: Hyperparameter Optimization Using Contextual Bandits. In: Machine Learning and Knowledge Discovery in Databases, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2019), 2019. https://doi.org/10.1007/978-3-030-43823-4_4