Towards Structural Hyperparameter Search in Kernel Minimum Enclosing Balls

In this paper we present a structural methodology for informed hyperparameter search when fitting Kernel Minimum Enclosing Balls (KMEBs) to datasets. Our approach allows us to control the number of resulting support vectors, which can be an important consideration in practical applications. We focus in particular on searching the width of the Gaussian kernel and introduce two methods based on Greedy Exponential Search (GES) and Divide and Conquer (DaC). In case of non-convergence, both algorithms return an approximate result: the width value corresponding to the closest attainable bound on the number of support vectors. We evaluate our methods on standard benchmark datasets for prototype extraction, using a Frank-Wolfe algorithm to fit the balls, and identify distance choices that yield descriptive results. Moreover, we compare the number of executions of the fitting algorithm and the number of iterations our methods required to converge.
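To make the idea concrete, the following is a minimal sketch of one plausible reading of the DaC variant: a Frank-Wolfe solver for the KMEB dual, and a bisection over the Gaussian width that targets a desired support-vector count. All function names, the support-vector threshold, the search interval, and the interpretation of DaC as geometric bisection are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np


def gaussian_kernel(X, sigma):
    """Gaussian (RBF) kernel matrix with width sigma (assumed parameterization)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))


def fit_kmeb(K, n_iter=500):
    """Frank-Wolfe on the KMEB dual: minimise a^T K a over the simplex
    (for a Gaussian kernel diag(K) = 1, so this maximises 1 - a^T K a).
    Returns the dual weight vector a."""
    n = K.shape[0]
    a = np.zeros(n)
    a[0] = 1.0  # start at a single vertex of the simplex
    for t in range(n_iter):
        # the point farthest from the current centre minimises (K a)_i
        i = int(np.argmin(K @ a))
        gamma = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
        a *= 1.0 - gamma
        a[i] += gamma
    return a


def num_support_vectors(a, thresh=1e-3):
    """Points with non-negligible dual weight lie on the ball surface."""
    return int(np.sum(a > thresh))


def dac_width_search(X, target_sv, lo=1e-2, hi=1e2, max_steps=25):
    """Bisect (divide and conquer) over the kernel width: assuming the
    support-vector count shrinks as sigma grows, halve the interval
    until the target count is hit; on non-convergence, return the width
    whose count came closest to the target."""
    best_sigma, best_gap, best_sv = None, np.inf, 0
    for _ in range(max_steps):
        sigma = np.sqrt(lo * hi)  # geometric midpoint of the interval
        sv = num_support_vectors(fit_kmeb(gaussian_kernel(X, sigma)))
        if abs(sv - target_sv) < best_gap:
            best_sigma, best_gap, best_sv = sigma, abs(sv - target_sv), sv
        if sv == target_sv:
            return sigma, sv
        if sv > target_sv:
            lo = sigma  # too many support vectors -> widen the kernel
        else:
            hi = sigma  # too few -> narrow it
    return best_sigma, best_sv
```

The geometric midpoint is used because a kernel width typically spans several orders of magnitude; a very narrow kernel makes all points nearly equidistant in feature space (many support vectors), while a very wide one leaves only a few boundary points on the ball surface.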

  • Published in:
    Learning and Intelligent Optimization (LION)
  • Type:
    Inproceedings
  • Authors:
    H. Kondratiuk, R. Sifa
  • Year:
    2021

Citation information

H. Kondratiuk, R. Sifa: Towards Structural Hyperparameter Search in Kernel Minimum Enclosing Balls, Learning and Intelligent Optimization (LION), 2021, https://doi.org/10.1007/978-3-030-92121-7_14