The Price of Hierarchical Clustering

Hierarchical clustering is a popular tool for understanding the hereditary properties of a data set. Such a clustering is in fact a sequence of clusterings: it starts with the trivial clustering in which every data point forms its own cluster and then successively merges two existing clusters until all points lie in the same cluster. A hierarchical clustering achieves an approximation factor of $\alpha$ if the cost of each k-clustering in the hierarchy is at most $\alpha$ times the cost of an optimal k-clustering. We study as cost functions the maximum (discrete) radius of any cluster (k-center problem) and the maximum diameter of any cluster (k-diameter problem). In general, the optimal clusterings do not form a hierarchy, so an approximation factor of 1 cannot be achieved. We call the smallest approximation factor that can be achieved for every instance the price of hierarchy. For the k-diameter problem we improve the upper bound on the price of hierarchy to $3+2\sqrt{2}\approx 5.83$. Moreover, we significantly improve the lower bounds for k-center and k-diameter, proving a price of hierarchy of exactly 4 and $3+2\sqrt{2}$, respectively.
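To make the setting concrete, here is a minimal sketch (not the paper's algorithm, just standard complete-linkage merging) of how a hierarchy of clusterings arises and how the k-diameter cost of each level in the hierarchy is measured. The function names and the greedy merge rule are illustrative assumptions.

```python
from itertools import combinations

def diameter(cluster, dist):
    # Maximum pairwise distance inside a cluster (0 for singletons).
    return max((dist(a, b) for a, b in combinations(cluster, 2)), default=0.0)

def hierarchy_costs(points, dist):
    """Build a hierarchy by repeatedly merging the pair of clusters whose
    union has the smallest diameter (complete linkage), and record the
    max-diameter cost of the k-clustering obtained for each k."""
    clusters = [[p] for p in points]
    costs = {len(clusters): 0.0}  # the trivial clustering has cost 0
    while len(clusters) > 1:
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: diameter(clusters[ij[0]] + clusters[ij[1]], dist))
        merged = clusters[i] + clusters[j]
        clusters = [c for t, c in enumerate(clusters) if t not in (i, j)] + [merged]
        costs[len(clusters)] = max(diameter(c, dist) for c in clusters)
    return costs

# Four points on a line; the hierarchy must commit to merges early,
# which is why its k-clusterings can be worse than the optimal ones.
print(hierarchy_costs([0.0, 1.0, 5.0, 6.0], lambda a, b: abs(a - b)))
# → {4: 0.0, 3: 1.0, 2: 1.0, 1: 6.0}
```

The price of hierarchy asks how large the ratio costs[k] / opt(k) must be in the worst case, over all k, for the best possible hierarchy rather than for this greedy one.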

  • Published in:
    European Symposium on Algorithms (ESA)
  • Type:
  • Authors:
    Arutyunova, Anna; Röglin, Heiko
  • Year:
    2022

Citation information

Arutyunova, Anna; Röglin, Heiko: The Price of Hierarchical Clustering, European Symposium on Algorithms, 2022, Arutyunova.Roeglin.2022a

Associated Lamarr Researchers


Prof. Dr. Heiko Röglin

Principal Investigator, Resource-aware ML