Computing Divergences between Discrete Decomposable Models

Many applications, including in machine learning, benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions on the structure or independencies within these distributions, computing the divergence between them is intractable in high dimensions. We show that it is possible to compute a wide family of functionals and divergences, such as the alpha-beta divergence, between two decomposable models, i.e. chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence is a family of divergences that includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence. Thus, we can compute the exact value of any divergence in this broad class to the extent to which we can accurately model the two distributions using decomposable models.
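For context, the alpha-beta divergence mentioned in the abstract is commonly defined (following Cichocki, Cruces, and Amari) by a two-parameter sum over the state space, valid for alpha, beta, and alpha + beta all nonzero. The sketch below only illustrates that definition by brute force over a tiny discrete state space; it is not the paper's treewidth-based algorithm for decomposable models, and the function name and example values are hypothetical.

```python
import numpy as np

def alpha_beta_divergence(p, q, alpha, beta):
    """Alpha-beta divergence between two discrete distributions p and q,
    computed by brute force over all states.

    Assumes the common definition (Cichocki et al.) and requires
    alpha != 0, beta != 0, and alpha + beta != 0; the limiting cases
    (e.g. the KL divergence) need separate limit formulas.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    s = alpha + beta
    term = (p ** alpha * q ** beta
            - (alpha / s) * p ** s
            - (beta / s) * q ** s)
    return -term.sum() / (alpha * beta)

# Hypothetical example: two distributions over three states.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(alpha_beta_divergence(p, q, alpha=1.0, beta=0.5))
```

In high dimensions the state space is exponentially large, so this direct summation is infeasible; the paper's contribution is to evaluate such divergences between two decomposable models without enumerating the full state space, paying only a cost exponential in the treewidth.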

  • Published in:
    AAAI Conference on Artificial Intelligence
  • Type:
    Inproceedings
  • Authors:
    Lee, Loong Kuan; Piatkowski, Nico; Petitjean, Francois; Webb, Geoffrey I.
  • Year:
    2023

Citation information

Lee, Loong Kuan; Piatkowski, Nico; Petitjean, Francois; Webb, Geoffrey I.: Computing Divergences between Discrete Decomposable Models. AAAI Conference on Artificial Intelligence, 2023. https://arxiv.org/abs/2112.04583

Associated Lamarr Researchers


Dr. Nico Piatkowski
