Distributed Generative Modelling with Sub-Linear Communication Overhead

Pushing machine learning towards the edge often implies a restriction to ultra-low-power (ULP) devices with rather limited compute capabilities. Generative models estimate the data-generating probability mass P∗, which can in turn be used for various tasks, including simulation, prediction/forecasting, and novelty detection. Whenever the actual learning task is unknown at learning time, or the task is allowed to change over time, learning a generative model is the only viable option. However, learning such models on resource-constrained systems raises several challenges. Recent advances in exponential family learning allow us to estimate sophisticated models on highly resource-constrained systems. Nevertheless, the setting in which the training data is distributed among several devices in a network with presumably high communication costs has not yet been investigated. We close this gap by deriving and exploiting a new property of integer models. More precisely, we present a model averaging scheme whose communication complexity is sub-linear w.r.t. the parameter dimension d, and provide an upper bound on the global loss. Experimental results on benchmark data show that the aggregated model is often on par with the non-distributed global model.
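The paper's exact averaging rule and encoding are not reproduced here; the following Python sketch only illustrates the general intuition behind sub-linear communication with sparse integer models: each device transmits just the non-zero coordinates of its integer parameter vector as (index, value) pairs, and a coordinator averages the received models. The function names (`sparse_encode`, `aggregate`) and the sparsity assumption are illustrative, not taken from the paper.

```python
# Illustrative sketch (not the paper's protocol): k devices hold integer
# parameter vectors theta_i in Z^d. Each device sends only its non-zero
# entries; a coordinator averages them. If the local models are sparse,
# the number of transmitted pairs is much smaller than k * d.
import numpy as np

def sparse_encode(theta):
    """Encode an integer parameter vector as (index, value) pairs."""
    idx = np.flatnonzero(theta)
    return list(zip(idx.tolist(), theta[idx].tolist()))

def aggregate(messages, d, n_devices):
    """Average the sparsely encoded local models into one global model."""
    agg = np.zeros(d)
    for pairs in messages:            # one message per device
        for i, v in pairs:
            agg[i] += v
    return agg / n_devices            # plain model averaging

# Toy usage: 3 devices, d = 10, sparse integer parameters.
rng = np.random.default_rng(0)
d, k = 10, 3
local_models = [rng.integers(-2, 3, size=d) * (rng.random(d) < 0.3)
                for _ in range(k)]
messages = [sparse_encode(t) for t in local_models]
theta_avg = aggregate(messages, d, k)
print("transmitted pairs:", sum(len(m) for m in messages), "of", k * d)
print("averaged model:", theta_avg)
```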

  • Published in:
    Machine Learning and Knowledge Discovery in Databases, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2019)
  • Type:
    Inproceedings
  • Authors:
    N. Piatkowski
  • Year:
    2019

Citation information

N. Piatkowski: Distributed Generative Modelling with Sub-Linear Communication Overhead. In: Machine Learning and Knowledge Discovery in Databases, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2019), 2019. https://doi.org/10.1007/978-3-030-43823-4_24