Splitting Stump Forests: Tree Ensemble Compression for Edge Devices (extended version)
We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.
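The abstract's pipeline — select forest split nodes that divide the training data roughly evenly, turn them into stump features, and fit a linear model on that representation — can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation; the helper names (`select_balanced_splits`, `stump_features`) and the balance tolerance are assumptions for demonstration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

def select_balanced_splits(forest, X, tol=0.1):
    """Collect (feature, threshold) pairs from internal nodes whose test
    sends roughly half of the training samples to each side.
    `tol` is an illustrative balance tolerance, not from the paper."""
    stumps = []
    for tree in forest.estimators_:
        t = tree.tree_
        for node in range(t.node_count):
            if t.children_left[node] == -1:  # skip leaves
                continue
            f, thr = t.feature[node], t.threshold[node]
            frac = np.mean(X[:, f] <= thr)  # fraction routed left
            if abs(frac - 0.5) <= tol:      # approximately even split
                stumps.append((f, thr))
    return stumps

def stump_features(X, stumps):
    """Map each sample to a binary vector of stump decisions."""
    return np.column_stack([(X[:, f] <= thr).astype(float) for f, thr in stumps])

# Train a forest, extract balanced splitting stumps, fit a linear model
# on the resulting binary representation.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
forest = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)
stumps = select_balanced_splits(forest, X)
Z = stump_features(X, stumps)
linear = LogisticRegression(max_iter=1000).fit(Z, y)
```

The compressed model consists only of the selected (feature, threshold) pairs plus the linear weights, which is why it can be far smaller than storing every tree in the forest.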
- Published in: Machine Learning
- Type: Article
- Authors: Alkhoury, Fouad; Buschjäger, Sebastian; Welke, Pascal
- Year: 2025
- Source: https://doi.org/10.1007/s10994-025-06866-2
Citation information: Alkhoury, F., Buschjäger, S., Welke, P.: Splitting Stump Forests: Tree Ensemble Compression for Edge Devices (extended version). Machine Learning 114(10), 219 (2025). https://doi.org/10.1007/s10994-025-06866-2
@Article{Alkhoury.etal.2025c,
author={Alkhoury, Fouad and Buschjäger, Sebastian and Welke, Pascal},
title={Splitting Stump Forests: Tree Ensemble Compression for Edge Devices (extended version)},
journal={Machine Learning},
volume={114},
number={10},
pages={219},
month={August},
url={https://doi.org/10.1007/s10994-025-06866-2},
year={2025},
abstract={We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.}}