Language-Based Deployment Optimization for Random Forests (Invited Paper)
The rising popularity of resource-efficient machine learning has made random forests and decision trees prominent models in recent years. Naturally, these models are tuned, optimized, and transformed to achieve minimal resource consumption. A subset of these strategies targets the model structure and model logic and therefore induces a trade-off between resource efficiency and prediction performance. An orthogonal set of approaches targets hardware-specific optimizations, which can improve performance without changing the behavior of the model. Since such hardware-specific optimizations are usually hardware-dependent and inflexible in their realization, this paper envisions a more general application of such optimization strategies at the level of programming languages. We therefore first discuss a set of suitable optimization strategies in general and then envision their application in LLVM IR, i.e., a flexible and hardware-independent ecosystem.
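The paper itself is not reproduced here, but a minimal sketch can illustrate the kind of language-level deployment optimization the abstract alludes to: the same decision tree deployed once as a generic array-based traversal and once unrolled into straight-line branches that a compiler backend (e.g., via LLVM IR) can optimize for the target hardware. The node layout, the toy tree, and the function names `predict_traversal` and `predict_native` are illustrative assumptions, not code from the paper.

```c
/* Hedged illustration (not from the paper): one small decision tree,
 * deployed in two ways. */
#include <stdio.h>

/* Array-based node layout for the generic traversal. */
typedef struct {
    int   feature;     /* feature index to compare; -1 marks a leaf */
    float threshold;   /* split threshold                           */
    int   left, right; /* indices of the child nodes                */
    int   label;       /* predicted class if this node is a leaf    */
} Node;

static const Node tree[] = {
    { 0, 0.5f, 1, 2, -1},  /* node 0: x[0] <= 0.5 ? node 1 : node 2 */
    {-1, 0.0f, 0, 0,  0},  /* node 1: leaf, class 0                 */
    {-1, 0.0f, 0, 0,  1},  /* node 2: leaf, class 1                 */
};

/* Generic deployment: follow child indices until a leaf is reached. */
int predict_traversal(const float *x) {
    int i = 0;
    while (tree[i].feature >= 0)
        i = (x[tree[i].feature] <= tree[i].threshold) ? tree[i].left
                                                      : tree[i].right;
    return tree[i].label;
}

/* "Native" deployment: the same tree unrolled into branches, exposing
 * its structure to compiler-level (hardware-independent) optimization. */
int predict_native(const float *x) {
    if (x[0] <= 0.5f) return 0;
    return 1;
}

int main(void) {
    float sample[1] = {0.3f};
    printf("%d %d\n", predict_traversal(sample), predict_native(sample));
    return 0;
}
```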
- Published in: ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems
- Type: Inproceedings
- Authors: Malcher, Jannik; Biebert, Daniel; Chen, Kuan-Hsun; Buschjäger, Sebastian; Hakert, Christian; Chen, Jian-Jia
- Year: 2024
Citation information
Malcher, Jannik; Biebert, Daniel; Chen, Kuan-Hsun; Buschjäger, Sebastian; Hakert, Christian; Chen, Jian-Jia: Language-Based Deployment Optimization for Random Forests (Invited Paper), ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems, June 2024.
@Inproceedings{Malcher.etal.2024a,
  author    = {Malcher, Jannik and Biebert, Daniel and Chen, Kuan-Hsun and Buschjäger, Sebastian and Hakert, Christian and Chen, Jian-Jia},
  title     = {Language-Based Deployment Optimization for Random Forests (Invited Paper)},
  booktitle = {ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems},
  month     = {June},
  year      = {2024},
  abstract  = {Arising popularity for resource-efficient machine learning models makes random forests and decision trees famous models in recent years. Naturally, these models are tuned, optimized, and transformed to feature maximally low-resource consumption. A subset of these strategies targets the model structure and model logic and therefore induces a trade-off between resource-efficiency and prediction...}
}