Decision Snippet Features

Decision trees excel at the interpretability of their predictions. To achieve the required prediction accuracy, however, large ensembles of decision trees — random forests — are often used, which reduces interpretability due to their sheer size. Additionally, their size slows down inference on modern hardware and restricts their applicability in low-memory embedded devices. We introduce "Decision Snippet Features", which are obtained from small subtrees that appear frequently in trained random forests. We subsequently show that linear models on top of these features achieve comparable, and sometimes even better, predictive performance than the original random forest, while reducing the model size by up to two orders of magnitude.
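The abstract's pipeline — extract small recurring tree structures from a trained forest, encode each sample by how those structures route it, then fit a linear model on the encoding — can be illustrated with a simplified sketch. This is not the authors' frequent-subtree mining algorithm: as a stand-in for mined snippets, shallow trees from a depth-limited forest are used, and each sample is encoded by the one-hot leaf it reaches in every tree. All dataset and parameter choices below are illustrative assumptions.

```python
# Simplified sketch (NOT the paper's algorithm): shallow forest trees stand in
# for mined "decision snippets"; a linear model is fit on their leaf encodings.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset choice
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Depth-limited trees play the role of small, frequently occurring subtrees.
forest = RandomForestClassifier(n_estimators=20, max_depth=3, random_state=0)
forest.fit(X_tr, y_tr)

# apply() returns, per sample, the index of the leaf reached in each tree;
# one-hot encoding these indices yields binary "snippet-style" features.
enc = OneHotEncoder(handle_unknown="ignore")
F_tr = enc.fit_transform(forest.apply(X_tr))
F_te = enc.transform(forest.apply(X_te))

# Linear model on top of the snippet-style features.
clf = LogisticRegression(max_iter=1000)
clf.fit(F_tr, y_tr)
print(round(clf.score(F_te, y_te), 3))
```

The compression argument is visible even in this toy version: once the linear model is trained, only the small subtrees and one weight per leaf need to be stored, rather than the full-depth forest.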

  • Published in:
    2020 25th International Conference on Pattern Recognition (ICPR)
  • Type:
    Inproceedings
  • Authors:
    P. Welke, F. Alkhoury, C. Bauckhage, S. Wrobel
  • Year:
    2021

Citation information

P. Welke, F. Alkhoury, C. Bauckhage, S. Wrobel: Decision Snippet Features. In: 2020 25th International Conference on Pattern Recognition (ICPR), 2021, pp. 4260-4267. https://doi.org/10.1109/ICPR48806.2021.9412025