Modern high-energy astroparticle experiments produce vast amounts of data every day in continuous high-volume streams. The First G-APD Cherenkov Telescope (FACT) aims to detect particle showers induced by gamma rays, because properties of the originating cosmic events can be derived from the energy and angle of these rays. Separating gamma rays from the background noise that is inevitably recorded is known as the gamma-hadron separation problem. Current solutions rely heavily on hand-crafted features: the established approach computes these features in a long data-processing pipeline and trains a random forest classifier for gamma-hadron separation. The overall machine learning pipeline is executed on commodity computer hardware after an event has occurred. In this paper, we propose an alternative approach that applies (Binary) Convolutional Neural Networks (B-CNNs) directly to the raw feature stream of the telescope's camera. We investigate whether these models can be executed on commodity hardware available at the telescope to handle its data stream in real time. For fully binary neural networks, we also study the use of FPGAs for inference. Our experiments show that this approach outperforms hand-crafted features and random forests by a large margin, while remaining applicable in real time for moderately sized models. Furthermore, we show that our approach works well not only on simulated data but also on real cosmic events originating in the Crab Nebula, a supernova remnant.