Towards Automated Analysis of Gaze Behavior from Consumer VR Devices for Neurological Diagnosis

Recent studies have demonstrated that eye tracking is a valuable tool in the detection, classification, and staging of neurodegenerative diseases such as Parkinson's disease (PD). However, traditional methods for capturing gaze data often rely on expensive and non-engaging clinical equipment such as video-oculography, limiting their accessibility and scalability. In this work, we investigate the feasibility of using eye-tracking data collected via consumer-grade virtual reality (VR) headsets to support neurological diagnostics in a more accessible and user-friendly manner.

This approach enables large-scale, low-cost, and remote assessments, which are particularly valuable for the early detection and monitoring of neurodegenerative conditions. We show that relevant oculomotor features extracted from VR-based eye tracking can be used for predictive assessment. Despite the inherent noise and lower precision of consumer devices, careful preprocessing and robust feature engineering, including deep learning embeddings, mitigate these limitations. Our results demonstrate that both handcrafted and learned features from gaze behavior enable promising levels of classification performance. This research represents an important step towards scalable, automated, and accessible diagnostic tools for neurodegenerative diseases using ubiquitous VR technology.

Citation information

Schmitz, Lio; Plack, Markus; Koyak, Berkan; Ullah, Muhammad Ehsan; Aziz, Ahmad; Klein, Reinhard; Lähner, Zorah; Dröge, Hannah: Towards Automated Analysis of Gaze Behavior from Consumer VR Devices for Neurological Diagnosis. Pacific Symposium on Biocomputing (PSB), 2026. https://psb.stanford.edu/psb-online/proceedings/psb26/14631_9789819824748_master.pdf