How Much User Context Do We Need? Privacy by Design in Mental Health NLP Applications

Clinical NLP tasks, such as mental health assessment from text, must take social constraints into account: performance maximization must be constrained by the utmost importance of guaranteeing the privacy of user data. Consumer protection regulations, such as the GDPR, generally handle privacy by restricting data availability, for example by requiring that user data be limited to ‘what is necessary’ for a given purpose. In this work, we argue that providing stricter formal privacy guarantees while increasing the volume of user data in the model in most cases increases the benefit for all parties involved, especially for the user. We demonstrate our arguments on two existing suicide risk assessment datasets of Twitter and Reddit posts. We present the first analysis juxtaposing user history length and differential privacy budgets, and elaborate how modeling additional user context enables utility preservation while maintaining acceptable user privacy guarantees.
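To illustrate what a differential privacy budget controls (a generic sketch, not code from the paper), the classic Laplace mechanism shows how shrinking the budget ε inflates the noise added to any statistic released about a user. This noise is the utility cost that, as the paper argues, additional user context can help offset. The function name and the example risk score are hypothetical:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release a value with epsilon-differential privacy via Laplace noise.

    The noise scale is sensitivity / epsilon, so a smaller privacy budget
    (stronger guarantee) means proportionally larger noise.
    """
    scale = sensitivity / epsilon
    return value + rng.laplace(0.0, scale)

rng = np.random.default_rng(0)
true_score = 0.7  # hypothetical per-user risk score in [0, 1], sensitivity 1

for eps in (0.1, 1.0, 10.0):
    samples = [laplace_mechanism(true_score, 1.0, eps, rng) for _ in range(2000)]
    # Std. dev. of the released value is sqrt(2)/eps: tight budgets are noisy.
    print(f"epsilon={eps:>4}: empirical std = {np.std(samples):.3f}")
```

The trade-off the paper studies is visible directly: at ε = 0.1 the released score is dominated by noise, while at ε = 10 it is nearly exact but offers a far weaker guarantee.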

  • Published in:
    Proceedings of the International AAAI Conference on Web and Social Media
  • Type:
    Inproceedings
  • Authors:
    Sawhney, Ramit; Neerkaje, Atula; Habernal, Ivan; Flek, Lucie
  • Year:
    2023

Citation information

Sawhney, Ramit; Neerkaje, Atula; Habernal, Ivan; Flek, Lucie: How Much User Context Do We Need? Privacy by Design in Mental Health NLP Applications. In: Proceedings of the International AAAI Conference on Web and Social Media, 2023, pp. 766–776. https://ojs.aaai.org/index.php/ICWSM/article/view/22186

Associated Lamarr Researchers

Prof. Dr. Lucie Flek


Area Chair NLP