What problems are you currently working on?
- Dynamic generative language models, i.e., language models that leverage representations from temporal point process models to capture how text such as reviews or news evolves over time
- Learning both discrete and continuous representations for generative models of text using mutual information metrics
- Neural network models for switching dynamical systems
- Generative neural network models for point processes and queueing systems
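As background for the point process work above, the classical Hawkes process is the standard self-exciting baseline that neural point process models generalize by replacing its fixed parametric intensity with a learned function of the event history. A minimal sketch (the parameter values are illustrative assumptions, not from this document):

```python
import numpy as np

def hawkes_intensity(t, history, mu=0.5, alpha=0.8, beta=1.0):
    """Conditional intensity of a Hawkes process with exponential kernel:

        lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))

    mu is the base rate; each past event at t_i adds an exponentially
    decaying bump. Neural point process models learn this mapping from
    history to intensity instead of fixing its form. Parameter values
    here are placeholders for illustration.
    """
    past = np.asarray([t_i for t_i in history if t_i < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

# Intensity just after a burst of three events is elevated above mu.
events = [0.2, 1.0, 1.5]
print(hawkes_intensity(2.0, events))  # well above the base rate 0.5
print(hawkes_intensity(1.0, []))     # no history: equals mu = 0.5
```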
What are you particularly interested in?
I’m interested in using Bayesian nonparametric processes as prior distributions in deep generative models of text, with the goal of representation learning for text summarization and controllable text generation.
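One common way to use a Bayesian nonparametric process as a prior, as mentioned above, is the stick-breaking construction of the Dirichlet process, whose weights could gate discrete latent components (e.g., topics) in a deep generative model of text. A minimal sketch under that assumption; the truncation level `k` and concentration `alpha` are illustrative choices:

```python
import numpy as np

def stick_breaking(alpha, k, rng):
    """Truncated stick-breaking weights for a Dirichlet process prior.

    Draw beta_j ~ Beta(1, alpha) and set
        w_j = beta_j * prod_{i < j} (1 - beta_i),
    i.e., break off a beta_j fraction of the remaining stick at each
    step. Truncating at k components is a standard approximation for
    variational inference; k and alpha here are illustrative.
    """
    betas = rng.beta(1.0, alpha, size=k)
    # Length of stick remaining before each break: 1, (1-b_0), (1-b_0)(1-b_1), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking(alpha=2.0, k=10, rng=rng)
print(w)          # 10 positive weights
print(w.sum())    # sums to just under 1 (mass lost to truncation)
```

Smaller `alpha` concentrates mass on the first few sticks, which is what makes the construction attractive as a prior over an effectively unbounded number of discrete representations.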