Prediction-Constrained Topic Models for Antidepressant Recommendation (1712.00499v1)

Published 1 Dec 2017 in cs.LG and stat.ML

Abstract: Supervisory signals can help topic models discover low-dimensional data representations that are more interpretable for clinical tasks. We propose a framework for training supervised latent Dirichlet allocation that balances two goals: faithful generative explanations of high-dimensional data and accurate prediction of associated class labels. Existing approaches fail to balance these goals by not properly handling a fundamental asymmetry: the intended task is always predicting labels from data, not data from labels. Our new prediction-constrained objective trains models that predict labels from heldout data well while also producing good generative likelihoods and interpretable topic-word parameters. In a case study on predicting depression medications from electronic health records, we demonstrate improved recommendations compared to previous supervised topic models and high-dimensional logistic regression from words alone.
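As a rough sketch of the idea described in the abstract (notation is ours, not quoted from the paper), the prediction-constrained objective trains topic parameters \phi and prediction weights \eta to explain the documents x_d well while constraining the loss of predicting each label y_d from its document:

\min_{\phi,\eta} \; -\sum_d \log p(x_d \mid \phi)
\quad \text{subject to} \quad
\sum_d \mathrm{loss}\big(y_d, \hat{y}(x_d; \phi, \eta)\big) \le \epsilon,

which is typically optimized in the equivalent penalized form

\min_{\phi,\eta} \; -\sum_d \log p(x_d \mid \phi) \;+\; \lambda \sum_d \mathrm{loss}\big(y_d, \hat{y}(x_d; \phi, \eta)\big).

The label y_d enters only through the constraint (or its penalty term), which is how this formulation encodes the labels-from-data asymmetry that the abstract emphasizes, rather than modeling data and labels symmetrically.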

Authors (7)
  1. Michael C. Hughes (39 papers)
  2. Gabriel Hope (4 papers)
  3. Leah Weiner (3 papers)
  4. Thomas H. McCoy (2 papers)
  5. Roy H. Perlis (4 papers)
  6. Erik B. Sudderth (18 papers)
  7. Finale Doshi-Velez (134 papers)
Citations (9)
