
An Interpretable End-to-end Fine-tuning Approach for Long Clinical Text

Published 12 Nov 2020 in cs.CL and cs.CY (arXiv:2011.06504v1)

Abstract: Unstructured clinical text in EHRs contains crucial information for applications including decision support, trial matching, and retrospective research. Recent work has applied BERT-based models to clinical information extraction and text classification, given these models' state-of-the-art performance in other NLP domains. However, BERT is difficult to apply to clinical notes because it does not scale well to long sequences of text. In this work, we propose a novel fine-tuning approach called SnipBERT. Instead of using entire notes, SnipBERT identifies crucial snippets and then feeds them into a truncated BERT-based model in a hierarchical manner. Empirically, SnipBERT not only achieves significant predictive performance gains across three tasks but also provides improved interpretability, as the model can identify the key pieces of text that led to its prediction.
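The pipeline the abstract describes — select crucial snippets from a long note, encode each one, then aggregate the snippet representations hierarchically into a note-level prediction input — can be sketched as follows. This is a minimal illustration under stated assumptions: `select_snippets`, the keyword-count scoring, and the toy `encode` function are hypothetical stand-ins; the paper's actual model uses a truncated BERT-based encoder, and its snippet-identification method may differ.

```python
import re

def select_snippets(note, keywords, top_k=2):
    """Split a note into sentence-level snippets and keep the top_k
    ranked by how many task keywords they mention.
    (Hypothetical stand-in for SnipBERT's snippet identification.)"""
    snippets = [s.strip() for s in re.split(r"[.\n]+", note) if s.strip()]
    score = lambda s: sum(k in s.lower() for k in keywords)
    return sorted(snippets, key=score, reverse=True)[:top_k]

def encode(snippet, dim=8):
    """Toy deterministic embedding (hashed character counts) standing in
    for a truncated BERT encoder applied to one short snippet."""
    vec = [0.0] * dim
    for ch in snippet:
        vec[ord(ch) % dim] += 1.0
    return vec

def hierarchical_note_vector(note, keywords):
    """Hierarchical aggregation: encode each selected snippet separately,
    then mean-pool the snippet embeddings into one note-level vector."""
    vecs = [encode(s) for s in select_snippets(note, keywords)]
    return [sum(col) / len(vecs) for col in zip(*vecs)]
```

Because only short, high-relevance snippets ever reach the encoder, the sequence-length limit of a BERT-style model is never hit, and the selected snippets double as an explanation of which text drove the prediction.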

Citations (4)
