
An Interpretable End-to-end Fine-tuning Approach for Long Clinical Text (2011.06504v1)

Published 12 Nov 2020 in cs.CL and cs.CY

Abstract: Unstructured clinical text in EHRs contains crucial information for applications including decision support, trial matching, and retrospective research. Recent work has applied BERT-based models to clinical information extraction and text classification, given these models' state-of-the-art performance in other NLP domains. However, BERT is difficult to apply to clinical notes because it does not scale well to long sequences of text. In this work, we propose a novel fine-tuning approach called SnipBERT. Instead of using entire notes, SnipBERT identifies crucial snippets and then feeds them into a truncated BERT-based model in a hierarchical manner. Empirically, SnipBERT not only achieves significant predictive performance gains across three tasks but also provides improved interpretability, as the model can identify key pieces of text that led to its prediction.
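
To make the abstract's pipeline concrete, below is a minimal sketch of a SnipBERT-style approach: extract snippets from a long note, encode each with a truncated BERT, and pool the snippet vectors hierarchically into a note-level prediction. Everything here is an assumption, not the authors' code: the model name (Bio_ClinicalBERT), the keyword-window snippet extractor, the truncation depth, and the attention-pooling aggregator are all illustrative stand-ins for the components the paper describes, and the layer-slicing trick relies on HuggingFace BertModel internals.

```python
# Hypothetical sketch of a SnipBERT-style pipeline (all names, parameters,
# and design choices below are assumptions, not the paper's released code).
import re
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "emilyalsentzer/Bio_ClinicalBERT"  # assumed clinical BERT variant
SNIPPET_TOKENS = 64   # assumed max snippet length in tokens
NUM_KEPT_LAYERS = 4   # assumed truncation depth ("truncated BERT")

def extract_snippets(note: str, keywords: list[str], window: int = 30) -> list[str]:
    """Return text windows around keyword hits: a simple stand-in for the
    paper's snippet-identification step, whose exact scoring is not shown here."""
    snippets = []
    for kw in keywords:
        for m in re.finditer(re.escape(kw), note, flags=re.IGNORECASE):
            start = max(0, m.start() - window)
            end = min(len(note), m.end() + window)
            snippets.append(note[start:end])
    return snippets or [note[: window * 2]]  # fall back to the note head

class SnipClassifier(nn.Module):
    """Encode each snippet with a truncated BERT, then pool hierarchically."""
    def __init__(self, num_labels: int = 2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(MODEL_NAME)
        # Keep only the first few transformer layers; this slicing of the
        # encoder's ModuleList depends on BertModel internals (assumption).
        self.bert.encoder.layer = self.bert.encoder.layer[:NUM_KEPT_LAYERS]
        hidden = self.bert.config.hidden_size
        # Attention pooling over snippet embeddings; the per-snippet weights
        # double as a crude interpretability signal (which snippets mattered).
        self.attn = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # input_ids: (num_snippets, seq_len) for a single note
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        snip_vecs = out.last_hidden_state[:, 0]          # [CLS] per snippet
        weights = torch.softmax(self.attn(snip_vecs), dim=0)
        note_vec = (weights * snip_vecs).sum(dim=0)      # note-level vector
        return self.head(note_vec), weights.squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
note = "Patient denies chest pain. History of atrial fibrillation on warfarin."
snips = extract_snippets(note, keywords=["chest pain", "warfarin"])
batch = tokenizer(snips, padding=True, truncation=True,
                  max_length=SNIPPET_TOKENS, return_tensors="pt")
model = SnipClassifier(num_labels=2)
logits, snippet_weights = model(batch["input_ids"], batch["attention_mask"])
```

Under these assumptions, `snippet_weights` can be inspected to see which snippets the pooled prediction leaned on, mirroring the interpretability benefit the abstract claims for identifying key pieces of text.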

Citations (4)
