Dynamic Evaluation of Transformer Language Models (1904.08378v1)
Published 17 Apr 2019 in cs.LG, cs.NE, and stat.ML
Abstract: This research note combines two methods that have recently improved the state of the art in language modelling: Transformers and dynamic evaluation. Transformers use stacked layers of self-attention that allow them to capture long-range dependencies in sequential data. Dynamic evaluation fits models to the recent sequence history, allowing them to assign higher probabilities to re-occurring sequential patterns. By applying dynamic evaluation to Transformer-XL models, we improve the state of the art on enwik8 from 0.99 to 0.94 bits/char, text8 from 1.08 to 1.04 bits/char, and WikiText-103 from 18.3 to 16.4 perplexity points.
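The core idea of dynamic evaluation is to keep updating the model's parameters on each segment of the evaluation sequence after scoring it, so that recently seen patterns become more probable. Below is a minimal, hedged sketch in PyTorch of a plain-SGD variant of this idea; the function name `dynamic_eval`, the segment length, and the learning rate are illustrative assumptions, and the paper itself applies dynamic evaluation to Transformer-XL models with more elaborate update rules than shown here.

```python
# Sketch of dynamic evaluation with plain SGD, assuming a PyTorch language
# model `model` that maps a (batch, time) tensor of token ids to
# (batch, time, vocab) next-token logits. This is an illustrative variant,
# not the authors' exact Transformer-XL setup or update rule.
import torch
import torch.nn.functional as F


def dynamic_eval(model, tokens, segment_len=128, lr=1e-4):
    """Score `tokens` (1-D LongTensor) segment by segment, adapting the
    model to each segment after it has been scored."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    total_nll, total_count = 0.0, 0
    n = tokens.size(0)

    for start in range(0, n - 1, segment_len):
        end = min(start + segment_len, n - 1)
        inp = tokens[start:end].unsqueeze(0)          # inputs  (1, T)
        tgt = tokens[start + 1:end + 1].unsqueeze(0)  # targets (1, T)

        logits = model(inp)
        loss = F.cross_entropy(logits.view(-1, logits.size(-1)), tgt.view(-1))

        # Score the segment with the *current* parameters...
        total_nll += loss.item() * tgt.numel()
        total_count += tgt.numel()

        # ...then update the parameters on this segment before moving on,
        # so the model is adapted to the recent sequence history.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return total_nll / total_count  # average NLL per token (nats)
```

Loss is accumulated before the gradient step so that each segment is evaluated with parameters that have not yet seen it, which is what keeps the procedure a valid evaluation rather than training on the test data twice.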