
DiffusionBERT: Improving Generative Masked Language Models with Diffusion Models (2211.15029v2)

Published 28 Nov 2022 in cs.CL, cs.AI, and cs.LG

Abstract: We present DiffusionBERT, a new generative masked language model based on discrete diffusion models. Diffusion models and many pre-trained language models have a shared training objective, i.e., denoising, making it possible to combine the two powerful models and enjoy the best of both worlds. On the one hand, diffusion models offer a promising training strategy that helps improve the generation quality. On the other hand, pre-trained denoising language models (e.g., BERT) can be used as a good initialization that accelerates convergence. We explore training BERT to learn the reverse process of a discrete diffusion process with an absorbing state and elucidate several designs to improve it. First, we propose a new noise schedule for the forward diffusion process that controls the degree of noise added at each step based on the information of each token. Second, we investigate several designs for incorporating the time step into BERT. Experiments on unconditional text generation demonstrate that DiffusionBERT achieves significant improvement over existing diffusion models for text (e.g., D3PM and Diffusion-LM) and previous generative masked language models in terms of perplexity and BLEU score.
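
The absorbing-state forward process described in the abstract can be illustrated with a short sketch: tokens are progressively replaced by [MASK], and a token-informed schedule can absorb low-information tokens earlier. This is not the paper's implementation; the linear schedule, the information-based shaping, and all names (`MASK_ID`, `absorbing_forward`, `informed_forward`) are illustrative assumptions.

```python
import torch

MASK_ID = 103  # hypothetical [MASK] token id (103 is BERT's default vocab id, used here only for illustration)

def absorbing_forward(x0: torch.Tensor, t: int, T: int) -> torch.Tensor:
    """Absorbing-state forward process: each token of x0 is independently
    replaced by [MASK] so that, under a simple linear schedule, a token
    survives unmasked at step t with probability (1 - t / T)."""
    keep_prob = 1.0 - t / T                       # illustrative linear noise schedule
    keep = torch.rand(x0.shape) < keep_prob       # per-token Bernoulli draw
    return torch.where(keep, x0, torch.full_like(x0, MASK_ID))

def informed_forward(x0: torch.Tensor, t: int, T: int, token_info: torch.Tensor) -> torch.Tensor:
    """Token-informed variant (a rough stand-in for the paper's idea of
    conditioning the noise schedule on token information): tokens with higher
    information scores (e.g. negative log unigram frequency) keep a higher
    survival probability, so low-information tokens are absorbed earlier."""
    info = token_info / token_info.max()              # normalize scores to [0, 1]
    keep_prob = (1.0 - t / T) ** (1.0 - 0.5 * info)   # illustrative shaping of the schedule
    keep = torch.rand(x0.shape) < keep_prob
    return torch.where(keep, x0, torch.full_like(x0, MASK_ID))

# Toy usage: corrupt an 8-token "sentence" at step t = 600 of T = 1000.
x0 = torch.tensor([[2023, 2003, 1037, 7099, 6251, 2005, 1996, 5549]])
info = torch.tensor([[0.2, 0.1, 0.1, 0.9, 0.8, 0.2, 0.1, 0.7]])
print(absorbing_forward(x0, t=600, T=1000))
print(informed_forward(x0, t=600, T=1000, token_info=info))
```

In the paper's framing, the reverse process is a BERT-style model trained to predict the original tokens at the masked positions, stepping from fully masked text back to a complete sentence.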

Authors (5)
  1. Zhengfu He (10 papers)
  2. Tianxiang Sun (35 papers)
  3. Kuanning Wang (3 papers)
  4. Xuanjing Huang (287 papers)
  5. Xipeng Qiu (257 papers)
Citations (93)

