
P$^3$LM: Probabilistically Permuted Prophet Language Modeling for Generative Pre-Training (2210.12339v1)

Published 22 Oct 2022 in cs.CL

Abstract: Conventional autoregressive left-to-right (L2R) sequence generation faces two issues during decoding: it is limited to unidirectional target sequence modeling, and it is constrained by strong local dependencies. To address these problems, we propose P$^3$LM, a probabilistically permuted prophet language model, which strengthens the modeling of bidirectional information and long-distance token dependencies for sequence generation. Specifically, P$^3$LM learns to generate tokens in permuted order with an order-aware transformer decoder, and to generate the corresponding future $N$ tokens with a multi-stream attention mechanism. Extensive experiments are conducted on the GLGE benchmark, which includes four datasets for summarization, two for question generation, one for conversational question answering, and one for dialog response generation; P$^3$LM achieves state-of-the-art results compared with strong publicly available generative pre-training methods.
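
As a rough illustration of the two ingredients in the abstract, the sketch below combines a sampled generation order with an $N$-step-ahead (prophet) prediction loss. This is a minimal sketch, not the authors' implementation: the paper's model uses an order-aware transformer decoder with multi-stream attention, whereas here the decoder is abstracted into precomputed per-step states and $N$ separate projection heads, and all names (`p3lm_loss`, `step_states`, `heads`) are hypothetical.

```python
# Minimal sketch of a permuted prophet training loss (hypothetical code,
# not the P^3LM authors' implementation). Multi-stream attention is
# abstracted into N separate prediction heads over per-step states.
import torch
import torch.nn.functional as F

def p3lm_loss(step_states, targets, heads, N=2):
    """Permuted prophet loss for one target sequence.

    step_states: (T, d) decoder state at each generation step, i.e. the
                 state after consuming the first `step` tokens of the
                 permuted order
    targets:     (T,) gold token ids in the original left-to-right order
    heads:       N projection layers; heads[j] scores the token emitted
                 j steps ahead in the permuted order
    """
    T = targets.size(0)
    order = torch.randperm(T)          # sample one generation order
    loss, terms = 0.0, 0
    for step in range(T):
        for j in range(N):             # prophet: also predict future tokens
            if step + j >= T:
                break
            pos = order[step + j]      # position emitted j steps later
            logits = heads[j](step_states[step])
            loss = loss + F.cross_entropy(logits.unsqueeze(0),
                                          targets[pos].unsqueeze(0))
            terms += 1
    return loss / terms

# Toy usage with random states and targets (shapes only; no real model).
T, d, V, N = 8, 16, 100, 2
states = torch.randn(T, d)
tgt = torch.randint(0, V, (T,))
heads = [torch.nn.Linear(d, V) for _ in range(N)]
print(p3lm_loss(states, tgt, heads, N=N))
```

Averaging over all prediction terms treats the next-token and future-token losses uniformly; a weighted combination is an equally plausible choice under these assumptions.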

Citations (1)
