Alternating Synthetic and Real Gradients for Neural Language Modeling (1902.10630v2)

Published 27 Feb 2019 in cs.LG and cs.NE

Abstract: Training recurrent neural networks (RNNs) with backpropagation through time (BPTT) has known drawbacks, such as difficulty capturing long-term dependencies in sequences. Successful alternatives to BPTT have not yet been discovered. Recently, backpropagation with synthetic gradients produced by a decoupled neural interface (DNI) module has been proposed as a replacement for BPTT when training RNNs. On the other hand, it has been shown that the representations learned with synthetic and real gradients differ, even though the resulting networks are functionally identical. In this project, we explore ways of combining synthetic and real gradients, with application to neural language modeling tasks. Empirically, we demonstrate the effectiveness of alternating training with synthetic and real gradients after periodic warm restarts on language modeling tasks.
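The abstract describes a training schedule that alternates between two gradient sources, real gradients from truncated BPTT and synthetic gradients from a DNI module, with the learning rate re-warmed at each switch. Below is a minimal PyTorch sketch of that schedule, not the authors' implementation: the `LSTMCell` model, the single-linear-layer DNI module, the MSE toy loss, the random stand-in data, the 50-step cycle length, and the use of `CosineAnnealingWarmRestarts` to realize the "periodic warm restarts" are all illustrative assumptions.

```python
# Sketch of alternating real (BPTT) and synthetic (DNI) gradients with
# periodic warm restarts. All model sizes and schedules are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
B, D, H, CHUNK, CYCLE = 4, 16, 32, 8, 50   # toy sizes (assumptions)

rnn = nn.LSTMCell(D, H)                    # stand-in recurrent language model
head = nn.Linear(H, D)
dni = nn.Linear(H, H)                      # DNI module: predicts dL/dh from h
opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.05)
dni_opt = torch.optim.SGD(dni.parameters(), lr=1e-3)
# Warm restarts: the learning rate is re-warmed every CYCLE steps,
# i.e. exactly when the gradient source switches.
sched = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(opt, T_0=CYCLE)

def run_chunk(xs, ys, h, c):
    """Unroll the RNN over one chunk and accumulate the loss."""
    loss = torch.zeros(())
    for x, y in zip(xs, ys):
        h, c = rnn(x, (h, c))
        loss = loss + F.mse_loss(head(h), y)
    return loss, h, c

def bptt_step(xs, ys, h, c):
    """Real gradients: plain truncated BPTT within the chunk."""
    loss, h, c = run_chunk(xs, ys, h, c)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return h.detach(), c.detach()

def dni_step(xs, ys, h, c, prev_pred):
    """Synthetic gradients: inject the DNI prediction at the chunk boundary
    and train the DNI module against a bootstrapped target."""
    h = h.detach().requires_grad_(True)    # leaf, so we can read its gradient
    loss, h_T, c_T = run_chunk(xs, ys, h, c.detach())
    syn = dni(h_T.detach())                # predicted future gradient at h_T
    opt.zero_grad()
    # In-chunk (real) gradient plus synthetic cross-chunk gradient at h_T.
    torch.autograd.backward([loss, h_T], [torch.ones_like(loss), syn.detach()])
    opt.step()
    if prev_pred is not None:              # bootstrapped DNI regression target
        dni_opt.zero_grad()
        F.mse_loss(prev_pred, h.grad.detach()).backward()
        dni_opt.step()
    return h_T.detach(), c_T.detach(), syn

h, c, prev_pred = torch.zeros(B, H), torch.zeros(B, H), None
for step in range(4 * CYCLE):
    xs = [torch.randn(B, D) for _ in range(CHUNK)]   # stand-in token batches
    ys = [torch.randn(B, D) for _ in range(CHUNK)]
    if (step // CYCLE) % 2 == 0:           # alternate the gradient source
        h, c = bptt_step(xs, ys, h, c)
        prev_pred = None                   # DNI prediction is stale across phases
    else:
        h, c, prev_pred = dni_step(xs, ys, h, c, prev_pred)
    sched.step()
```

Aligning the phase switch with the warm-restart boundary (one assumption of this sketch) means each gradient source begins its phase at a high learning rate, matching the abstract's description of alternating "after periodic warm restarts."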

Authors (2)
  1. Fangxin Shang (11 papers)
  2. Hao Zhang (948 papers)
Citations (1)
