
Aging Memories Generate More Fluent Dialogue Responses with Memory Augmented Neural Networks (1911.08522v2)

Published 19 Nov 2019 in cs.CL, cs.AI, and cs.LG

Abstract: Memory Networks have emerged as effective models for incorporating Knowledge Bases (KBs) into neural networks. By storing KB embeddings in a memory component, these models can learn meaningful representations that are grounded in external knowledge. However, as the memory unit fills up, the oldest memories are replaced by newer representations. In this paper, we question this approach and provide experimental evidence that conventional Memory Networks store highly correlated vectors during training. While increasing the memory size mitigates this problem, it also leads to overfitting, as the memory stores a large number of training latent representations. To address these issues, we propose a novel regularization mechanism named memory dropout, which 1) samples a single latent vector from the distribution of redundant memories, and 2) ages redundant memories, thus increasing the probability that they are overwritten during training. This fully differentiable technique allows us to achieve state-of-the-art response generation on the Stanford Multi-Turn Dialogue and Cambridge Restaurant datasets.
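
The abstract describes memory dropout's write-time behavior but not its implementation. The sketch below is one plausible reading, not the paper's method: redundant slots are detected by a similarity threshold, a single survivor is sampled from a softmax over their similarities, and the remaining redundant slots are aged so the usual oldest-first eviction removes them sooner. The class name, similarity threshold, and aging penalty are all assumptions; note also that the paper's mechanism is fully differentiable, whereas this toy version uses a hard threshold and discrete sampling for clarity.

```python
import numpy as np


class MemoryWithDropout:
    """Toy key memory illustrating a memory-dropout-style write rule.

    Hypothetical sketch based only on the abstract; names, the
    similarity threshold, and the aging penalty are assumptions.
    """

    def __init__(self, num_slots=128, dim=16, sim_threshold=0.9,
                 aging_penalty=10.0, seed=0):
        self.rng = np.random.default_rng(seed)
        self.keys = self._normalize(self.rng.normal(size=(num_slots, dim)))
        self.age = np.zeros(num_slots)
        self.sim_threshold = sim_threshold   # assumed redundancy cutoff
        self.aging_penalty = aging_penalty   # assumed extra aging for losers

    @staticmethod
    def _normalize(x):
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)

    def write(self, query):
        q = self._normalize(np.asarray(query, dtype=float))
        sims = self.keys @ q                 # cosine similarity (unit keys)
        redundant = np.flatnonzero(sims > self.sim_threshold)
        self.age += 1.0                      # every slot ages each write
        if redundant.size > 0:
            # Sample one survivor from the redundant group, weighted by
            # a softmax over similarities to the incoming vector.
            logits = sims[redundant] - sims[redundant].max()
            p = np.exp(logits) / np.exp(logits).sum()
            survivor = self.rng.choice(redundant, p=p)
            # Age the other redundant slots so the standard oldest-first
            # eviction policy overwrites them sooner.
            losers = redundant[redundant != survivor]
            self.age[losers] += self.aging_penalty
            self.keys[survivor] = self._normalize(self.keys[survivor] + q)
            self.age[survivor] = 0.0
        else:
            # No redundancy: conventional behavior, overwrite oldest slot.
            oldest = int(np.argmax(self.age))
            self.keys[oldest] = q
            self.age[oldest] = 0.0
```

Under this reading, repeated writes of highly correlated vectors collapse into a single refreshed slot instead of crowding the memory, while their aged duplicates become the first candidates for eviction, which is the redundancy-reduction effect the abstract attributes to memory dropout.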

Citations (1)
