A memory enhanced LSTM for modeling complex temporal dependencies (1910.12388v1)

Published 25 Oct 2019 in cs.LG and stat.ML

Abstract: In this paper, we present Gamma-LSTM, an enhanced long short-term memory (LSTM) unit that enables learning of hierarchical representations through multiple stages of temporal abstraction. The Gamma memory, a hierarchical memory unit, forms the central memory of Gamma-LSTM, with gates regulating the flow of information into the various levels of the hierarchy; this gives the unit control over which level of hierarchy processes the input at a given instant of time. We demonstrate better performance of the Gamma-LSTM model over regular and stacked LSTMs in two settings (pixel-by-pixel MNIST digit classification and natural language inference), placing emphasis on the ability to generalize over long sequences.
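
The abstract does not give the exact update equations, so the sketch below is only an illustrative assumption of how a gamma-memory-augmented LSTM cell might look: a standard LSTM cell whose cell state feeds a cascade of leaky-integrator ("gamma") stages, with a learned gate softly selecting the level of temporal abstraction used for the output. The class name `GammaLSTMCellSketch`, the number of stages, and the mixing scheme are all hypothetical, not the authors' implementation.

```python
# Minimal, hedged sketch of a gamma-memory-style LSTM cell (assumed structure,
# not the paper's exact formulation).
import torch
import torch.nn as nn


class GammaLSTMCellSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, num_stages: int = 3):
        super().__init__()
        self.hidden_size = hidden_size
        self.num_stages = num_stages
        # Standard LSTM gate pre-activations (input, forget, output, candidate).
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Per-stage leak rates mu_k in (0, 1) for the gamma cascade (learned).
        self.mu = nn.Parameter(torch.zeros(num_stages, hidden_size))
        # Gate that softly selects which hierarchy level drives the output.
        self.level_gate = nn.Linear(input_size + hidden_size, num_stages)

    def forward(self, x, state):
        h, c, gamma = state  # gamma: (num_stages, batch, hidden_size)
        z = torch.cat([x, h], dim=-1)
        i, f, o, g = self.gates(z).chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)

        # Gamma cascade: each stage is a leaky integrator fed by the stage
        # below it, yielding progressively longer temporal abstractions.
        new_gamma, prev = [], c
        for k in range(self.num_stages):
            mu_k = torch.sigmoid(self.mu[k])
            stage = (1.0 - mu_k) * gamma[k] + mu_k * prev
            new_gamma.append(stage)
            prev = stage
        gamma = torch.stack(new_gamma, dim=0)

        # Soft selection over hierarchy levels for the hidden state.
        weights = torch.softmax(self.level_gate(z), dim=-1)  # (batch, num_stages)
        mixed = torch.einsum("bk,kbh->bh", weights, gamma)
        h = o * torch.tanh(mixed)
        return h, (h, c, gamma)
```

To use the sketch, initialize `h`, `c` as zero tensors of shape (batch, hidden_size) and `gamma` as zeros of shape (num_stages, batch, hidden_size), then call the cell once per time step, as with any recurrent cell.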

