Theories of synaptic memory consolidation and intelligent plasticity for continual learning (2405.16922v2)
Abstract: Humans and animals learn throughout life. Such continual learning is crucial for intelligence. In this chapter, we examine the pivotal role plasticity mechanisms with complex internal synaptic dynamics could play in enabling this ability in neural networks. By surveying theoretical research, we highlight two fundamental enablers for continual learning. First, synaptic plasticity mechanisms must maintain and evolve an internal state over several behaviorally relevant timescales. Second, plasticity algorithms must leverage the internal state to intelligently regulate plasticity at individual synapses to facilitate the seamless integration of new memories while avoiding detrimental interference with existing ones. Our chapter covers successful applications of these principles to deep neural networks and underscores the significance of synaptic metaplasticity in sustaining continual learning capabilities. Finally, we outline avenues for further research to understand the brain's superb continual learning abilities and harness similar mechanisms for artificial intelligence systems.
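The two enablers named in the abstract — a slow internal synaptic state evolving over multiple timescales, and plasticity that is modulated per synapse by that state — can be illustrated with a minimal sketch. This is not the chapter's actual model; the class, variable names, and constants below are illustrative assumptions in the general spirit of metaplasticity-based consolidation.

```python
class MetaplasticSynapse:
    """Toy synapse whose plasticity is throttled by a slow internal state.

    Assumed for illustration only: the hidden variable ``m`` accumulates
    recent update magnitudes on a slower timescale than the weight ``w``,
    and large ``m`` (a "consolidated" synapse) attenuates further change,
    protecting old memories from interference by new learning.
    """

    def __init__(self, w=0.0, eta=0.1, tau=0.05):
        self.w = w      # fast weight: the expressed synaptic efficacy
        self.m = 0.0    # slow internal state: accumulated importance
        self.eta = eta  # base learning rate
        self.tau = tau  # timescale of the slow consolidation variable

    def update(self, grad):
        # Per-synapse regulation: effective plasticity shrinks as m grows.
        plasticity = 1.0 / (1.0 + self.m)
        self.w -= self.eta * plasticity * grad
        # The internal state integrates recent activity, maintaining a
        # memory trace over a longer timescale than w itself.
        self.m += self.tau * abs(grad)


syn = MetaplasticSynapse()
for g in [1.0, 1.0, 1.0]:
    syn.update(g)
# Each successive update moves w by less than the previous one,
# because the consolidation state m has grown in between.
```

The key design point this sketch illustrates is that stability is not a fixed property of a weight but a dynamic consequence of the synapse's own history, which is what distinguishes these models from a single-scalar view of a synapse.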