A memristive deep belief neural network based on silicon synapses

(arXiv: 2203.09046)
Published Mar 17, 2022 in physics.app-ph, cond-mat.dis-nn, cond-mat.mtrl-sci, and cs.ET

Abstract

Memristor-based neuromorphic computing could overcome the limitations of traditional von Neumann computing architectures -- in which data are shuffled between separate memory and processing units -- and improve the performance of deep neural networks. However, this will require accurate synaptic-like device performance, and memristors typically suffer from poor yield and a limited number of reliable conductance states. Here we report floating gate memristive synaptic devices that are fabricated in a commercial complementary metal-oxide-semiconductor (CMOS) process. These silicon synapses offer analogue tunability, high endurance, long retention times, predictable cycling degradation, moderate device-to-device variations, and high yield. They also provide two orders of magnitude higher energy efficiency for multiply-accumulate operations than graphics processing units. We use two 12-by-8 arrays of the memristive devices for in-situ training of a 19-by-8 memristive restricted Boltzmann machine for pattern recognition via a gradient descent algorithm based on contrastive divergence. We then create a memristive deep belief neural network consisting of three memristive restricted Boltzmann machines. We test this on the modified National Institute of Standards and Technology (MNIST) dataset, demonstrating recognition accuracy up to 97.05%.
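The abstract describes in-situ training of a restricted Boltzmann machine via gradient descent based on contrastive divergence. As a point of reference, the sketch below shows the standard software form of one-step contrastive divergence (CD-1) for a binary RBM of the same 19-by-8 size. This is a minimal illustrative sketch, not the paper's memristive in-situ procedure: the split of the 19 visible units, the learning rate, and the random training data are all assumptions introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions matching the paper's 19-by-8 RBM;
# the internal role of each visible unit is an assumption.
n_visible, n_hidden = 19, 8

W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # synaptic weights
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM."""
    global W, b_v, b_h
    # Positive phase: sample hidden units conditioned on the data vector.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)
    # Negative phase: one Gibbs step to get a reconstruction.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # CD-1 approximation to the log-likelihood gradient.
    W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b_v += lr * (v0 - v1)
    b_h += lr * (p_h0 - p_h1)

# Train on random binary patterns (stand-ins for real input data).
data = (rng.random((100, n_visible)) < 0.5).astype(float)
for _ in range(20):
    for v in data:
        cd1_update(v)
```

In the paper's hardware setting, the weight matrix `W` is stored as memristor conductances and the CD-1 increments become in-situ conductance updates; the sketch above only shows the underlying learning rule.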
