
Abstract

This paper reports the impact of temperature variation on the inference accuracy of pre-trained all-ferroelectric FinFET deep neural networks, along with design techniques to mitigate this impact. We adopted a pre-trained artificial neural network (NN) with 96.4% inference accuracy on the MNIST dataset as the baseline. A compact model captured the temperature-induced conductance drift of a programmed cell over a wide range of gate biases. We observed significant inference accuracy degradation in the analog NN at 233 K when the NN was trained at 300 K. Finally, we deployed binary neural networks with "read voltage" optimization to make the NN immune to accuracy degradation under temperature variation, maintaining an inference accuracy of 96%.

Keywords: Ferroelectric memories
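To make the evaluation flow concrete, the sketch below shows one plausible way to estimate inference accuracy after temperature-induced conductance drift is mapped onto the weights of a pre-trained classifier. The linear drift form, the coefficient ALPHA, the reference temperature T_REF, the two-layer network shape, and the random stand-in data are all illustrative assumptions, not the compact model or the network from the paper.

```python
import numpy as np

# Assumed parameters (not from the paper): programming/reference temperature
# and a fractional conductance drift per kelvin.
T_REF = 300.0   # kelvin
ALPHA = 0.004   # assumed fractional drift per kelvin

def drift_conductance(g, temperature_k):
    """Apply an assumed linear conductance drift relative to T_REF."""
    return g * (1.0 + ALPHA * (temperature_k - T_REF))

def forward(x, w1, b1, w2, b2):
    """Two-layer MLP forward pass: ReLU hidden layer, argmax readout."""
    h = np.maximum(0.0, x @ w1 + b1)
    return np.argmax(h @ w2 + b2, axis=1)

def accuracy_at_temperature(x, y, w1, b1, w2, b2, temperature_k):
    """Inference accuracy after mapping weights through the drift model."""
    w1_d = drift_conductance(w1, temperature_k)
    w2_d = drift_conductance(w2, temperature_k)
    pred = forward(x, w1_d, b1, w2_d, b2)
    return float(np.mean(pred == y))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in data and weights; in practice these would be MNIST images
    # and the weights of the pre-trained network described in the abstract.
    x = rng.normal(size=(1000, 784))
    y = rng.integers(0, 10, size=1000)
    w1, b1 = rng.normal(scale=0.05, size=(784, 128)), np.zeros(128)
    w2, b2 = rng.normal(scale=0.05, size=(128, 10)), np.zeros(10)

    for t in (233.0, 300.0, 358.0):
        acc = accuracy_at_temperature(x, y, w1, b1, w2, b2, t)
        print(f"T = {t:.0f} K -> accuracy {acc:.3f}")
```

With real MNIST data and the actual drift characteristics of the programmed cells, the same loop would reproduce the kind of temperature sweep the abstract describes, including the degradation at 233 K for a network programmed at 300 K.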
