Spiking Deep Residual Network

(1805.01352)
Published Apr 28, 2018 in cs.NE, cs.CV, and cs.LG

Abstract

Spiking neural networks (SNNs) have received significant attention for their biological plausibility. SNNs theoretically have at least the same computational power as traditional artificial neural networks (ANNs), and they have the potential to achieve energy efficiency while maintaining performance comparable to deep neural networks (DNNs). However, training a very deep SNN remains a major challenge. In this paper, we propose an efficient approach to building a spiking version of the deep residual network (ResNet), which is regarded as a state-of-the-art convolutional neural network (CNN). We employ the idea of converting a trained ResNet into a network of spiking neurons, named Spiking ResNet (S-ResNet). We propose a shortcut conversion model to appropriately scale continuous-valued activations to match firing rates in the SNN, and a compensation mechanism to reduce the error caused by discretisation. Experimental results demonstrate that, compared with state-of-the-art SNN approaches, the proposed Spiking ResNet achieves the best performance on CIFAR-10, CIFAR-100, and ImageNet 2012. To our knowledge, this is the first work to build an SNN deeper than 40 layers with performance comparable to ANNs on a large-scale dataset.
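
The paper's specific contributions (the shortcut conversion model and the compensation mechanism) are detailed in the full text. As a rough illustration of the general ANN-to-SNN conversion principle the abstract refers to, and not the authors' exact method, the sketch below shows how an integrate-and-fire neuron's firing rate over a fixed number of timesteps approximates a ReLU activation once activations are scaled into the neuron's representable range. All function names, parameters, and values here are illustrative assumptions.

```python
import numpy as np

def if_neuron_rate(input_current, T=100, v_threshold=1.0):
    """Simulate a single integrate-and-fire neuron for T timesteps.

    The firing rate (spikes / T) approximates max(0, input_current)
    when the input has been scaled into [0, 1].
    """
    v = 0.0
    spikes = 0
    for _ in range(T):
        v += input_current        # integrate a constant input each step
        if v >= v_threshold:      # fire, then reset by subtraction
            spikes += 1
            v -= v_threshold
    return spikes / T

# Illustrative activation scaling: divide by the layer's maximum ReLU
# output so firing rates stay below the neuron's ceiling of 1 spike/step.
activations = np.array([0.0, 0.4, 1.7, 3.2])   # hypothetical ReLU outputs
scale = activations.max()
rates = [if_neuron_rate(a / scale) for a in activations]
print(rates)  # approximately proportional to the original activations
```

Running this sketch, the simulated rates track the scaled activations closely (e.g. 0.4/3.2 = 0.125 yields a rate of about 0.12 over 100 timesteps); longer simulation windows reduce the discretisation error, which is the kind of error the paper's compensation mechanism is designed to address.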
