Learning Delays in Spiking Neural Networks using Dilated Convolutions with Learnable Spacings (2306.17670v3)

Published 30 Jun 2023 in cs.NE, cs.LG, cs.AI, and eess.AS

Abstract: Spiking Neural Networks (SNNs) are a promising research direction for building power-efficient information processing systems, especially for temporal tasks such as speech recognition. In SNNs, delays refer to the time needed for one spike to travel from one neuron to another. These delays matter because they influence spike arrival times, and it is well known that spiking neurons respond more strongly to coincident input spikes. More formally, it has been shown theoretically that plastic delays greatly increase the expressivity of SNNs. Yet efficient algorithms to learn these delays have been lacking. Here, we propose a new discrete-time algorithm that addresses this issue in deep feedforward SNNs using backpropagation, in an offline manner. To simulate delays between consecutive layers, we use 1D convolutions across time. The kernels contain only a few non-zero weights - one per synapse - whose positions correspond to the delays. These positions are learned together with the weights using the recently proposed Dilated Convolution with Learnable Spacings (DCLS). We evaluated our method on three benchmarks that require detecting temporal patterns: the Spiking Heidelberg Dataset (SHD), the Spiking Speech Commands (SSC), and the non-spiking version of the latter, Google Speech Commands v0.02 (GSC). We used feedforward SNNs with two or three hidden fully connected layers and vanilla leaky integrate-and-fire neurons. We showed that fixed random delays help and that learning them helps even more. Furthermore, our method outperformed the state of the art on all three datasets without using recurrent connections and with substantially fewer parameters. Our work demonstrates the potential of delay learning in developing accurate and precise models for temporal data processing. Our code is based on PyTorch / SpikingJelly and is available at: https://github.com/Thvnvtos/SNN-delays
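To make the core idea concrete, here is a minimal PyTorch sketch of a fully connected layer whose synapses carry learnable delays, modeled as a causal 1D convolution across time with one interpolated non-zero tap per synapse. This is an illustrative assumption-laden re-derivation, not the authors' implementation: the class name `DelayedSynapseLayer`, the initialization choices, and the simple linear (triangular) interpolation are all hypothetical; the paper itself relies on the DCLS package (see the repository linked above), whose interpolation scheme is more refined.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DelayedSynapseLayer(nn.Module):
    """Sketch: per-synapse learnable delays as a causal temporal conv1d.

    Each (output, input) synapse stores one weight and one real-valued
    delay. The kernel along time has a single interpolated non-zero tap
    whose position encodes that delay, so convolving across time shifts
    each input spike train by a learnable amount.
    """

    def __init__(self, in_features: int, out_features: int, max_delay: int):
        super().__init__()
        self.max_delay = max_delay  # kernel length along the time axis
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        # One real-valued delay per synapse, trained by backprop.
        self.delay = nn.Parameter(torch.rand(out_features, in_features) * (max_delay - 1))

    def build_kernel(self) -> torch.Tensor:
        # With causal left padding, tap k of a conv1d kernel sees the
        # input delayed by (max_delay - 1 - k) steps, so a delay d maps
        # to kernel position max_delay - 1 - d.
        d = self.delay.clamp(0.0, self.max_delay - 1)
        pos = (self.max_delay - 1) - d
        lo = pos.floor()
        frac = pos - lo                      # gradient w.r.t. delay flows here
        lo = lo.long()
        hi = (lo + 1).clamp(max=self.max_delay - 1)
        # Linear (triangular) interpolation spreads each weight over the
        # two integer taps around its real-valued position, keeping the
        # tap position differentiable (stand-in for DCLS interpolation).
        kernel = self.weight.new_zeros(*self.weight.shape, self.max_delay)
        kernel = kernel.scatter(2, lo.unsqueeze(-1), ((1 - frac) * self.weight).unsqueeze(-1))
        kernel = kernel.scatter_add(2, hi.unsqueeze(-1), (frac * self.weight).unsqueeze(-1))
        return kernel  # (out_features, in_features, max_delay)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features, time) spike trains.
        x = F.pad(x, (self.max_delay - 1, 0))  # left-pad => causal in time
        return F.conv1d(x, self.build_kernel())


if __name__ == "__main__":
    # 700 input channels, as in SHD/SSC; other numbers are arbitrary.
    layer = DelayedSynapseLayer(in_features=700, out_features=256, max_delay=25)
    spikes = (torch.rand(8, 700, 100) < 0.1).float()  # (batch, channels, time)
    out = layer(spikes)                               # (8, 256, 100)
    out.sum().backward()                              # gradients reach weights AND delays
    print(out.shape, layer.delay.grad.abs().mean())
```

In the full model this output current would drive leaky integrate-and-fire neurons (the paper builds on SpikingJelly, e.g. its `LIFNode`), with one such delayed layer between each pair of consecutive layers.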
