
FPGA Deployment of LFADS for Real-time Neuroscience Experiments (2402.04274v1)

Published 2 Feb 2024 in q-bio.NC, cs.LG, and cs.NE

Abstract: Large-scale recordings of neural activity are providing new opportunities to study neural population dynamics. A powerful method for analyzing such high-dimensional measurements is to deploy an algorithm to learn the low-dimensional latent dynamics. LFADS (Latent Factor Analysis via Dynamical Systems) is a deep learning method for inferring latent dynamics from high-dimensional neural spiking data recorded simultaneously in single trials. This method has shown remarkable performance in modeling complex brain signals, with an average inference latency in milliseconds. As our capacity to simultaneously record many neurons grows exponentially, it is becoming crucial to support low-latency inference for such computing algorithms. To improve the real-time processing ability of LFADS, we introduce an efficient implementation of LFADS models on Field Programmable Gate Arrays (FPGAs). Our implementation achieves an inference latency of 41.97 $\mu$s for processing a single trial of data on a Xilinx U55C.
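The inference path the abstract describes (encode a trial of spike counts into an initial condition, then unroll a generator RNN to produce low-dimensional latent factors and firing rates) can be sketched in plain NumPy. This is a minimal, deterministic illustration, not the paper's implementation: all dimensions and weights are invented for the example, the variational sampling step is omitted, and the real model uses trained parameters.

```python
import numpy as np

# Hypothetical dimensions (illustrative, not from the paper):
# 30 channels, 50 time bins, 64 encoder/generator units, 8 latent factors.
N_CH, T, N_ENC, N_GEN, N_FAC = 30, 50, 64, 64, 8

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_params(n_in, n_hid):
    """Random GRU weights, stacked for the update (z), reset (r),
    and candidate (h) gates."""
    s = 1.0 / np.sqrt(n_in + n_hid)
    return {
        "W": rng.uniform(-s, s, (3, n_hid, n_in)),
        "U": rng.uniform(-s, s, (3, n_hid, n_hid)),
        "b": np.zeros((3, n_hid)),
    }

def gru_step(p, x, h):
    """One GRU update: gate the previous hidden state with the input."""
    z = sigmoid(p["W"][0] @ x + p["U"][0] @ h + p["b"][0])
    r = sigmoid(p["W"][1] @ x + p["U"][1] @ h + p["b"][1])
    h_tilde = np.tanh(p["W"][2] @ x + p["U"][2] @ (r * h) + p["b"][2])
    return (1 - z) * h + z * h_tilde

def lfads_infer(spikes, enc_fwd, enc_bwd, W_ic, gen, W_fac, W_rate):
    """Single-trial inference: encode spikes to an initial condition,
    then unroll the generator to produce factors and firing rates."""
    # Bidirectional GRU encoder over the trial.
    hf = np.zeros(N_ENC)
    hb = np.zeros(N_ENC)
    for t in range(T):
        hf = gru_step(enc_fwd, spikes[t], hf)
        hb = gru_step(enc_bwd, spikes[T - 1 - t], hb)
    # Initial condition for the generator (posterior mean only;
    # the stochastic sampling of the real model is omitted here).
    g = np.tanh(W_ic @ np.concatenate([hf, hb]))
    # Generator RNN unrolled with no external input.
    factors, rates = [], []
    zero_in = np.zeros(1)
    for t in range(T):
        g = gru_step(gen, zero_in, g)
        f = W_fac @ g                      # low-dimensional latent factors
        factors.append(f)
        rates.append(np.exp(W_rate @ f))   # Poisson rates via exp link
    return np.array(factors), np.array(rates)

# Tiny demo on synthetic spike counts.
spikes = rng.poisson(0.3, (T, N_CH)).astype(float)
enc_fwd, enc_bwd = gru_params(N_CH, N_ENC), gru_params(N_CH, N_ENC)
gen = gru_params(1, N_GEN)
W_ic = rng.normal(0, 0.1, (N_GEN, 2 * N_ENC))
W_fac = rng.normal(0, 0.1, (N_FAC, N_GEN))
W_rate = rng.normal(0, 0.1, (N_CH, N_FAC))

factors, rates = lfads_infer(spikes, enc_fwd, enc_bwd, W_ic, gen, W_fac, W_rate)
print(factors.shape, rates.shape)  # (50, 8) (50, 30)
```

The fixed per-trial structure (two encoder passes plus one generator unroll, all dense matrix-vector products) is what makes this workload a natural fit for a pipelined FPGA dataflow with fixed-point arithmetic.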

