Spiking Neural Operators for Scientific Machine Learning (2205.10130v2)

Published 17 May 2022 in cs.NE and cs.LG

Abstract: The main computational task of Scientific Machine Learning (SciML) is function regression, required both for inputs as well as outputs of a simulation. Physics-Informed Neural Networks (PINNs) and neural operators (such as DeepONet) have been very effective in solving Partial Differential Equations (PDEs), but they tax computational resources heavily and cannot be readily adopted for edge computing. Here, we address this issue by considering Spiking Neural Networks (SNNs), which have shown promise in reducing energy consumption by two orders of magnitude or more. We present an SNN-based method to perform regression, which has been a challenge due to the inherent difficulty in representing a function's input domain and continuous output values as spikes. We first propose a new method for encoding continuous values into spikes based on a triangular matrix in space and time, and demonstrate its better performance compared to existing methods. Next, we demonstrate that using a simple SNN architecture consisting of Leaky Integrate-and-Fire (LIF) activation and two dense layers, we can achieve relatively accurate function regression results. Moreover, we can replace the LIF with a trained Multi-Layer Perceptron (MLP) network and obtain comparable results but three times faster. Then, we introduce the DeepONet, consisting of a branch (typically a Fully-connected Neural Network, FNN) for inputs and a trunk (also an FNN) for outputs. We can build a spiking DeepONet by replacing either the branch or the trunk with an SNN. We demonstrate this new approach for classification using the SNN in the branch, achieving results comparable to the literature. Finally, we design a spiking DeepONet for regression by replacing its trunk with an SNN, and achieve good accuracy for approximating functions as well as inferring solutions of differential equations.
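The Leaky Integrate-and-Fire (LIF) activation mentioned in the abstract can be illustrated with a minimal sketch of the standard discrete-time LIF dynamics. This is a generic textbook formulation, not the paper's implementation; the time constant, threshold, and input values below are illustrative choices:

```python
def lif_simulate(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a single leaky integrate-and-fire neuron.

    Standard LIF dynamics (illustrative parameters): the membrane
    potential v leaks toward zero with time constant tau, integrates
    the input current, and emits a spike (then resets) whenever it
    crosses v_thresh. Returns a binary spike train.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        v += dt * (-v / tau + i_t)  # leaky integration step
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset             # reset after a spike
        else:
            spikes.append(0)
    return spikes

# A constant suprathreshold input yields a regular spike train;
# this binary output is the kind of signal an SNN layer passes on.
spike_train = lif_simulate([0.2] * 50)
```

In a spiking regression network of the kind the abstract describes, continuous inputs would first be encoded as spikes, passed through dense layers with LIF activations like the one above, and then decoded back into continuous outputs.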

Citations (9)
