
Amplifying Sine Unit: An Oscillatory Activation Function for Deep Neural Networks to Recover Nonlinear Oscillations Efficiently (2304.09759v1)

Published 18 Apr 2023 in cs.LG, cs.NE, and math.DS

Abstract: Many industrial and real-life problems exhibit highly nonlinear periodic behaviors, and conventional methods may fall short of finding their analytical or closed-form solutions. Such problems demand cutting-edge computational tools with increased functionality and reduced cost. Recently, deep neural networks have gained massive research interest due to their ability to handle large data and their universality in learning complex functions. In this work, we put forward a methodology based on deep neural networks with a responsive layer structure to deal with nonlinear oscillations in microelectromechanical systems. We incorporate oscillatory and non-oscillatory activation functions, such as the growing cosine unit (GCU), Sine, Mish, and Tanh, into the designed network for a comprehensive analysis of their performance on highly nonlinear and vibrational problems. Integrating oscillatory activation functions into deep neural networks clearly improves prediction of the periodic patterns of the underlying systems. To support oscillatory actuation in nonlinear systems, we propose a novel oscillatory activation function called the Amplifying Sine Unit (ASU), which is more efficient than GCU for complex vibratory systems such as microelectromechanical systems. Experimental results show that the designed network with the proposed ASU activation function is more reliable and robust in handling the challenges posed by nonlinearity and oscillations. To validate the proposed methodology, the outputs of our networks are compared with results from the Livermore Solver for Ordinary Differential Equations (LSODA). Graphical illustrations of the incurred errors are also presented.
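The abstract does not reproduce the functional forms here, so the following is a minimal sketch rather than the authors' implementation: it assumes GCU(x) = x * cos(x) (its standard definition) and, by analogy, ASU(x) = x * sin(x) (the exact definition is given in the paper), wraps them as PyTorch activations in a small time-to-displacement network, and generates a reference trajectory with SciPy's LSODA integrator, mirroring the paper's validation setup. The Duffing-type oscillator, its parameters, and the helper names are hypothetical placeholders, not the MEMS model studied in the paper.

import numpy as np
import torch
import torch.nn as nn
from scipy.integrate import solve_ivp

class ASU(nn.Module):
    # Amplifying Sine Unit; x * sin(x) is an assumed form, check the paper.
    def forward(self, x):
        return x * torch.sin(x)

class GCU(nn.Module):
    # Growing Cosine Unit: x * cos(x).
    def forward(self, x):
        return x * torch.cos(x)

def make_oscillator_net(activation=ASU, hidden=32, depth=3):
    # Small fully connected network mapping time t to displacement u(t),
    # with the chosen oscillatory activation in every hidden layer.
    layers, in_dim = [], 1
    for _ in range(depth):
        layers += [nn.Linear(in_dim, hidden), activation()]
        in_dim = hidden
    layers.append(nn.Linear(in_dim, 1))
    return nn.Sequential(*layers)

def duffing(t, y, delta=0.1, alpha=1.0, beta=0.5):
    # Hypothetical unforced Duffing-type oscillator, used only for illustration.
    u, v = y
    return [v, -delta * v - alpha * u - beta * u**3]

# Reference solution via LSODA, against which a trained network could be compared.
t_eval = np.linspace(0.0, 20.0, 500)
reference = solve_ivp(duffing, (0.0, 20.0), [1.0, 0.0], method="LSODA", t_eval=t_eval)

net = make_oscillator_net(activation=ASU)
u_pred = net(torch.tensor(t_eval, dtype=torch.float32).unsqueeze(1))

A training loop (e.g., fitting u_pred to reference.y[0] with a mean-squared-error loss) is omitted; the paper's comparison of ASU against GCU, Sine, Mish, and Tanh corresponds to swapping the activation passed to make_oscillator_net.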

Citations (6)
