
Error estimates of local energy regularization for the logarithmic Schrödinger equation (2006.05114v2)

Published 9 Jun 2020 in math.NA and cs.NA

Abstract: The logarithmic nonlinearity has been used in many partial differential equations (PDEs) for modeling problems in various applications. Due to the singularity of the logarithmic function, it introduces tremendous difficulties in establishing mathematical theories, as well as in designing and analyzing numerical methods for PDEs with such nonlinearity. Here we take the logarithmic Schr\"odinger equation (LogSE) as a prototype model. Instead of regularizing $f(\rho)=\ln \rho$ in the LogSE directly and globally as is done in the literature, we propose a local energy regularization (LER) for the LogSE by first regularizing $F(\rho)=\rho\ln \rho -\rho$ locally near $\rho=0^+$ with a polynomial approximation in the energy functional of the LogSE, and then obtaining an energy regularized logarithmic Schr\"odinger equation (ERLogSE) via energy variation. Linear convergence is established between the solutions of the ERLogSE and the LogSE in terms of a small regularization parameter $0<\varepsilon\ll1$. Moreover, the conserved energy of the ERLogSE converges to that of the LogSE quadratically, which significantly improves the linear convergence rate of the regularization method in the literature. Error estimates are also presented for solving the ERLogSE by Lie-Trotter splitting integrators. Numerical results are reported to confirm our error estimates of the LER and of the time-splitting integrators for the ERLogSE. Finally, our results suggest that the LER performs better than regularizing the logarithmic nonlinearity in the LogSE directly.
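The abstract mentions Lie-Trotter splitting for a regularized LogSE. As a rough illustration of how such a splitting step works, the sketch below alternates an exact Fourier-space kinetic flow with a nonlinear phase rotation. Note the hedges: it uses the simple direct regularization $\ln(\varepsilon + |\psi|^2)$ as a stand-in nonlinearity, *not* the paper's LER construction, and the coefficient `lam`, the parameter `eps`, and the 1D periodic grid are all illustrative assumptions.

```python
import numpy as np

def lie_trotter_step(psi, dx, tau, lam=1.0, eps=1e-8):
    """One Lie-Trotter step for a regularized logarithmic NLS
        i psi_t = -0.5 psi_xx + lam * psi * ln(eps + |psi|^2)
    on a periodic 1D grid with spacing dx and time step tau.
    (Direct regularization used only as an illustrative stand-in
    for the paper's local energy regularization.)"""
    n = psi.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)  # Fourier wavenumbers
    # Kinetic sub-flow, solved exactly in Fourier space.
    psi = np.fft.ifft(np.exp(-0.5j * k**2 * tau) * np.fft.fft(psi))
    # Nonlinear sub-flow: |psi| is invariant along this flow,
    # so it reduces to a pointwise phase rotation.
    psi = psi * np.exp(-1j * lam * tau * np.log(eps + np.abs(psi)**2))
    return psi
```

Both sub-flows are unitary (a Fourier multiplier of modulus one and a pointwise phase factor), so the discrete mass $\int |\psi|^2\,dx$ is conserved to machine precision over the step, mirroring the mass conservation of the continuous equation.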

Citations (12)

