Emergent Mind
Weak Convergence of Tamed Exponential Integrators for Stochastic Differential Equations
(arXiv:2304.09496)
Published Apr 19, 2023 in math.NA and cs.NA
Abstract
We prove weak convergence of order one for a class of exponential-based integrators for SDEs with non-globally Lipschitz drift. Our analysis covers tamed versions of Geometric Brownian Motion (GBM) based methods as well as the standard exponential schemes. The numerical performance of both the GBM and exponential tamed methods is compared across four different multilevel Monte Carlo techniques. We observe that for linear noise the standard exponential tamed method requires severe restrictions on the stepsize, unlike the GBM tamed method.
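To illustrate the kind of scheme the abstract refers to, here is a minimal sketch of a tamed exponential Euler integrator for a scalar SDE dX = (A·X + f(X)) dt + g(X) dW. This is a generic illustration under assumed forms, not the paper's exact method: the taming factor f(x)/(1 + h|f(x)|), the example drift f(x) = -x³, and the noise g(x) = 0.1·x are all assumptions chosen for demonstration.

```python
import numpy as np

def tamed_exponential_euler(x0, A, f, g, T, N, rng):
    """One path of dX = (A*X + f(X)) dt + g(X) dW (scalar case) via a
    tamed exponential Euler scheme.

    The linear part A*X is propagated exactly by exp(A*h); the
    non-globally Lipschitz drift f is tamed as f(x)/(1 + h*|f(x)|),
    which keeps each drift increment bounded by 1."""
    h = T / N
    x = x0
    for _ in range(N):
        dW = rng.normal(0.0, np.sqrt(h))       # Brownian increment
        fx = f(x)
        tamed = fx / (1.0 + h * abs(fx))       # taming (assumed form)
        x = np.exp(A * h) * (x + h * tamed + g(x) * dW)
    return x

# Example: crude weak approximation of E[X_T] for a cubic drift
rng = np.random.default_rng(0)
A, T, N, M = -1.0, 1.0, 100, 2000
f = lambda x: -x**3                            # non-globally Lipschitz drift
g = lambda x: 0.1 * x                          # linear (multiplicative) noise
samples = [tamed_exponential_euler(1.0, A, f, g, T, N, rng) for _ in range(M)]
print(np.mean(samples))
```

In a multilevel Monte Carlo setting one would run this integrator at several stepsizes h and combine level differences; here only the single-level sample mean is shown.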