Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence (2305.15557v3)
Abstract: We propose a novel non-parametric learning paradigm for identifying the drift and diffusion coefficients of multi-dimensional non-linear stochastic differential equations from discrete-time observations of the state. The key idea is to fit an RKHS-based approximation of the corresponding Fokker-Planck equation to these observations, yielding non-asymptotic learning-rate estimates that, unlike those of previous works, become tighter as the regularity of the unknown drift and diffusion coefficients increases. Because the method is kernel-based, offline pre-processing can be leveraged for an efficient numerical implementation, offering an excellent balance between precision and computational complexity.
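For context, the following is a standard formulation of the setting (an assumption on my part; the paper's exact notation may differ). It shows why fitting the Fokker-Planck equation can identify both coefficients: the equation is linear in the unknown drift $b$ and in the diffusion matrix $\sigma\sigma^\top$.

```latex
% Standard setting (assumed; not quoted from the paper).
% State X_t in R^d with unknown drift b and diffusion sigma:
\[
  \mathrm{d}X_t = b(X_t)\,\mathrm{d}t + \sigma(X_t)\,\mathrm{d}B_t .
\]
% The density p(t, x) of X_t then satisfies the Fokker--Planck equation,
% which is linear in b and in a = sigma sigma^T:
\[
  \partial_t p(t,x)
    = -\nabla \cdot \bigl( b(x)\, p(t,x) \bigr)
      + \frac{1}{2} \sum_{i,j=1}^{d} \partial_{x_i} \partial_{x_j}
        \bigl[ (\sigma\sigma^{\top})_{ij}(x)\, p(t,x) \bigr].
\]
```

Because the PDE is linear in the unknown coefficients, matching it against a density estimated from the discrete-time samples presumably reduces identification to a regularized linear problem, which is what makes an RKHS formulation natural here.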
- Riccardo Bonalli
- Alessandro Rudi