Mesh-free error integration in arbitrary dimensions: a numerical study of discrepancy functions (1911.00795v2)

Published 2 Nov 2019 in math.AP, cs.NA, and math.NA

Abstract: We are interested in mesh-free formulas based on the Monte-Carlo methodology for the approximation of multi-dimensional integrals, and we investigate their accuracy when the functions belong to a reproducing-kernel space. A kernel typically captures regularity and qualitative properties of functions "beyond" the standard Sobolev regularity class. We are interested in whether quantitative error bounds can be guaranteed a priori in applications (e.g., mathematical finance, but also scientific computing and machine learning). Our main contribution is a numerical study of the error discrepancy function based on a comparison between several numerical strategies, in which we vary the choice of the kernel, the number of approximation points, and the dimension of the problem. We consider two strategies for localizing to a bounded set the standard kernels defined on the whole Euclidean space (exponential, multiquadric, Gaussian, truncated): on the one hand, the class of periodic kernels defined via a discrete Fourier transform on a lattice and, on the other hand, a class of transport-based kernels. First, relying on a Poisson formula on a lattice together with heuristic arguments, we discuss the derivation of theoretical bounds for the discrepancy function of periodic kernels. Second, for each kernel of interest, we perform the numerical experiments required to generate the optimal distributions of points and the discrepancy error functions. Our numerical results allow us to validate our theoretical observations and provide quantitative estimates of the error made with a kernel-based strategy as opposed to a purely random strategy.
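
To make the kernel-based error measure concrete, the sketch below computes the worst-case quadrature error (a discrepancy) of an equal-weight point set on the unit cube for a Gaussian reproducing kernel, and compares random points with a Halton low-discrepancy set. This is a minimal illustration, not the paper's method: the Gaussian kernel on [0, 1]^d, the bandwidth `sigma`, and the use of Halton points are assumptions made for the example, whereas the paper works with localized (periodic and transport-based) kernels and with optimized point distributions.

```python
import numpy as np
from scipy.special import erf
from scipy.spatial.distance import cdist
from scipy.stats import qmc

def gaussian_discrepancy(points, sigma=0.5):
    """Worst-case error of the equal-weight quadrature rule (1/N) sum_i f(x_i)
    over the unit ball of the RKHS on [0, 1]^d with the Gaussian product kernel
    K(x, y) = exp(-||x - y||^2 / (2 sigma^2)):

        e^2 = int int K dx dy - (2/N) sum_i int K(x_i, y) dy
              + (1/N^2) sum_{i,j} K(x_i, x_j).
    """
    n, d = points.shape
    a = 1.0 / (sigma * np.sqrt(2.0))

    # Per-dimension double integral of the 1-D kernel over [0, 1]^2:
    # int_0^1 int_0^1 exp(-a^2 (x - y)^2) dx dy.
    m0 = np.sqrt(np.pi) / a * erf(a) + (np.exp(-a**2) - 1.0) / a**2

    # Per-dimension single integral int_0^1 exp(-a^2 (x - y)^2) dy, per point.
    m1 = np.sqrt(np.pi) / (2.0 * a) * (erf(a * points) + erf(a * (1.0 - points)))

    # Pairwise kernel matrix K(x_i, x_j) (product over dimensions).
    K = np.exp(-cdist(points, points, metric="sqeuclidean") / (2.0 * sigma**2))

    e2 = m0**d - 2.0 / n * np.prod(m1, axis=1).sum() + K.sum() / n**2
    return np.sqrt(max(e2, 0.0))

# Compare purely random points with a Halton low-discrepancy set in dimension 3.
d, n = 3, 256
random_pts = np.random.default_rng(0).random((n, d))
halton_pts = qmc.Halton(d=d, seed=0).random(n)
print("random points:", gaussian_discrepancy(random_pts))
print("Halton points:", gaussian_discrepancy(halton_pts))
```

With this worst-case error in hand, "optimizing the distribution of points" for a given kernel amounts to minimizing the same quantity over the point locations, which is the kind of comparison (kernel-based versus purely random) that the paper carries out across kernels, point counts, and dimensions.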

Citations (6)
