Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian

(arXiv:2210.09023)
Published Oct 17, 2022 in math.PR, cs.NA, math.AP, and math.NA

Abstract

In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation for length scales at the connectivity threshold. In the graph-based semi-supervised learning community this equation is also known as Lipschitz learning. The graph infinity Laplace equation is characterized by the metric on the underlying space, and convergence rates follow from convergence rates for graph distances. At the connectivity threshold, this problem is related to Euclidean first-passage percolation, which is concerned with the Euclidean distance function $d_h(x,y)$ on a homogeneous Poisson point process on $\mathbb{R}^d$, where admissible paths have step size at most $h>0$. Using a suitable regularization of the distance function and subadditivity, we prove that $d_{h_s}(0,se_1)/s \to \sigma$ as $s\to\infty$ almost surely, where $\sigma \geq 1$ is a dimensional constant and $h_s \gtrsim \log(s)^{1/d}$. A convergence rate is not available due to a lack of approximate superadditivity when $h_s \to \infty$. Instead, we prove convergence rates for the ratio $\frac{d_h(0,se_1)}{d_h(0,2se_1)} \to \frac{1}{2}$ when $h$ is frozen and does not depend on $s$. Combining this with the techniques that we developed in (Bungert, Calder, Roith, IMA Journal of Numerical Analysis, 2022), we show that this notion of ratio convergence is sufficient to establish uniform convergence rates for solutions of the graph infinity Laplace equation at percolation length scales.
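The distance $d_h$ and the ratio convergence above can be illustrated numerically. The following is a minimal sketch, not the paper's method: it works in $d=2$ on a finite strip rather than all of $\mathbb{R}^d$, fixes the number of sampled points at its mean instead of drawing it from a Poisson distribution, and adjoins $0$, $se_1$, and $2se_1$ to the point cloud as extra vertices. All function names and parameter values (`sample_points`, `d_h`, the intensity `lam`, the strip half-width, etc.) are invented for this sketch. The shortest path with steps of length at most $h$ is computed by Dijkstra's algorithm on the geometric graph whose edges join points within distance $h$.

```python
import heapq
import math
import random

def sample_points(lam, xlo, xhi, ylo, yhi, rng):
    # Sketch: fix the point count at its mean (intensity * area); a true
    # Poisson point process would draw the count from a Poisson distribution.
    n = round(lam * (xhi - xlo) * (yhi - ylo))
    return [(rng.uniform(xlo, xhi), rng.uniform(ylo, yhi)) for _ in range(n)]

def d_h(points, src, targets, h):
    # Dijkstra on the geometric graph: edges join pairs of vertices at
    # Euclidean distance <= h, weighted by that distance. src and the
    # targets are adjoined to the point cloud as extra vertices.
    pts = [src] + list(targets) + points
    n = len(pts)
    dist = [math.inf] * n
    dist[0] = 0.0
    heap = [(0.0, 0)]
    while heap:
        d, i = heapq.heappop(heap)
        if d > dist[i]:
            continue  # stale heap entry
        xi, yi = pts[i]
        for j, (xj, yj) in enumerate(pts):
            step = math.hypot(xj - xi, yj - yi)
            if step <= h and d + step < dist[j]:
                dist[j] = d + step
                heapq.heappush(heap, (dist[j], j))
    return [dist[k] for k in range(1, 1 + len(targets))]

# Frozen step size h (independent of s), two collinear targets at s and 2s.
rng = random.Random(0)
s, h, lam = 20.0, 2.5, 2.0
pts = sample_points(lam, -2.0, 2 * s + 2.0, -5.0, 5.0, rng)
d1, d2 = d_h(pts, (0.0, 0.0), [(s, 0.0), (2 * s, 0.0)], h)
ratio = d1 / d2  # expected to concentrate near 1/2 as s grows
```

With the intensity chosen well above the connectivity threshold, each sampled path is nearly straight, so `d1` is close to $\sigma s$, `d2` is close to $2\sigma s$, and `ratio` lies near $1/2$, consistent with the ratio convergence statement. The brute-force $O(n^2)$ neighbor scan is only viable for small point clouds; a spatial index would be the natural replacement at scale.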

