Emergent Mind

The structure of low-complexity Gibbs measures on product spaces

(1810.07278)
Published Oct 16, 2018 in math.PR, cs.IT, math-ph, math.IT, and math.MP

Abstract

Let $K_1, \dots, K_n$ be bounded, complete, separable metric spaces. Let $\lambda_i$ be a Borel probability measure on $K_i$ for each $i$. Let $f:\prod_i K_i \to \mathbb{R}$ be a bounded and continuous potential function, and let $$\mu(d\mathbf{x})\ \propto\ e^{f(\mathbf{x})}\,\lambda_1(dx_1)\cdots\lambda_n(dx_n)$$ be the associated Gibbs distribution. At each point $\mathbf{x} \in \prod_i K_i$, one can define a `discrete gradient' $\nabla f(\mathbf{x},\,\cdot\,)$ by comparing the values of $f$ at all points which differ from $\mathbf{x}$ in at most one coordinate. In case $\prod_i K_i = \{-1,1\}^n \subset \mathbb{R}^n$, the discrete gradient $\nabla f(\mathbf{x},\,\cdot\,)$ is naturally identified with a vector in $\mathbb{R}^n$. This paper shows that a `low-complexity' assumption on $\nabla f$ implies that $\mu$ can be approximated by a mixture of other measures, relatively few in number, and most of them close to product measures in the sense of optimal transport. This also implies an approximation to the partition function of $f$ in terms of product measures, along the lines of Chatterjee and Dembo's theory of `nonlinear large deviations'. An important precedent for this work is a result of Eldan in the case $\prod_i K_i = \{-1,1\}^n$. Eldan's assumption is that the discrete gradients $\nabla f(\mathbf{x},\,\cdot\,)$ all lie in a subset of $\mathbb{R}^n$ that has small Gaussian width. His proof is based on the careful construction of a diffusion in $\mathbb{R}^n$ which starts at the origin and ends with the desired distribution on the subset $\{-1,1\}^n$. Here our assumption is a more naive covering-number bound on the set of gradients $\{\nabla f(\mathbf{x},\,\cdot\,):\ \mathbf{x} \in \prod_i K_i\}$, and our proof relies only on basic inequalities of information theory. As a result, it is shorter, and applies to Gibbs measures on arbitrary product spaces.
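To make the objects in the abstract concrete, the following sketch computes the Gibbs distribution, its partition function, and a discrete gradient on the cube $\{-1,1\}^n$ with the uniform (counting) base measure. The sign convention for the gradient (half the difference of $f$ at the two points differing only in coordinate $i$) is one common choice, not necessarily the exact normalization used in the paper; the function names are illustrative only.

```python
import itertools
import math

def gibbs_distribution(f, n):
    """Gibbs measure mu(x) proportional to e^{f(x)} on {-1,1}^n,
    with the uniform counting measure as base measure.
    Returns (probability dict, partition function Z)."""
    points = list(itertools.product([-1, 1], repeat=n))
    weights = {x: math.exp(f(x)) for x in points}
    Z = sum(weights.values())  # partition function
    return {x: w / Z for x, w in weights.items()}, Z

def discrete_gradient(f, x):
    """Discrete gradient of f at x in {-1,1}^n, identified with a
    vector in R^n: coordinate i compares f at the two points that
    differ from x only in coordinate i (one common convention)."""
    grad = []
    for i in range(len(x)):
        x_plus = x[:i] + (1,) + x[i + 1:]
        x_minus = x[:i] + (-1,) + x[i + 1:]
        grad.append((f(x_plus) - f(x_minus)) / 2)
    return grad
```

For a linear potential $f(\mathbf{x}) = \sum_i a_i x_i$, the gradient is the constant vector $(a_i)_i$, so the gradient set is a single point (trivially low complexity), and the resulting $\mu$ is exactly a product measure with $Z = \prod_i 2\cosh(a_i)$, the simplest instance of the product-measure approximation the paper describes.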
