
Abstract

We consider nearest-neighbor weighted random walks on the $d$-dimensional box $[n]^d$ that are governed by some function $g:[0,1]^d \to [0,\infty)$, by which we mean that, standing at $x$, a neighbor $y$ of $x$ is picked at random and the walk then moves there with probability $(1/2)g(n^{-1}y)/(g(n^{-1}y)+g(n^{-1}x))$. We do this for $g$ of the form $f^{m_n}$ for some function $f$ that is assumed to be analytically well-behaved and where $m_n \to \infty$ as $n \to \infty$. This class of walks covers an abundance of interesting special cases, e.g., the mean-field Potts model, posterior collapsed Gibbs sampling for Latent Dirichlet allocation, and certain Bayesian posteriors for models in nuclear physics. The following are among the results of this paper:
\begin{itemize}
\item If $f$ is unimodal with negative definite Hessian at its global maximum, then the mixing time of the random walk is $O(n\log n)$.
\item If $f$ is multimodal, then the mixing time is exponential in $n$, but we show that there is a simulated annealing scheme, governed by $f^K$ for an increasing sequence of $K$, that mixes in time $O(n^2)$. Using a varying step size that decreases with $K$, this can be taken down to $O(n\log n)$.
\item If the process is studied on a general graph rather than the $d$-dimensional box, a simulated annealing scheme, expressed in terms of conductances of the underlying network, works similarly.
\end{itemize}
Several examples are given, including the ones mentioned above.
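The transition rule in the abstract can be sketched directly. The following is a minimal illustration (not from the paper): `walk_step` performs one step of the walk on $[n]^d$, proposing a uniform neighbor $y$ of the current position $x$ and accepting with probability $(1/2)g(n^{-1}y)/(g(n^{-1}y)+g(n^{-1}x))$; proposals outside the box are rejected. The function names and the choice of $f$ in the usage example are illustrative assumptions.

```python
import random

def walk_step(x, n, d, g):
    """One step of the nearest-neighbor weighted random walk on [n]^d.

    Standing at x, pick a neighbor y uniformly at random, then move there
    with probability (1/2) * g(y/n) / (g(y/n) + g(x/n)); otherwise stay.
    """
    # Uniform neighbor proposal: pick a coordinate and a direction.
    i = random.randrange(d)
    y = list(x)
    y[i] += random.choice([-1, 1])
    if not (1 <= y[i] <= n):  # proposal falls outside the box: stay put
        return x
    y = tuple(y)
    # Evaluate g at the rescaled positions n^{-1}y and n^{-1}x.
    gy = g(tuple(c / n for c in y))
    gx = g(tuple(c / n for c in x))
    if random.random() < 0.5 * gy / (gy + gx):
        return y
    return x
```

For instance, taking $g = f^{m_n}$ with a unimodal $f$ peaked at the center of the box (a stand-in for the paper's well-behaved case), repeated calls to `walk_step` drift the walk toward the maximizer of $f$:

```python
random.seed(0)
n, d = 20, 2
m = n  # m_n growing with n, as in the abstract
f = lambda u: 1.0 - sum((c - 0.5) ** 2 for c in u) / len(u)  # unimodal, peak at (0.5, 0.5)
g = lambda u: f(u) ** m
x = (1, 1)
for _ in range(2000):
    x = walk_step(x, n, d, g)
```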
