
Hierarchical Low-Rank Approximation of Regularized Wasserstein Distance

arXiv:2004.12511
Published Apr 27, 2020 in math.NA and cs.NA

Abstract

Sinkhorn divergence is a measure of dissimilarity between two probability measures. It is obtained by adding an entropic regularization term to Kantorovich's optimal transport problem and can hence be viewed as an entropically regularized Wasserstein distance. Given two discrete probability vectors in the $n$-simplex and supported on two bounded spaces in ${\mathbb R}^d$, we present a fast method for computing Sinkhorn divergence when the cost matrix can be decomposed into a $d$-term sum of asymptotically smooth Kronecker product factors. The method combines Sinkhorn's matrix scaling iteration with a low-rank hierarchical representation of the scaling matrices to achieve a near-linear complexity ${\mathcal O}(n \log^3 n)$. This provides a fast and easy-to-implement algorithm for computing Sinkhorn divergence, enabling its applicability to large-scale optimization problems, where the computation of the classical Wasserstein metric is not feasible. We present a numerical example related to signal processing to demonstrate the applicability of quadratic Sinkhorn divergence in comparison with quadratic Wasserstein distance and to verify the accuracy and efficiency of the proposed method.
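
For context, the entropic regularization the abstract refers to replaces the Kantorovich problem with

$$W_\varepsilon(p, q) \;=\; \min_{P \mathbf{1} = p,\; P^{\mathsf T} \mathbf{1} = q} \; \langle P, C \rangle - \varepsilon H(P),$$

where $C$ is the cost matrix and $H(P) = -\sum_{ij} P_{ij}(\log P_{ij} - 1)$ is the entropy of the transport plan. The optimal plan has the form $\mathrm{diag}(u)\, K\, \mathrm{diag}(v)$ with Gibbs kernel $K = e^{-C/\varepsilon}$, and Sinkhorn's iteration alternately rescales $u$ and $v$ to match the two marginals. Below is a minimal dense NumPy sketch of that baseline iteration, not the paper's hierarchical low-rank variant; the function name, defaults, and stopping rule are illustrative choices, not the authors' API.

```python
import numpy as np

def sinkhorn_dense(C, p, q, eps=1e-2, n_iter=1000, tol=1e-9):
    """Baseline Sinkhorn iteration for entropically regularized OT.

    C    : (n, m) cost matrix
    p, q : marginal probability vectors on the simplex
    eps  : entropic regularization strength (illustrative default)
    Returns the regularized transport cost <P, C>.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(p)
    v = np.ones_like(q)
    for _ in range(n_iter):
        u_prev = u
        u = p / (K @ v)                  # rescale rows to match p
        v = q / (K.T @ u)                # rescale columns to match q
        if np.max(np.abs(u - u_prev)) < tol:
            break
    P = u[:, None] * K * v[None, :]      # plan diag(u) K diag(v)
    return np.sum(P * C)

# Example: quadratic cost between two 1-D discrete measures
x = np.linspace(0.0, 1.0, 200)
C = (x[:, None] - x[None, :]) ** 2
p = np.full(200, 1.0 / 200)
q = np.exp(-(x - 0.5) ** 2 / 0.02)
q /= q.sum()
print(sinkhorn_dense(C, p, q))
```

Each iteration is dominated by the matrix-vector products $Kv$ and $K^{\mathsf T}u$, which cost ${\mathcal O}(n^2)$ when $K$ is stored densely; replacing these products with a hierarchical low-rank representation is what brings the overall cost down to the stated ${\mathcal O}(n \log^3 n)$.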
