Local Search for Max-Sum Diversification (1607.04557v1)

Published 15 Jul 2016 in cs.DS, cs.CG, and cs.DM

Abstract: We provide simple and fast polynomial time approximation schemes (PTASs) for several variants of the max-sum diversification problem which, in its most basic form, is as follows: Given n points p_1, ..., p_n in R^d and an integer k, select k points such that the average Euclidean distance between these points is maximized. This problem commonly appears in information retrieval and web search in order to select a diverse set of points from the input. In this context, it has recently received a lot of attention. We present new techniques to analyze natural local search algorithms. This leads to a (1 - O(1/k))-approximation for distances of negative type, even subject to any matroid constraint of rank k, in time O(n k^2 log k), when assuming that distance evaluations and calls to the independence oracle are constant time. Negative type distances include as special cases Euclidean distances and many further natural distances. Our result easily transforms into a PTAS and improves on the only previously known PTAS for this setting, which relies on convex optimization techniques in an n-dimensional space and is impractical for large data sets. In contrast, our procedure has an (optimal) linear dependence on n. Using generalized exchange properties of matroid intersection, we show that a PTAS can be obtained for matroid intersection constraints as well. Moreover, our techniques, being based on local search, are conceptually simple and allow for various extensions. In particular, we get asymptotically optimal O(1)-approximations when combining the classic dispersion function with a monotone submodular objective, which is a very common class of functions to measure diversity and relevance. This result leverages recent advances on local search techniques based on proxy functions to obtain optimal approximations for monotone submodular function maximization subject to a matroid constraint.

Citations (36)
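
As a rough illustration of the kind of natural local search the abstract refers to, the sketch below applies swap-based local search in the most basic setting: a plain cardinality constraint (the uniform matroid of rank k) with Euclidean distances. This is only an illustrative heuristic under those assumptions; the function names, the random initialization, and the stopping rule are ours, and it does not reproduce the paper's exact procedure, its matroid-oracle handling, or its O(n k^2 log k) running-time and (1 - O(1/k)) approximation analysis.

```python
import itertools
import math
import random


def pairwise_distance_sum(points, indices):
    """Sum of Euclidean distances over all pairs of the selected points."""
    return sum(
        math.dist(points[i], points[j])
        for i, j in itertools.combinations(indices, 2)
    )


def local_search_diversification(points, k, eps=1e-9, seed=0):
    """Swap-based local search for max-sum diversification under a
    cardinality constraint (uniform matroid of rank k).

    Illustrative sketch only: starts from a random size-k subset and
    repeatedly replaces one selected point by an unselected one whenever
    the swap increases the sum of pairwise distances, stopping at a
    local optimum.
    """
    rng = random.Random(seed)
    n = len(points)
    selected = set(rng.sample(range(n), k))

    improved = True
    while improved:
        improved = False
        for out in list(selected):
            rest = selected - {out}
            # Only terms involving the swapped point change, so compare
            # each candidate's total distance to `rest` with that of `out`.
            out_gain = sum(math.dist(points[out], points[j]) for j in rest)
            for cand in range(n):
                if cand in selected:
                    continue
                cand_gain = sum(math.dist(points[cand], points[j]) for j in rest)
                if cand_gain > out_gain + eps:
                    selected = rest | {cand}
                    improved = True
                    break
            if improved:
                break
    return selected


# Example usage on random points in the plane.
if __name__ == "__main__":
    rng = random.Random(1)
    pts = [(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(50)]
    chosen = local_search_diversification(pts, k=5)
    print(sorted(chosen), pairwise_distance_sum(pts, chosen))
```

In this simple form the loop stops only at a local optimum; an explicit running-time bound such as the one stated in the abstract additionally requires bounding the number of improving swaps, which is part of the paper's analysis rather than of this sketch.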
