
Curse of Dimensionality in the Application of Pivot-based Indexes to the Similarity Search Problem (0905.2141v1)

Published 13 May 2009 in cs.DS

Abstract: In this work we study the validity of the so-called curse of dimensionality for indexing of databases for similarity search. We perform an asymptotic analysis, with a test model based on a sequence of metric spaces $(\Omega_d)$ from which we pick datasets $X_d$ in an i.i.d. fashion. We call the subscript $d$ the dimension of the space $\Omega_d$ (e.g. for $\mathbb{R}^d$ the dimension is just the usual one), and we allow the size of the dataset $n = n_d$ to be such that $d$ is superlogarithmic but subpolynomial in $n$. We study the asymptotic performance of pivot-based indexing schemes where the number of pivots is $o(n/d)$. We adopt a relatively simple cost model of similarity search in which each distance calculation counts as a single computation and everything else is disregarded. We demonstrate that if the spaces $\Omega_d$ exhibit the (fairly common) concentration of measure phenomenon, the performance of similarity search using such indexes is asymptotically linear in $n$. That is, for large enough $d$ the difference between using such an index and performing a search without an index at all is negligible. Thus we confirm the curse of dimensionality in this setting.
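The paper itself contains no code; the following is a minimal sketch, assuming a LAESA-style pivot table, of the kind of scheme the abstract analyzes. The names (`PivotIndex`, `range_query`), the random choice of pivots, and the Euclidean metric are illustrative assumptions, not the authors' construction; the pruning rule is the standard triangle-inequality filter used by pivot-based indexes.

```python
import random
from math import dist  # Euclidean distance; stands in for an arbitrary metric


class PivotIndex:
    """Illustrative pivot table (LAESA-style, an assumption): precompute the
    distance from every dataset point to a small set of pivots, then use the
    triangle inequality at query time to skip distance computations."""

    def __init__(self, points, num_pivots, metric=dist):
        self.metric = metric
        self.points = points
        self.pivots = random.sample(points, num_pivots)
        # Precomputed table: pivot_dists[i][j] = d(points[i], pivots[j])
        self.pivot_dists = [[metric(p, v) for v in self.pivots] for p in points]

    def range_query(self, q, r):
        """Return all points within distance r of q, counting metric calls,
        which is the cost measure used in the paper's model."""
        q_dists = [self.metric(q, v) for v in self.pivots]
        calls = len(self.pivots)
        results = []
        for p, p_dists in zip(self.points, self.pivot_dists):
            # Triangle inequality: |d(q, v) - d(p, v)| <= d(q, p) for any
            # pivot v, so if the best lower bound already exceeds r, the
            # point p cannot be an answer and d(q, p) need not be computed.
            lower_bound = max(abs(qd - pd) for qd, pd in zip(q_dists, p_dists))
            if lower_bound > r:
                continue  # pruned without a distance computation
            calls += 1
            if self.metric(q, p) <= r:
                results.append(p)
        return results, calls
```

Under the abstract's cost model only `calls` matters. The paper's result says that when $(\Omega_d)$ exhibits concentration of measure and the number of pivots is $o(n/d)$, the lower bound in the loop almost never exceeds $r$ for large $d$, so nearly all $n$ candidates survive pruning and the query cost is asymptotically linear in $n$, i.e. no better than a brute-force scan.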
