Exploiting Modern Hardware for High-Dimensional Nearest Neighbor Search

(1712.02912)
Published Dec 8, 2017 in cs.CV, cs.DB, cs.IR, cs.MM, and cs.PF

Abstract

Many multimedia information retrieval and machine learning problems require efficient high-dimensional nearest neighbor search techniques. For instance, multimedia objects (images, music, or videos) can be represented by high-dimensional feature vectors, so finding two similar multimedia objects comes down to finding two objects with similar feature vectors. With the widespread use of social networks, large-scale multimedia databases and large-scale machine learning applications are increasingly common, calling for efficient nearest neighbor search approaches. This thesis builds on product quantization, an efficient nearest neighbor search technique that compresses high-dimensional vectors into short codes. This makes it possible to store very large databases entirely in RAM, enabling low response times. We propose several contributions that exploit the capabilities of modern CPUs, in particular SIMD and the cache hierarchy, to further reduce the response times of product quantization.
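
For context, here is a minimal sketch of the product quantization scheme the abstract builds on: vectors are split into subvectors, each subvector is replaced by the index of its nearest codebook centroid, and queries are answered with asymmetric distance computation (ADC) over the resulting short codes. The sketch assumes NumPy, SciPy, and scikit-learn are available; the random data and the parameter choices (m = 8 subvectors, k = 256 centroids per sub-quantizer) are illustrative defaults from the PQ literature, not values taken from this thesis.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
d, m, k = 128, 8, 256          # dimensionality, subvectors, centroids per sub-quantizer
sub_d = d // m                 # each 1-byte code indexes a centroid of a 16-dim subspace

# Illustrative random data; real use would train on representative feature vectors.
train = rng.standard_normal((5000, d)).astype(np.float32)
base = rng.standard_normal((20000, d)).astype(np.float32)
query = rng.standard_normal(d).astype(np.float32)

# Train one k-means codebook per subspace (256 centroids -> 1 byte per subvector).
codebooks = [
    KMeans(n_clusters=k, n_init=1, random_state=0)
    .fit(train[:, i * sub_d:(i + 1) * sub_d])
    .cluster_centers_
    for i in range(m)
]

# Encode the database: each 128-dim float vector becomes an 8-byte code.
codes = np.empty((base.shape[0], m), dtype=np.uint8)
for i, cb in enumerate(codebooks):
    sub = base[:, i * sub_d:(i + 1) * sub_d]
    codes[:, i] = cdist(sub, cb, metric="sqeuclidean").argmin(axis=1)

# Asymmetric distance computation (ADC): build per-subspace lookup tables of
# squared distances from the query subvector to every centroid, then score
# each database code by summing m table lookups.
tables = np.stack([
    cdist(query[None, i * sub_d:(i + 1) * sub_d], cb, metric="sqeuclidean")[0]
    for i, cb in enumerate(codebooks)
])                              # shape (m, k)
approx_dist = tables[np.arange(m), codes].sum(axis=1)
print("approximate nearest neighbor id:", approx_dist.argmin())
```

The final scan, one table lookup per subvector per database code, is the kind of hot path the abstract refers to when it mentions exploiting SIMD and the cache hierarchy to reduce response times.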
