Variable Skipping for Autoregressive Range Density Estimation (2007.05572v1)

Published 10 Jul 2020 in cs.LG, cs.DB, and stat.ML

Abstract: Deep autoregressive models compute point likelihood estimates of individual data points. However, many applications (e.g., database cardinality estimation) require estimating range densities, a capability that is under-explored in the current neural density estimation literature. In these applications, fast and accurate range density estimates over high-dimensional data directly impact user-perceived performance. In this paper, we explore a technique, variable skipping, for accelerating range density estimation over deep autoregressive models. This technique exploits the sparse structure of range density queries to avoid sampling unnecessary variables during approximate inference. We show that variable skipping provides 10-100$\times$ efficiency improvements when targeting challenging high-quantile error metrics, enables complex applications such as text pattern matching, and can be realized via a simple data augmentation procedure without changing the usual maximum likelihood objective.
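The core idea in the abstract can be sketched as follows. This is a minimal, self-contained illustration, not the paper's implementation: `cond_dist` is a hypothetical stand-in for a learned conditional $p(x_i \mid x_{<i})$, and `MASK` is an assumed sentinel for the special masked-input token the model would be trained to accept via data augmentation. The key point is the skipping branch: variables not constrained by the range query are fed `MASK` instead of being sampled during progressive sampling.

```python
import random

MASK = -1  # hypothetical sentinel for the learned [MASK] input token


def cond_dist(prefix, domain):
    """Toy stand-in for a learned conditional p(x_i | x_<i).

    A real autoregressive model would be a neural net trained with
    randomly masked inputs (the data augmentation the paper describes).
    """
    random.seed(hash(tuple(prefix)) % (2**32))  # deterministic toy distribution
    weights = [random.random() for _ in domain]
    total = sum(weights)
    return [w / total for w in weights]


def range_density(ranges, domain, n_samples=100):
    """Estimate P(x_0 in R_0, x_1 in R_1, ...) by progressive sampling.

    ranges[i] is a set of allowed values for x_i, or None if the query
    places no constraint on x_i -- in which case the variable is skipped.
    """
    total = 0.0
    for _ in range(n_samples):
        prefix, p = [], 1.0
        for rng in ranges:
            if rng is None:
                # Variable skipping: feed MASK instead of sampling x_i,
                # since the query does not constrain this variable.
                prefix.append(MASK)
                continue
            probs = cond_dist(prefix, domain)
            in_range = sum(probs[v] for v in rng)
            p *= in_range
            if in_range == 0:
                break
            # Sample x_i restricted to the range, renormalized.
            r = random.random() * in_range
            for v in rng:
                r -= probs[v]
                if r <= 0:
                    prefix.append(v)
                    break
            else:
                prefix.append(v)  # numerical-roundoff fallback
        total += p
    return total / n_samples
```

Without the `rng is None` branch, every unconstrained variable would have to be sampled (or marginalized) at each step; skipping them is what yields the speedups the abstract reports.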

Authors (6)
  1. Eric Liang (15 papers)
  2. Zongheng Yang (11 papers)
  3. Ion Stoica (177 papers)
  4. Pieter Abbeel (372 papers)
  5. Yan Duan (45 papers)
  6. Xi Chen (1036 papers)
Citations (4)
