Sparse Fourier Transform in Any Constant Dimension with Nearly-Optimal Sample Complexity in Sublinear Time (1604.00845v1)

Published 4 Apr 2016 in cs.DS

Abstract: We consider the problem of computing a $k$-sparse approximation to the Fourier transform of a length $N$ signal. Our main result is a randomized algorithm for computing such an approximation (i.e., achieving the $\ell_2/\ell_2$ sparse recovery guarantee using Fourier measurements) using $O_d(k\log N\log\log N)$ samples of the signal in the time domain that runs in time $O_d(k\log^{d+3} N)$, where $d\geq 1$ is the dimensionality of the Fourier transform. The sample complexity matches the lower bound of $\Omega(k\log (N/k))$ for non-adaptive algorithms due to \cite{DIPW} for any $k\leq N^{1-\delta}$ for a constant $\delta>0$, up to an $O(\log\log N)$ factor. Prior to our work, a result with comparable sample complexity $k\log N \log^{O(1)}\log N$ and sublinear runtime was known for the Fourier transform on the line \cite{IKP}, but for any dimension $d\geq 2$ previously known techniques either suffered from a polylogarithmic factor loss in sample complexity or required $\Omega(N)$ runtime.
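For context, the $\ell_2/\ell_2$ guarantee says the algorithm outputs an estimate $\hat{x}'$ with $\|\hat{x}-\hat{x}'\|_2 \leq C\cdot\min_{k\text{-sparse } y}\|\hat{x}-y\|_2$ for some constant $C$. The Python/NumPy sketch below is not the paper's algorithm; it is an illustrative baseline (the function name, example signal, and parameters are all assumptions for demonstration) that computes the right-hand benchmark by taking a full FFT and keeping the top $k$ coefficients. The paper's contribution is approaching this benchmark while reading only $O_d(k\log N\log\log N)$ time-domain samples and running in sublinear time, which this naive baseline does not do.

```python
import numpy as np

def best_k_sparse_fft(x, k):
    """Exact best k-sparse approximation of the Fourier transform of x.

    This is the N-sample, O(N log N)-time baseline that the l2/l2 guarantee
    is measured against: keep the k largest-magnitude Fourier coefficients
    and zero out the rest. (Illustrative sketch only; not the paper's
    sublinear-time, sublinear-sample algorithm.)
    """
    xhat = np.fft.fft(x)                      # full N-point DFT
    keep = np.argsort(np.abs(xhat))[-k:]      # indices of the k largest coefficients
    xhat_k = np.zeros_like(xhat)
    xhat_k[keep] = xhat[keep]
    return xhat_k

# Hypothetical example: a signal that is nearly k-sparse in frequency, plus noise.
N, k = 1024, 5
rng = np.random.default_rng(0)
freqs = rng.choice(N, size=k, replace=False)
x = np.zeros(N, dtype=complex)
for f in freqs:
    x += rng.normal(5.0, 1.0) * np.exp(2j * np.pi * f * np.arange(N) / N)
x += 0.1 * rng.normal(size=N)                 # additive noise

xhat = np.fft.fft(x)
xhat_k = best_k_sparse_fft(x, k)
# The l2/l2 guarantee bounds a recovery algorithm's error against this quantity:
best_k_term_error = np.linalg.norm(xhat - xhat_k)
print("best k-term residual:", best_k_term_error)
```

Running this on the example signal prints a small residual (the noise energy), which is the benchmark that a sparse recovery algorithm's error is compared to, up to a constant factor.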

Citations (44)

