
Tighter Fourier Transform Complexity Tradeoffs

(1404.1741)
Published Apr 7, 2014 in cs.CC

Abstract

The Fourier Transform is one of the most important linear transformations used in science and engineering. Cooley and Tukey's Fast Fourier Transform (FFT) from 1965 is a method for computing this transformation in time $O(n\log n)$. Achieving a matching lower bound in a reasonable computational model is one of the most important open problems in theoretical computer science. In 2014, improving on his previous work, Ailon showed that if an algorithm speeds up the FFT by a factor of $b=b(n)\geq 1$, then it must rely on computing, as an intermediate "bottleneck" step, a linear mapping of the input with condition number $\Omega(b(n))$. Our main result shows that a factor-$b$ speedup implies the existence of not just one but $\Omega(n)$ $b$-ill-conditioned bottlenecks occurring at $\Omega(n)$ different steps, each causing information from independent (orthogonal) components of the input to either overflow or underflow. This provides further evidence that beating the FFT is hard. Our result also gives the first quantitative tradeoff between computation speed and information loss in Fourier computation on fixed-word-size architectures. The main technical result is an entropy analysis of the Fourier transform under transformations of low trace, which is interesting in its own right.
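For readers unfamiliar with how the $O(n\log n)$ bound arises, the following is a minimal illustrative sketch of the radix-2 Cooley-Tukey recursion (not the paper's contribution, and simplified to power-of-two lengths): each level splits the problem into two half-size transforms plus $O(n)$ combining work, giving the recurrence $T(n) = 2T(n/2) + O(n)$.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT sketch; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return x[:]
    # Recurse on even- and odd-indexed subsequences: T(n) = 2T(n/2) + O(n).
    even = fft(x[0::2])
    odd = fft(x[1::2])
    # Combine with twiddle factors e^{-2*pi*i*k/n} (the O(n) merge step).
    result = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        result[k] = even[k] + t
        result[k + n // 2] = even[k] - t
    return result

# Example: the transform of a length-8 unit impulse is the all-ones vector.
print(fft([1, 0, 0, 0, 0, 0, 0, 0]))
```

The paper's lower-bound question asks whether any algorithm can beat this recursion's $n\log n$ operation count; its answer is that a factor-$b$ speedup forces many intermediate linear maps whose condition number (the ratio of largest to smallest singular value) grows as $\Omega(b(n))$, which on fixed-word-size hardware translates into overflow or underflow of independent input components.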

