
Distributed estimation through parallel approximants (2112.15572v2)

Published 31 Dec 2021 in stat.ME, cs.DC, math.ST, and stat.TH

Abstract: Designing scalable estimation algorithms is a core challenge in modern statistics. Here we introduce a framework to address this challenge based on parallel approximants, which yields estimators with provable properties that operate on the entirety of very large, distributed data sets. We first formalize the class of statistics which admit straightforward calculation in distributed environments through independent parallelization. We then show how to use such statistics to approximate arbitrary functional operators in appropriate spaces, yielding a general estimation framework that does not require data to reside entirely in memory. We characterize the $L^2$ approximation properties of our approach and provide fully implemented examples of sample quantile calculation and local polynomial regression in a distributed computing environment. A variety of avenues and extensions remain open for future work.
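To make the abstract's core idea concrete, the sketch below illustrates the general pattern of per-chunk statistics that are computed independently and then combined to approximate a functional of the data, here a sample quantile viewed as a functional of the CDF. This is an illustration of that pattern only, not the paper's estimator; the grid-based CDF approximation and the helper names (`chunk_cdf`, `combined_quantile`) are assumptions made for this example.

```python
import numpy as np

def chunk_cdf(chunk, grid):
    # Per-chunk statistic: empirical CDF evaluated on a fixed grid.
    # Grid-point averages like this can be computed on each chunk independently.
    return np.array([(chunk <= g).mean() for g in grid]), len(chunk)

def combined_quantile(chunk_stats, grid, q):
    # Merge per-chunk CDF values (weighted by chunk size), then invert the
    # combined CDF on the grid to approximate the q-th quantile.
    total = sum(n for _, n in chunk_stats)
    cdf = sum(n * f for f, n in chunk_stats) / total
    idx = min(int(np.searchsorted(cdf, q)), len(grid) - 1)
    return grid[idx]

# Usage: three data chunks that never need to reside in memory together.
rng = np.random.default_rng(0)
chunks = [rng.normal(size=100_000) for _ in range(3)]
grid = np.linspace(-4.0, 4.0, 401)             # evaluation grid (a modeling choice)
stats = [chunk_cdf(c, grid) for c in chunks]   # embarrassingly parallel step
print(combined_quantile(stats, grid, 0.5))     # approximate median, close to 0.0
```

Each chunk contributes only its grid-evaluated empirical CDF and its size, so the combination step touches a small summary per chunk rather than the raw data, in the spirit of estimators that do not require the full data set to reside in memory.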
