Minimax Bounds for Distributed Logistic Regression

(1910.01625)
Published Oct 3, 2019 in cs.IT, math.IT, math.ST, and stat.TH

Abstract

We consider a distributed logistic regression problem where labeled data pairs $(X_i, Y_i) \in \mathbb{R}^d \times \{-1, 1\}$ for $i = 1, \ldots, n$ are distributed across multiple machines in a network and must be communicated to a centralized estimator using at most $k$ bits per labeled pair. We assume that the data $X_i$ come independently from some distribution $P_X$, and that the distribution of $Y_i$ conditioned on $X_i$ follows a logistic model with some parameter $\theta \in \mathbb{R}^d$. By using a Fisher information argument, we give minimax lower bounds for estimating $\theta$ under different assumptions on the tail of the distribution $P_X$. We consider both $\ell_2$ and logistic losses, and show that for the logistic loss our sub-Gaussian lower bound is order-optimal and cannot be improved.
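As a concrete illustration of the setup, the sketch below simulates labeled pairs from the logistic model, $\Pr(Y_i = y \mid X_i = x) = 1/(1 + e^{-y\,\theta^\top x})$ for $y \in \{-1, 1\}$, and compresses each pair to $k$ bits before it reaches the estimator. The standard Gaussian choice of $P_X$ (one sub-Gaussian example) and the sign-based $k = d + 1$ bit encoder are illustrative assumptions, not the paper's construction; they only make the communication constraint tangible.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 10_000
theta = np.array([1.0, -0.5, 0.25])  # unknown logistic parameter (illustrative)

# X_i drawn i.i.d. from a sub-Gaussian P_X; a standard Gaussian is one such choice.
X = rng.standard_normal((n, d))

# Labels in {-1, +1} from the logistic model:
#   P(Y_i = 1 | X_i = x) = 1 / (1 + exp(-theta . x))
p_plus = 1.0 / (1.0 + np.exp(-(X @ theta)))
Y = np.where(rng.random(n) < p_plus, 1, -1)

# Hypothetical k = d + 1 bit encoder (NOT the paper's scheme): each machine
# forwards the sign pattern of X_i plus the label Y_i. The centralized
# estimator sees only these k bits, never the raw pair (X_i, Y_i).
def encode(x, y, k=d + 1):
    bits = np.concatenate([(x > 0).astype(np.uint8), [y > 0]]).astype(np.uint8)
    assert bits.size == k
    return bits

messages = np.array([encode(x, y) for x, y in zip(X, Y)])
print(messages.shape)  # (n, d + 1): k bits communicated per labeled pair
```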
