
Differentially Private Nonparametric Regression Under a Growth Condition

(2111.12786)
Published Nov 24, 2021 in cs.LG, cs.CR, and stat.ML

Abstract

Given a real-valued hypothesis class $\mathcal{H}$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $\mathcal{H}$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $\mathcal{H}$ is necessary for private learnability. Here online learnability of $\mathcal{H}$ is characterized by the finiteness of its $\eta$-sequential fat shattering dimension, ${\rm sfat}_\eta(\mathcal{H})$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $\mathcal{H}$ is privately learnable if $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H})$ is finite, which is a fairly restrictive condition. We show that under the relaxed condition $\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$, $\mathcal{H}$ is privately learnable, establishing the first nonparametric private learnability guarantee for classes $\mathcal{H}$ with ${\rm sfat}_\eta(\mathcal{H})$ diverging as $\eta \downarrow 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
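
As a worked illustration of the gap between the two sufficient conditions (this example is ours, not taken from the paper): consider a hypothetical class $\mathcal{H}$ whose sequential fat-shattering dimension grows logarithmically, ${\rm sfat}_\eta(\mathcal{H}) = \Theta(\log(1/\eta))$. Then $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) = \infty$, so the earlier sufficient condition of Jung et al. (2020) does not apply; yet $\eta \cdot {\rm sfat}_\eta(\mathcal{H}) = \Theta(\eta \log(1/\eta)) \to 0$ as $\eta \downarrow 0$, so the relaxed growth condition $\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$ holds and private learnability would follow from the paper's result.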
