Differentially Private Nonparametric Regression Under a Growth Condition (2111.12786v1)
Abstract: Given a real-valued hypothesis class $\mathcal{H}$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $\mathcal{H}$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $\mathcal{H}$ is necessary for private learnability. Here online learnability of $\mathcal{H}$ is characterized by the finiteness of its $\eta$-sequential fat shattering dimension, ${\rm sfat}_\eta(\mathcal{H})$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $\mathcal{H}$ is privately learnable if $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H})$ is finite, which is a fairly restrictive condition. We show that under the relaxed condition $\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$, $\mathcal{H}$ is privately learnable, establishing the first nonparametric private learnability guarantee for classes $\mathcal{H}$ with ${\rm sfat}_\eta(\mathcal{H})$ diverging as $\eta \downarrow 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
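To see concretely how the relaxed condition broadens the earlier one, consider a hypothetical class (an illustration, not one analyzed in the paper) whose sequential fat-shattering dimension diverges only logarithmically, say ${\rm sfat}_\eta(\mathcal{H}) = \Theta(\log(1/\eta))$. Then

$$\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) = \infty, \qquad \text{yet} \qquad \liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = \lim_{\eta \downarrow 0} \Theta\bigl(\eta \log(1/\eta)\bigr) = 0,$$

so the sufficient condition of Jung et al. (2020) fails for such a class, while the growth condition above still yields private learnability.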