
Differentially Private Generalized Linear Models Revisited (2205.03014v2)

Published 6 May 2022 in cs.LG and stat.ML

Abstract: We study the problem of $(\epsilon,\delta)$-differentially private learning of linear predictors with convex losses. We provide results for two subclasses of loss functions. The first case is when the loss is smooth and non-negative but not necessarily Lipschitz (such as the squared loss). For this case, we establish an upper bound on the excess population risk of $\tilde{O}\left(\frac{\Vert w^*\Vert}{\sqrt{n}} + \min\left\{\frac{\Vert w^*\Vert^2}{(n\epsilon)^{2/3}},\frac{\sqrt{d}\Vert w^*\Vert^2}{n\epsilon}\right\}\right)$, where $n$ is the number of samples, $d$ is the dimension of the problem, and $w^*$ is the minimizer of the population risk. Apart from the dependence on $\Vert w^\ast\Vert$, our bound is essentially tight in all parameters. In particular, we show a lower bound of $\tilde{\Omega}\left(\frac{1}{\sqrt{n}} + \min\left\{\frac{\Vert w^*\Vert^{4/3}}{(n\epsilon)^{2/3}}, \frac{\sqrt{d}\Vert w^*\Vert}{n\epsilon}\right\}\right)$. We also revisit the previously studied case of Lipschitz losses [SSTT20]. For this case, we close the gap in the existing work and show that the optimal rate is (up to log factors) $\Theta\left(\frac{\Vert w^*\Vert}{\sqrt{n}} + \min\left\{\frac{\Vert w^*\Vert}{\sqrt{n\epsilon}},\frac{\sqrt{\text{rank}}\Vert w^*\Vert}{n\epsilon}\right\}\right)$, where $\text{rank}$ is the rank of the design matrix. This improves over existing work in the high privacy regime. Finally, our algorithms involve a private model selection approach that we develop to enable attaining the stated rates without a-priori knowledge of $\Vert w^*\Vert$.
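
To make the setting concrete, the sketch below shows a generic $(\epsilon,\delta)$-DP noisy-SGD pass for a linear predictor with squared loss. It is an illustration of the problem the paper studies, not the paper's own algorithm: the function name `dp_linear_sgd`, the clipping and batching choices, and all parameter defaults are assumptions made for this example, and the paper's algorithms (including the private model selection step that avoids a-priori knowledge of $\Vert w^*\Vert$) are more involved.

```python
# Illustrative sketch only (not the paper's algorithm): a single-epoch,
# batched noisy SGD for a linear predictor with squared loss. Per-example
# gradients are clipped to norm `clip`, and Gaussian noise calibrated by the
# standard Gaussian mechanism is added to each batch-averaged gradient.
# Since every example is used in exactly one batch, the full pass is
# (eps, delta)-DP under replace-one neighboring by parallel composition.
import numpy as np

def dp_linear_sgd(X, y, eps, delta, clip=1.0, batch_size=64, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    # L2 sensitivity of a batch-averaged clipped gradient under replace-one
    # neighboring is 2*clip/batch_size; sigma follows the Gaussian mechanism.
    sigma = (2 * clip / batch_size) * np.sqrt(2 * np.log(1.25 / delta)) / eps
    perm = rng.permutation(n)
    for start in range(0, n - batch_size + 1, batch_size):
        idx = perm[start:start + batch_size]
        # Per-example gradients of the squared loss 0.5 * (x.w - y)^2.
        residuals = X[idx] @ w - y[idx]
        grads = residuals[:, None] * X[idx]                # shape (B, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        noisy_grad = grads.mean(axis=0) + rng.normal(0.0, sigma, size=d)
        w -= lr * noisy_grad
    return w

# Example usage (hypothetical data): w_hat = dp_linear_sgd(X_train, y_train, eps=1.0, delta=1e-5)
```

Note that this generic approach still requires fixing the clipping level in advance, which is exactly the kind of dependence on the scale of $w^*$ that the paper's private model selection procedure is designed to remove.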

Citations (14)

