04/08/2021

Differentially Private Nonparametric Regression Under a Growth Condition

Noah Golowich

Abstract: Given a real-valued hypothesis class $H$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $H$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $H$ is necessary for private learnability. Here online learnability of $H$ is characterized by the finiteness of its $\eta$-sequential fat-shattering dimension, $\mathrm{sfat}_\eta(H)$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $H$ is privately learnable if $\lim_{\eta \to 0} \mathrm{sfat}_\eta(H)$ is finite, which is a fairly restrictive condition. We show that under the relaxed condition $\liminf_{\eta \to 0} \eta \cdot \mathrm{sfat}_\eta(H) = 0$, $H$ is privately learnable, thus establishing the first nonparametric private learnability guarantee for classes $H$ with $\mathrm{sfat}_\eta(H)$ diverging as $\eta \to 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
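To place the contribution, the three conditions from the abstract can be set side by side; the display below is a restatement, and the closing example is illustrative rather than taken from the paper:

$$\underbrace{\mathrm{sfat}_\eta(H) < \infty \;\; \forall \eta > 0}_{\text{necessary (Jung et al., 2020)}} \qquad \underbrace{\lim_{\eta \to 0} \mathrm{sfat}_\eta(H) < \infty}_{\text{sufficient (Jung et al., 2020)}} \qquad \underbrace{\liminf_{\eta \to 0} \eta \cdot \mathrm{sfat}_\eta(H) = 0}_{\text{sufficient (this work)}}$$

For instance, a hypothetical class with $\mathrm{sfat}_\eta(H) = \Theta(\log(1/\eta))$ has a dimension that diverges as $\eta \to 0$, yet satisfies the relaxed condition, since $\eta \log(1/\eta) \to 0$ as $\eta \to 0$.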

The talk and the respective paper were published at the COLT 2021 virtual conference.
