Sample Complexity Bounds on Differentially Private Learning via Communication Complexity

25 Feb 2014 · Vitaly Feldman, David Xiao

In this work we analyze the sample complexity of classification by differentially private algorithms. Differential privacy is a strong and well-studied notion of privacy, introduced by Dwork et al. (2006), which ensures that the output of an algorithm leaks little information about the data point provided by any single participating individual. The sample complexity of private PAC and agnostic learning has been studied in a line of prior works starting with Kasiviswanathan et al. (2008), but several basic questions remain open, most notably whether learning with privacy requires more samples than learning without privacy.

We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint, or of learning with approximate differential privacy. Our second contribution, and our main tool, is an equivalence between the sample complexity of (pure) differentially private learning of a concept class $C$, denoted $SCDP(C)$, and the randomized one-way communication complexity of the evaluation problem for concepts from $C$. Using this equivalence we prove the following bounds:

1. $SCDP(C) = \Omega(LDim(C))$, where $LDim(C)$ is Littlestone's (1987) dimension, which characterizes the number of mistakes in the online mistake-bound learning model. Known bounds on $LDim(C)$ then imply that $SCDP(C)$ can be much higher than the VC dimension of $C$.

2. For any $t$, there exists a class $C$ such that $LDim(C) = 2$ but $SCDP(C) \geq t$.

3. For any $t$, there exists a class $C$ such that the sample complexity of (pure) $\alpha$-differentially private PAC learning is $\Omega(t/\alpha)$ but the sample complexity of the relaxed $(\alpha,\beta)$-differentially private PAC learning is $O(\log(1/\beta)/\alpha)$. This resolves an open problem of Beimel et al. (2013b).
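For reference, the two privacy notions compared in result 3 can be stated in the standard form due to Dwork et al. (2006); this paraphrase is not taken verbatim from the paper. A randomized algorithm $M$ operating on datasets is $\alpha$-differentially private (pure differential privacy) if for every pair of datasets $S, S'$ differing in a single individual's point and every set of outcomes $T$,
$$\Pr[M(S) \in T] \leq e^{\alpha} \cdot \Pr[M(S') \in T].$$
The relaxed $(\alpha,\beta)$-differential privacy (approximate differential privacy) requires only
$$\Pr[M(S) \in T] \leq e^{\alpha} \cdot \Pr[M(S') \in T] + \beta,$$
so pure differential privacy is the special case $\beta = 0$. Result 3 shows that allowing even a small $\beta > 0$ can reduce the sample complexity of private PAC learning from $\Omega(t/\alpha)$, for arbitrarily large $t$, to $O(\log(1/\beta)/\alpha)$.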
