no code implementations • 2 May 2024 • Wei-Ning Chen, Berivan Isik, Peter Kairouz, Albert No, Sewoong Oh, Zheng Xu
We study $L_2$ mean estimation under central differential privacy and communication constraints, and address two key challenges: firstly, existing mean estimation schemes that simultaneously handle both constraints are usually optimized for $L_\infty$ geometry and rely on random rotation or Kashin's representation to adapt to $L_2$ geometry, resulting in suboptimal leading constants in mean square errors (MSEs); secondly, schemes achieving order-optimal communication-privacy trade-offs do not extend seamlessly to streaming differential privacy (DP) settings (e.g., tree aggregation or matrix factorization), rendering them incompatible with DP-FTRL-type optimizers.
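The random-rotation preprocessing mentioned in the abstract is the standard way an $L_\infty$-optimized mechanism is adapted to $L_2$ geometry: a random orthogonal transform spreads the energy of an $L_2$-bounded vector roughly evenly across coordinates. Below is a minimal sketch of that preprocessing step only (not the paper's proposed scheme); the dense rotation and function names are illustrative assumptions.

```python
import numpy as np

# A minimal sketch (not the paper's scheme) of random-rotation preprocessing:
# an L_inf-optimized mechanism is adapted to L_2 geometry by first rotating
# the input so its energy spreads roughly evenly across coordinates.
def random_rotation_flatten(x, rng):
    d = x.shape[0]
    # Dense random orthogonal matrix; in practice a randomized Hadamard
    # transform is used for O(d log d) cost. This version is only illustrative.
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    y = q @ x
    return y, q  # the server applies q.T after aggregation to undo the rotation

rng = np.random.default_rng(0)
d = 1024
x = np.zeros(d)
x[0] = 1.0                        # L_2-bounded but worst case for L_inf
y, q = random_rotation_flatten(x, rng)
print(np.abs(y).max())            # roughly sqrt(2 log d / d), far below 1
print(np.allclose(q.T @ y, x))    # the rotation is exactly invertible
```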
no code implementations • 15 Jun 2023 • Daria Reshetova, Wei-Ning Chen, Ayfer Özgür
Local differential privacy is a powerful method for privacy-preserving data collection.
no code implementations • 9 Jul 2022 • Wei-Ning Chen, Ayfer Özgür, Peter Kairouz
Unlike previous discrete DP schemes based on additive noise, our mechanism encodes local information into a parameter of the binomial distribution, and hence the output distribution is discrete with bounded support.
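A minimal sketch of the binomial-parameter encoding idea described above: each client maps a bounded scalar into the success probability of a binomial, so every report, and hence the aggregate, is an integer with bounded support. The constants and function names below are illustrative assumptions; this is not the paper's exact mechanism and performs no privacy calibration.

```python
import numpy as np

# Illustrative sketch only: encode a bounded scalar x in [-1, 1] into the
# success probability of a Binomial(m, p), so every report is an integer in
# {0, ..., m} and the aggregate has bounded support.  Parameters are
# arbitrary; no privacy calibration is attempted.
def encode(x, m, theta, rng):
    p = 0.5 + theta * x           # theta < 0.5 keeps p inside (0, 1)
    return rng.binomial(m, p)

def estimate_mean(reports, n, m, theta):
    # Unbiased estimate of mean(x) from the sum of the integer reports.
    return (sum(reports) / (n * m) - 0.5) / theta

rng = np.random.default_rng(1)
n, m, theta = 10_000, 64, 0.25
xs = rng.uniform(-1.0, 1.0, size=n)
reports = [encode(x, m, theta, rng) for x in xs]
print(xs.mean(), estimate_mean(reports, n, m, theta))  # close to each other
```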
no code implementations • 7 Mar 2022 • Wei-Ning Chen, Christopher A. Choquette-Choo, Peter Kairouz, Ananda Theertha Suresh
We consider the problem of training a $d$ dimensional model with distributed differential privacy (DP) where secure aggregation (SecAgg) is used to ensure that the server only sees the noisy sum of $n$ model updates in every training round.
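A toy illustration of the setting described above (not the SecAgg protocol itself): pairwise random pads cancel in the modular sum, so the server can recover the sum of quantized updates without seeing any individual update. All names and parameters here are hypothetical.

```python
import numpy as np

# Toy sketch of the "server only sees the sum" property: each client masks
# its quantized update with pairwise pads that cancel in the modular sum.
def pairwise_masks(n, d, modulus, rng):
    masks = [np.zeros(d, dtype=np.int64) for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            pad = rng.integers(0, modulus, size=d)
            masks[i] = (masks[i] + pad) % modulus
            masks[j] = (masks[j] - pad) % modulus
    return masks

rng = np.random.default_rng(2)
n, d, modulus = 5, 8, 2**20
updates = [rng.integers(0, 16, size=d) for _ in range(n)]       # quantized model updates
masks = pairwise_masks(n, d, modulus, rng)
masked = [(u + m) % modulus for u, m in zip(updates, masks)]    # what the server receives
server_sum = np.sum(masked, axis=0) % modulus                   # pads cancel in the sum
print(np.array_equal(server_sum, np.sum(updates, axis=0) % modulus))
```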
no code implementations • 29 Oct 2021 • Abhin Shah, Wei-Ning Chen, Johannes Ballé, Peter Kairouz, Lucas Theis

Compressing the output of $\epsilon$-locally differentially private (LDP) randomizers naively leads to suboptimal utility.
no code implementations • 16 Jun 2021 • Wei-Ning Chen, Peter Kairouz, Ayfer Özgür
For the interactive setting, we propose a novel tree-based estimation scheme and show that the minimum sample-size needed to achieve dimension-free convergence can be further reduced to $n^*(s, d, b) = \tilde{O}\left( {s^2\log^2 d}/{2^b} \right)$.
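For a quick sense of scale, the quoted bound $n^*(s, d, b) = \tilde{O}\left( {s^2\log^2 d}/{2^b} \right)$ can be evaluated numerically, ignoring the hidden polylogarithmic factors (the concrete values of $s$, $d$, and $b$ below are arbitrary):

```python
import math

# Numerical read of the sample-complexity bound n*(s, d, b) = O~(s^2 log^2 d / 2^b),
# dropping the hidden polylog factors.
def sample_size(s, d, b):
    return s**2 * math.log(d)**2 / 2**b

for b in (1, 4, 8):
    print(b, sample_size(s=10, d=10_000, b=b))  # each extra bit halves the requirement
```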
no code implementations • NeurIPS 2020 • Wei-Ning Chen, Peter Kairouz, Ayfer Özgür
In particular, we consider the problems of mean estimation and frequency estimation under $\varepsilon$-local differential privacy and $b$-bit communication constraints.
no code implementations • 21 May 2020 • Leighton Pate Barnes, Wei-Ning Chen, Ayfer Özgür
We develop data processing inequalities that describe how Fisher information from statistical samples can scale with the privacy parameter $\varepsilon$ under local differential privacy constraints.
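As an illustrative special case (not the paper's general bound), the Fisher information retained by binary randomized response, the canonical $\varepsilon$-LDP mechanism, can be computed directly and exhibits the $\varepsilon^2$ scaling in the small-$\varepsilon$ regime:

```python
import numpy as np

# Illustrative calculation (not the paper's general data processing inequality):
# Fisher information about theta = P(X = 1) after binary randomized response,
# evaluated at theta = 1/2.  For small eps it behaves like eps^2.
def fisher_info_rr(eps, theta=0.5):
    q = np.exp(eps) / (1.0 + np.exp(eps))      # P(report the true bit)
    r = theta * q + (1.0 - theta) * (1.0 - q)  # P(Y = 1)
    dr = 2.0 * q - 1.0                         # dr/dtheta
    return dr**2 / (r * (1.0 - r))

for eps in (0.1, 0.5, 1.0, 2.0):
    print(eps, fisher_info_rr(eps), eps**2)    # small-eps regime: ~ eps^2
```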