Abstract
We introduce three novel differentially private algorithms that approximate the 2nd-moment matrix of the data. These algorithms, which in contrast to existing algorithms always output positive-definite matrices, correspond to existing techniques in the linear regression literature; thus they have an immediate interpretation, and all known results about those techniques apply straightforwardly to the algorithms' outputs. More specifically, we discuss the following three techniques. (i) For Ridge Regression, we propose setting the regularization coefficient so that approximating the solution via a Johnson-Lindenstrauss transform preserves privacy. (ii) We show that adding a batch of d + O(ε⁻²) random samples to our data preserves differential privacy. (iii) We show that sampling the 2nd-moment matrix from a Bayesian posterior inverse-Wishart distribution is differentially private. We also give utility bounds for our algorithms and compare them with the existing “Analyze Gauss” algorithm of Dwork et al. (2014).
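Technique (ii) above can be sketched in a few lines of numpy. This is a hedged illustration only, not the paper's algorithm: the batch size d + ⌈1/ε²⌉ and the standard-Gaussian distribution of the padding rows are placeholder assumptions; the paper's calibrated constants and sampling distribution are not reproduced here.

```python
import numpy as np

# Sketch of technique (ii): pad the data with a batch of random rows so
# the resulting 2nd-moment matrix is positive-definite. The batch size
# d + ceil(1/eps^2) and the standard-Gaussian rows are illustrative
# placeholders, NOT the paper's calibrated construction.
rng = np.random.default_rng(0)
n, d, eps = 1000, 5, 1.0
X = rng.normal(size=(n, d))          # private data matrix

k = d + int(np.ceil(1.0 / eps**2))   # d + O(eps^-2) extra rows (assumed form)
Z = rng.normal(size=(k, d))          # random padding rows

A_priv = X.T @ X + Z.T @ Z           # approximate 2nd-moment matrix
# Z.T @ Z is positive-definite with probability 1 when k >= d,
# so A_priv is positive-definite as well.
```

Because the output is positive-definite, it can be fed directly into downstream estimators (e.g. a ridge-regression solve) without the post-hoc projection that indefinite noisy estimates require.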
| Original language | English |
|---|---|
| Pages (from-to) | 789-827 |
| Number of pages | 39 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 98 |
| State | Published - 2019 |
| Externally published | Yes |
| Event | 30th International Conference on Algorithmic Learning Theory, ALT 2019 - Chicago, United States |
| Event duration | 22 Mar 2019 → 24 Mar 2019 |
Bibliographical note
Publisher Copyright: © 2019 Proceedings of Machine Learning Research. All rights reserved.
Funding
∗ We gratefully acknowledge the Natural Sciences and Engineering Research Council of Canada (NSERC) for supporting O.S. with grant #2017–06701; O.S. is also an unpaid collaborator on NSF grant #1565387.
| Funders | Funder number |
|---|---|
| National Science Foundation | 1565387 |
| Natural Sciences and Engineering Research Council of Canada | 2017–06701 |
Keywords
- Differential Privacy
- Linear Regression
- Second-Moment Matrix
- Wishart Distribution