Learning the parameters of Gaussian mixture models is a fundamental and widely studied problem with numerous applications. In this work, we give new algorithms for learning the parameters of a high-dimensional, well-separated Gaussian mixture model subject to the strong constraint of differential privacy. In particular, we give a differentially private analogue of the algorithm of Achlioptas and McSherry (COLT 2005). Our algorithm has two key properties not achieved by prior work: (1) the algorithm's sample complexity matches that of the corresponding non-private algorithm up to lower-order terms in a wide range of parameters; (2) the algorithm requires only very weak a priori bounds on the parameters of the mixture.
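To make the privacy constraint concrete, the following is a minimal sketch (not the paper's algorithm) of the standard Gaussian mechanism applied to per-cluster mean estimation on a well-separated two-component mixture. All names, the clipping radius `R`, and the thresholding rule for cluster assignment are our own illustrative assumptions; the separation here is deliberately huge so that clustering is trivial.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: two well-separated spherical Gaussians in d dimensions.
d, n = 5, 10_000
mu0, mu1 = np.zeros(d), np.full(d, 10.0)
labels = rng.integers(0, 2, size=n)
X = np.where(labels[:, None] == 0,
             rng.normal(mu0, 1.0, (n, d)),
             rng.normal(mu1, 1.0, (n, d)))

# With this much separation, a crude threshold on the coordinate sum
# recovers the clusters (a stand-in for a real clustering step).
guess = (X.sum(axis=1) > d * 5.0).astype(int)

def private_mean(points, eps, delta, R):
    """(eps, delta)-DP mean via clipping + Gaussian mechanism.

    Clipping each point to L2 norm R bounds the sensitivity of the
    mean (with fixed count m) by 2R/m; we add Gaussian noise with the
    standard calibration sigma = (2R/m) * sqrt(2 ln(1.25/delta)) / eps.
    """
    m = len(points)
    norms = np.linalg.norm(points, axis=1)
    clipped = points * np.minimum(1.0, R / norms)[:, None]
    sigma = (2.0 * R / m) * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return clipped.mean(axis=0) + rng.normal(0.0, sigma, size=points.shape[1])

priv_mu0 = private_mean(X[guess == 0], eps=1.0, delta=1e-6, R=30.0)
priv_mu1 = private_mean(X[guess == 1], eps=1.0, delta=1e-6, R=30.0)
```

With n = 10,000 samples the injected noise is tiny and both private estimates land close to the true means; the interesting regime studied in the paper is precisely when one cannot afford such generous separation, sample size, or a priori bounds on the parameters.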
Journal: Advances in Neural Information Processing Systems
State: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: 8 Dec 2019 → 14 Dec 2019
Bibliographical note (Funding Information):
Part of this work was done while the authors were visiting the Simons Institute for the Theory of Computing. Parts of this work were done while GK was supported as a Microsoft Research Fellow, as part of the Simons-Berkeley Research Fellowship program, while visiting Microsoft Research, Redmond, and while supported by a University of Waterloo startup grant. This work was done while OS was affiliated with the University of Alberta. OS gratefully acknowledges the Natural Sciences and Engineering Research Council of Canada (NSERC) for its support through grant #2017-06701. JU and VS were supported by NSF grants CCF-1718088, CCF-1750640, and CNS-1816028.
© 2019 Neural Information Processing Systems Foundation. All rights reserved.