Linear-regression on packed encrypted data in the two-server model

Adi Akavia, Hayim Shaul, Mor Weiss, Zohar Yakhini

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Scopus citations

Abstract

Developing machine learning models from federated training data, containing many independent samples, is an important task that can significantly enhance the potential applicability and prediction power of learned models. Since single users, like hospitals or individual labs, typically collect datasets that do not support accurate learning with high confidence, it is desirable to combine data from several users without compromising data privacy. In this paper, we develop a privacy-preserving solution for learning a linear regression model from data collectively contributed by several parties (“data owners”). Our protocol is based on the protocol of Giacomelli et al. (ACNS 2018), which utilized two non-colluding servers and Linearly Homomorphic Encryption (LHE) to learn regularized linear regression models. Our methods use a different LHE scheme that allows us to significantly reduce both the number and runtime of homomorphic operations, as well as the total runtime complexity. Another advantage of our protocol is that the underlying LHE scheme is based on a different (and post-quantum secure) security assumption than that of Giacomelli et al. Our approach leverages the Chinese Remainder Theorem and Single Instruction Multiple Data (SIMD) representations to obtain our improved performance. For a 1000 × 40 linear regression task we can learn a model in a total of 3 seconds for the homomorphic operations, compared to more than 100 seconds reported in the literature. Our approach also scales up to larger feature spaces: we implemented a system that can handle a 1000 × 100 linear regression task, investing minutes of server computing time after a more significant offline pre-processing by the data owners. We intend to incorporate our protocol and implementations into a comprehensive system that can handle secure federated learning at larger scales.
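The CRT-based SIMD packing mentioned in the abstract can be illustrated in plaintext (a minimal sketch only; the moduli below are hypothetical and the paper's actual LHE scheme, parameters, and encoding are not reproduced here). The idea is that pairwise-coprime moduli define "slots": by the Chinese Remainder Theorem, a single integer mod M = m₁·m₂·… encodes one value per slot, and arithmetic mod M acts component-wise on all slots at once.

```python
# Illustrative CRT/SIMD packing sketch (assumed moduli, not from the paper).
from math import prod

MODULI = [257, 263, 269, 271]  # hypothetical pairwise-coprime slot moduli
M = prod(MODULI)

def pack(values):
    """Encode one value per slot into a single integer mod M via CRT."""
    x = 0
    for m_i, v in zip(MODULI, values):
        M_i = M // m_i
        inv = pow(M_i, -1, m_i)  # inverse of M_i modulo m_i
        x = (x + v * M_i * inv) % M
    return x

def unpack(x):
    """Recover the per-slot values by reducing modulo each slot modulus."""
    return [x % m_i for m_i in MODULI]

# One packed addition or multiplication acts on every slot simultaneously,
# as long as slot values stay below their moduli (no wrap-around).
a = pack([1, 2, 3, 4])
b = pack([10, 20, 30, 40])
print(unpack((a + b) % M))  # → [11, 22, 33, 44]
print(unpack((a * b) % M))  # → [10, 40, 90, 160]
```

In an LHE setting the packed integer would be encrypted, so a single homomorphic operation processes all slots at once; this batching is one source of the reduced operation counts reported above.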

Original language: English
Title of host publication: WAHC 2019 - Proceedings of the 7th ACM Workshop on Encrypted Computing and Applied Homomorphic Cryptography
Publisher: Association for Computing Machinery
Pages: 21-32
Number of pages: 12
ISBN (Electronic): 9781450368292
DOIs
State: Published - 11 Nov 2019
Externally published: Yes
Event: 7th Workshop on Encrypted Computing and Applied Homomorphic Cryptography, WAHC 2019, co-located with the 26th ACM Conference on Computer and Communications Security, CCS 2019 - London, United Kingdom
Duration: 11 Nov 2019 → …

Publication series

Name: Proceedings of the ACM Conference on Computer and Communications Security
ISSN (Print): 1543-7221

Conference

Conference: 7th Workshop on Encrypted Computing and Applied Homomorphic Cryptography, WAHC 2019, co-located with the 26th ACM Conference on Computer and Communications Security, CCS 2019
Country/Territory: United Kingdom
City: London
Period: 11/11/19 → …

Bibliographical note

Publisher Copyright:
© 2019 Copyright held by the owner/author(s).

