Privacy Games

Yiling Chen, Or Sheffet, Salil Vadhan

Research output: Contribution to journal › Article › peer-review


Abstract

The problem of analyzing the effect of privacy concerns on the behavior of selfish utility-maximizing agents has received much attention lately. Privacy concerns are often modeled by altering the utility functions of agents to also account for their privacy loss [4, 14, 20, 28]. Such privacy-aware agents prefer to play a randomized strategy even in very simple games in which non-privacy-aware agents play pure strategies. In some cases, the behavior of privacy-aware agents follows the framework of Randomized Response, a well-known mechanism that preserves differential privacy. Our work aims at better understanding the behavior of agents in settings where their privacy concerns are explicitly given. We consider a toy setting where agent A, in an attempt to discover the secret type of agent B, offers B a gift that one type of B agent likes and the other type dislikes. In contrast to previous works, B's incentive to keep her type secret is not the result of "hardwiring" B's utility function to consider privacy, but rather takes the form of a payment between B and A. We investigate three different types of payment functions and analyze B's behavior in each of the resulting games. As we show, under some payments, B's behavior differs markedly from the behavior of agents with hardwired privacy concerns and might even be deterministic. Under a different payment, we show that B's Bayes-Nash equilibrium (BNE) strategy does fall into the framework of Randomized Response.
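The abstract refers to Randomized Response, the classic differentially private mechanism for reporting a sensitive binary attribute. As background (this is the standard textbook mechanism, not the paper's payment-based games), a minimal sketch in Python could look like the following, where `epsilon` is the differential-privacy parameter:

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Report true_bit with probability e^eps / (1 + e^eps), otherwise flip it.

    This binary mechanism satisfies epsilon-differential privacy: for either
    output, the ratio of its probability under the two possible inputs is at
    most e^eps.
    """
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if random.random() < p_truth else not true_bit

# With epsilon = ln(3), the agent tells the truth with probability 3/4,
# so aggregate statistics remain estimable while any single report is deniable.
```

An analyst observing many such reports can debias the empirical frequency to estimate the true population fraction, which is why a privacy-aware agent randomizing in this way still leaks a controlled amount of information.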

Original language: English
Article number: 9
Journal: ACM Transactions on Economics and Computation
Volume: 8
Issue number: 2
DOI: 10.1145/3381533
State: Published - 1 May 2020

Bibliographical note

Publisher Copyright:
© 2020 ACM.

Funding

Previous versions of this article appear in the Proceedings of the 10th International Conference on Web and Internet Economics (WINE 2014), Beijing, China, and on arXiv:1410.1920 [cs.GT]. Y. Chen was supported in part by NSF grant CCF-1301976. The bulk of the work was done when the author was a postdoctoral fellow at Harvard University, supported in part by NSF grant CNS-1237235. S. Vadhan was supported by NSF grant CNS-1237235, a gift from Google, Inc., and a Simons Investigator grant. Authors' addresses: Y. Chen and S. Vadhan, School of Engineering and Applied Sciences, Harvard University, 33 Oxford St, Cambridge, MA 02138, USA; emails: [email protected], [email protected]; O. Sheffet, Faculty of Engineering, Bar-Ilan University, Ramat-Gan 52900, Israel; email: [email protected]. © 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM. https://doi.org/10.1145/3381533

Funders and funder numbers:
- National Science Foundation: CCF-1301976
- Directorate for Computer and Information Science and Engineering: 1301976, 1237235
- Harvard University: CNS-1237235

Keywords

- Bayes-Nash equilibrium
- Differential privacy
- Privacy modeling
