
Released

Conference Paper

"I need a better description": An Investigation Into User Expectations For Differential Privacy

MPS-Authors

Redmiles,  Elissa M.
Group K. Gummadi, Max Planck Institute for Software Systems, Max Planck Society;

Fulltext (public)

arXiv:2110.06452.pdf
(Preprint), 592KB

Citation

Cummings, R., Kaptchuk, G., & Redmiles, E. M. (2021). "I need a better description": An Investigation Into User Expectations For Differential Privacy. In Y. Kim, J. Kim, G. Vigna, E. Shi, H. Kim, & J. B. Hong (Eds.), CCS '21 (pp. 3037-3052). New York, NY: ACM. doi:10.1145/3460120.3485252.


Cite as: https://hdl.handle.net/21.11116/0000-0009-6EF2-B
Abstract
Despite recent widespread deployment of differential privacy, relatively
little is known about what users think of differential privacy. In this work,
we seek to explore users' privacy expectations related to differential privacy.
Specifically, we investigate (1) whether users care about the protections
afforded by differential privacy, and (2) whether they are therefore more
willing to share their data with differentially private systems. Further, we
attempt to understand (3) users' privacy expectations of the differentially
private systems they may encounter in practice and (4) their willingness to
share data in such systems. To answer these questions, we use a series of
rigorously conducted surveys (n=2424).
We find that users care about the kinds of information leaks against which
differential privacy protects and are more willing to share their private
information when these leaks are less likely to occur.
Additionally, we find that the ways in which differential privacy is described
in-the-wild haphazardly set users' privacy expectations, which can be
misleading depending on the deployment. We synthesize our results into a
framework for understanding a user's willingness to share information with
differentially private systems, which takes into account the interaction
between the user's prior privacy concerns and how differential privacy is
described.
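The paper studies users' expectations rather than mechanisms, but for context on the protections it refers to: an epsilon-differentially-private release is commonly obtained by adding Laplace noise calibrated to a query's sensitivity. The sketch below is purely illustrative and not taken from the paper; the function names and parameters are our own, and it shows only the textbook Laplace mechanism for a counting query (sensitivity 1, noise scale 1/epsilon).

```python
import random

def laplace_noise(scale):
    # The difference of two i.i.d. Exponential(1/scale) draws
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(values, predicate, epsilon):
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so the Laplace noise
    scale is 1/epsilon.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means more noise and a stronger guarantee; averaged over many runs, the noisy count is unbiased around the true count.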