
Released

Paper

What Are the Chances? Explaining the Epsilon Parameter in Differential Privacy

MPS-Authors

Redmiles, Elissa M.
Group K. Gummadi, Max Planck Institute for Software Systems, Max Planck Society

Fulltext (public)

arXiv:2303.00738.pdf (Preprint), 855 KB

Citation

Nanayakkara, P., Smart, M. A., Cummings, R., Kaptchuk, G., & Redmiles, E. M. (2023). What Are the Chances? Explaining the Epsilon Parameter in Differential Privacy. Retrieved from https://arxiv.org/abs/2303.00738.


Cite as: https://hdl.handle.net/21.11116/0000-000D-0DA7-9
Abstract
Differential privacy (DP) is a mathematical privacy notion increasingly deployed across government and industry. With DP, privacy protections are probabilistic: they are bounded by the privacy budget parameter, $\epsilon$. Prior work in health and computational science finds that people struggle to reason about probabilistic risks. Yet, communicating the implications of $\epsilon$ to people contributing their data is vital to avoiding privacy theater -- presenting meaningless privacy protection as meaningful -- and empowering more informed data-sharing decisions. Drawing on best practices in risk communication and usability, we develop three methods to convey probabilistic DP guarantees to end users: two that communicate odds and one offering concrete examples of DP outputs.
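
To make the "odds" framing concrete: under $\epsilon$-DP, observing a released output can shift the odds that any one person contributed their data by at most a factor of $e^\epsilon$. The Python sketch below is illustrative only and not taken from the paper's survey materials; the function names are our own. It computes this odds bound and the resulting worst-case belief of an observer who starts from a 50/50 prior.

```python
import math

def odds_factor(epsilon: float) -> float:
    """Worst-case multiplicative change in the odds of any DP output
    when one person's data is added or removed: e^epsilon."""
    return math.exp(epsilon)

def worst_case_posterior(epsilon: float, prior: float = 0.5) -> float:
    """Worst-case belief that a person contributed their data after
    observing a single epsilon-DP output, via Bayes' rule applied to
    the e^epsilon odds bound. `prior` is the belief beforehand."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = math.exp(epsilon) * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

# How the same 50% prior fares under different privacy budgets:
for eps in (0.1, 0.5, 1.0, 2.0):
    print(f"epsilon = {eps}: odds shift <= x{odds_factor(eps):.2f}, "
          f"posterior <= {worst_case_posterior(eps):.1%}")
```

At $\epsilon = 1$, for instance, a 50% prior can rise to at most about 73%; at $\epsilon = 0.1$, to at most about 52%.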
We quantitatively evaluate these explanation methods in a vignette survey study ($n=963$) via three metrics: objective risk comprehension, subjective privacy understanding of DP guarantees, and self-efficacy. We find that odds-based explanation methods are more effective than (1) output-based methods and (2) state-of-the-art approaches that gloss over information about $\epsilon$. Further, when offered information about $\epsilon$, respondents are more willing to share their data than when presented with a state-of-the-art DP explanation; this willingness to share is sensitive to $\epsilon$ values: as privacy protections weaken, respondents are less likely to share data.
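
The output-based method of showing "concrete examples of DP outputs" can be pictured with the textbook Laplace mechanism for counting queries. The sketch below is a minimal illustration under that assumption, not a reproduction of the paper's explanation texts; `example_outputs` and its parameters are hypothetical.

```python
import numpy as np

def example_outputs(true_count: int, epsilon: float, n_examples: int = 5,
                    sensitivity: float = 1.0, seed: int = 0) -> list[float]:
    """Generate example noisy releases of `true_count` under the standard
    Laplace mechanism, which satisfies epsilon-DP for counting queries
    with the given sensitivity."""
    rng = np.random.default_rng(seed)
    scale = sensitivity / epsilon  # smaller epsilon -> larger noise scale
    return (true_count + rng.laplace(0.0, scale, n_examples)).round(1).tolist()

# Stronger privacy (smaller epsilon) yields noisier, more varied examples.
for eps in (0.1, 1.0, 5.0):
    print(f"epsilon = {eps}: example outputs {example_outputs(100, eps)}")
```

For a true count of 100, the examples scatter widely at $\epsilon = 0.1$ (noise scale 10) but cluster tightly around 100 at $\epsilon = 5$ (noise scale 0.2), which conveys visually how weaker privacy budgets leak more about the underlying value.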