Conservative or liberal? Personalized differential privacy

Zach Jorgensen, Ting Yu, Graham Cormode

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

47 Citations (Scopus)

Abstract

Differential privacy is widely accepted as a powerful framework for providing strong, formal privacy guarantees for aggregate data analysis. A limitation of the model is that the same level of privacy protection is afforded for all individuals. However, it is common that the data subjects have quite different expectations regarding the acceptable level of privacy for their data. Consequently, differential privacy may lead to insufficient privacy protection for some users, while over-protecting others. We argue that by accepting that not all users require the same level of privacy, a higher level of utility can often be attained by not providing excess privacy to those who do not want it. We propose a new privacy definition called personalized differential privacy (PDP), a generalization of differential privacy in which users specify a personal privacy requirement for their data. We then introduce several novel mechanisms for achieving PDP. Our primary mechanism is a general one that automatically converts any existing differentially private algorithm into one that satisfies PDP. We also present a more direct approach for achieving PDP, inspired by the well-known exponential mechanism. We demonstrate our framework through extensive experiments on real and synthetic data.
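To make the abstract's "general conversion" idea concrete, below is a minimal, hypothetical sketch of how per-user privacy levels could drive a non-uniform sampling step before an ordinary ε-differentially-private count. The function name (pdp_count), the parameter names (personal_eps, threshold_eps), and the specific sampling formula are illustrative assumptions, not taken from the paper itself.

```python
import math
import random
import numpy as np

def pdp_count(records, personal_eps, threshold_eps):
    """Hypothetical sketch (not the paper's exact construction):
    realize a personalized-privacy count by (1) keeping each user's
    record with a probability that shrinks as their personal epsilon
    falls below a global threshold t, then (2) running a standard
    t-DP Laplace count on the sampled data.

    records       -- list of 0/1 contributions, one per user
    personal_eps  -- per-user privacy levels eps_u (same order as records)
    threshold_eps -- epsilon t used by the underlying DP mechanism
    """
    sampled = []
    for value, eps_u in zip(records, personal_eps):
        if eps_u >= threshold_eps:
            keep = 1.0  # user already tolerates at least t
        else:
            # Illustrative sampling probability; weaker personal epsilon
            # (stronger privacy demand) means a smaller chance of inclusion.
            keep = (math.exp(eps_u) - 1.0) / (math.exp(threshold_eps) - 1.0)
        if random.random() < keep:
            sampled.append(value)

    # Ordinary Laplace mechanism: a count query has sensitivity 1,
    # so noise scale 1/t gives t-differential privacy on the sample.
    true_count = sum(sampled)
    return true_count + np.random.laplace(0.0, 1.0 / threshold_eps)

# Example use: three users with different privacy requirements.
print(pdp_count([1, 0, 1], personal_eps=[0.1, 0.5, 1.0], threshold_eps=1.0))
```

The sketch only illustrates the shape of such a conversion (sample, then reuse an existing DP algorithm); the paper's actual mechanisms, guarantees, and the exponential-mechanism-inspired variant are defined in the full text.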

Original language: English
Title of host publication: Proceedings - International Conference on Data Engineering
Publisher: IEEE Computer Society
Pages: 1023-1034
Number of pages: 12
Volume: 2015-May
ISBN (Print): 9781479979639
DOIs: 10.1109/ICDE.2015.7113353
Publication status: Published - 26 May 2015
Event: 2015 31st IEEE International Conference on Data Engineering, ICDE 2015 - Seoul, Korea, Republic of
Duration: 13 Apr 2015 - 17 Apr 2015

Other

Other: 2015 31st IEEE International Conference on Data Engineering, ICDE 2015
Country: Korea, Republic of
City: Seoul
Period: 13/4/15 - 17/4/15

ASJC Scopus subject areas

  • Information Systems
  • Signal Processing
  • Software

Cite this

Jorgensen, Z., Yu, T., & Cormode, G. (2015). Conservative or liberal? Personalized differential privacy. In Proceedings - International Conference on Data Engineering (Vol. 2015-May, pp. 1023-1034). [7113353] IEEE Computer Society. https://doi.org/10.1109/ICDE.2015.7113353
