Differential Privacy for Class-Based Data: A Practical Gaussian Mechanism
2023 (English). In: IEEE Transactions on Information Forensics and Security, ISSN 1556-6013, E-ISSN 1556-6021, Vol. 18, pp. 5096-5108. Article in journal (Refereed). Published.
Abstract [en]
In this paper, we present a notion of differential privacy (DP) for data that comes from different classes, where class membership is private information that must be protected. The proposed method is an output perturbation mechanism that adds noise to the released query response so that the analyst is unable to infer the underlying class label. The proposed DP method not only protects the privacy of class-based data but also meets accuracy requirements, and it is computationally efficient and practical. We illustrate the efficacy of the proposed method empirically, showing that it outperforms the baseline additive Gaussian noise mechanism. We also examine a real-world application, applying the proposed DP method to autoregressive moving average (ARMA) forecasting while protecting the privacy of the underlying data source. Case studies on real-world advanced metering infrastructure (AMI) measurements of household power consumption validate the strong performance of the proposed DP method while preserving the accuracy of the forecasted power consumption measurements.
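The record does not include the paper's class-based mechanism itself, but the baseline it is compared against, the classical additive Gaussian noise mechanism, can be sketched as follows. This is a minimal illustration using the standard (epsilon, delta)-DP noise calibration, not the authors' proposed method; the function names and parameters are assumptions for illustration.

```python
import math
import random

def gaussian_sigma(sensitivity: float, epsilon: float, delta: float) -> float:
    # Classical calibration for the (epsilon, delta)-DP Gaussian mechanism
    # (valid for epsilon < 1): sigma >= sqrt(2 * ln(1.25 / delta)) * Delta / epsilon,
    # where Delta is the L2 sensitivity of the query.
    return math.sqrt(2.0 * math.log(1.25 / delta)) * sensitivity / epsilon

def release(query_value: float, sensitivity: float,
            epsilon: float, delta: float) -> float:
    # Output perturbation: add zero-mean Gaussian noise to the true answer
    # before releasing it to the analyst.
    sigma = gaussian_sigma(sensitivity, epsilon, delta)
    return query_value + random.gauss(0.0, sigma)
```

Smaller epsilon (a stricter privacy budget) yields a larger sigma and hence noisier releases, which is the accuracy/privacy trade-off the abstract refers to.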
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023. Vol. 18, pp. 5096-5108
Keywords [en]
autoregressive moving average, class-based privacy, differential privacy, Gaussian mechanism, smart meter data
National Category
Computer Sciences; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-338557. DOI: 10.1109/TIFS.2023.3289128. Scopus ID: 2-s2.0-85163446476. OAI: oai:DiVA.org:kth-338557. DiVA id: diva2:1810373
Note
QC 20231107
2023-11-07. Bibliographically approved.