Differential privacy is an essential and prevalent privacy model that has been widely explored in recent decades. This survey provides a comprehensive and structured overview of two research directions: differentially private data publishing and differentially private data analysis. We compare the diverse release mechanisms of differentially private data publishing given a variety of input data in terms of query type, the maximum number of queries, efficiency, and accuracy. We identify two basic frameworks for differentially private data analysis and list the typical algorithms used within each framework. The results are compared and discussed based on output accuracy and efficiency. Further, we propose several directions for future research and possible applications.
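The release mechanisms surveyed above are typically built from noise-adding primitives. As a minimal illustrative sketch (not any specific mechanism from the survey), the classical Laplace mechanism answers a numeric query by adding noise whose scale grows with the query's sensitivity and shrinks with the privacy budget epsilon; the function name below is an assumption for illustration:

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Differentially private numeric query: return the true answer
    plus Laplace noise with scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    # A Laplace(0, scale) draw is the scaled difference of two
    # independent Exp(1) draws.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_value + noise
```

For a counting query (sensitivity 1), a smaller epsilon yields more noise and hence stronger privacy at the cost of accuracy, which is exactly the efficiency/accuracy trade-off the survey compares across mechanisms.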
To protect individuals' location privacy, an important privacy protection technique is k-anonymity, which requires at least k users to participate in an anonymity set, so that any user in the set cannot be distinguished from the other k−1 users. However, a significant fraction of users may not be concerned about their location privacy and therefore may not be interested in participating in the anonymity set. Hence, a prerequisite for achieving k-anonymity location privacy is to stimulate users to participate. In this paper, we revisit the problem of stimulating privacy-indifferent users to participate in the anonymity set while providing k-anonymity location privacy for privacy-sensitive users. We first study the case where all privacy-sensitive users have the same privacy requirement. Then, we extend our study to a more general setting, where privacy-sensitive users have different requirements. For both cases, we design auction-based mechanisms and rigorously prove that the mechanisms are truthful. More importantly, our mechanisms achieve a higher satisfaction ratio than existing work, i.e., they greatly increase the number of privacy-sensitive users who successfully win the auction and receive privacy protection. We evaluate our mechanisms through extensive numerical experiments and simulations on a real-world data set. The evaluation results show that our mechanisms achieve much better performance in terms of satisfaction ratio than the state-of-the-art mechanisms, with good computational efficiency.
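The paper's own auction mechanisms are not reproduced here; as a hedged sketch of why such mechanisms can be truthful, the classical critical-value payment rule for a reverse auction pays every winner the bid of the first losing bidder, so no bidder can profit by misreporting their cost (all names and the payment rule below are illustrative, not the paper's construction):

```python
def select_and_pay(bids, m):
    """Vickrey-style reverse auction: select the m lowest-cost bidders
    to join the anonymity set and pay each the (m+1)-th lowest bid.

    bids: dict mapping user id -> claimed participation cost.
    Returns (winners, payment). Truthful because a winner's payment
    is a critical value independent of their own bid.
    """
    ranked = sorted(bids, key=bids.get)
    if len(ranked) <= m:
        # Not enough bidders to define a critical price; no one wins.
        return set(), 0.0
    winners = set(ranked[:m])
    payment = bids[ranked[m]]
    return winners, payment
```

With bids {a: 1, c: 2, b: 3, d: 5} and m = 2, users a and c win and each is paid 3, b's bid; lowering a's bid does not change the payment, which is the core of the truthfulness argument.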
In research on location privacy protection, existing methods are mostly based on traditional anonymization, obfuscation, and cryptographic techniques, and have had little success in the big data environment; sensor networks, for example, contain sensitive information that must be appropriately protected. Current trends, such as "Industrie 4.0" and the Internet of Things (IoT), generate, process, and exchange vast amounts of security-critical and privacy-sensitive data, which makes them attractive targets of attacks. However, previous methods have overlooked the privacy protection issue, leading to privacy violations. In this paper, we propose a location privacy protection method that satisfies the differential privacy constraint to protect location data privacy while maximizing the utility of data and algorithm in the Industrial IoT. In view of the high value and low density of location data, we combine utility with privacy and build a multilevel location information tree model. Furthermore, the exponential mechanism of differential privacy is used to select data according to tree node access frequency. Finally, the Laplace mechanism is used to add noise to the access frequency of the selected data. As shown by the theoretical analysis and the experimental results, the proposed strategy achieves significant improvements in terms of security, privacy, and applicability.
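The two differential-privacy primitives named above can be sketched as follows; this is a simplified illustration under assumed parameters (the tree construction, utility scores, and budget split are not taken from the paper, and all function names are hypothetical):

```python
import math
import random

def exponential_mechanism(freqs, epsilon, sensitivity=1.0):
    """Select one tree node, with probability proportional to
    exp(epsilon * frequency / (2 * sensitivity)), so frequently
    accessed nodes are more likely to be chosen."""
    nodes = list(freqs)
    weights = [math.exp(epsilon * freqs[n] / (2.0 * sensitivity)) for n in nodes]
    r = random.random() * sum(weights)
    for node, w in zip(nodes, weights):
        r -= w
        if r <= 0:
            return node
    return nodes[-1]

def laplace_noise(scale):
    # Laplace(0, scale) as the scaled difference of two Exp(1) draws.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_frequency(freqs, node, epsilon, sensitivity=1.0):
    """Publish the selected node's access frequency with Laplace noise."""
    return freqs[node] + laplace_noise(sensitivity / epsilon)
```

In a full pipeline the privacy budget would be divided between the selection step and the noise-addition step; how that split is done is a design choice not shown here.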
To achieve automatic recommendation of privacy settings for image sharing, a new tool called iPrivacy (image privacy) is developed to relieve users of the burden of setting privacy preferences when they share images of special moments. Specifically, this paper makes the following contributions: 1) massive social images and their privacy settings are leveraged to learn object-privacy relatedness effectively and to identify a set of privacy-sensitive object classes automatically; 2) a deep multi-task learning algorithm is developed to jointly learn more representative deep convolutional neural networks and a more discriminative tree classifier, so that large numbers of privacy-sensitive object classes can be detected quickly and accurately; 3) automatic recommendation of privacy settings for image sharing is achieved by detecting the underlying privacy-sensitive objects in the images being shared, recognizing their classes, and identifying their privacy settings according to the object-privacy relatedness; and 4) a simple solution for image privacy protection is provided by automatically blurring the privacy-sensitive objects. We have conducted extensive experimental studies on real-world images, and the results demonstrate both the efficiency and effectiveness of the proposed approach.
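The detection models themselves are beyond a short sketch, but the final recommendation step in contribution 3 can be illustrated: given the object classes detected in an image and a learned object-privacy table, recommend the strictest associated setting. The level labels, table contents, and function name below are all hypothetical placeholders, not iPrivacy's actual vocabulary:

```python
# Privacy levels ordered from least to most restrictive (hypothetical labels).
LEVELS = ["public", "friends", "private"]

def recommend_setting(detected_objects, object_privacy):
    """Recommend the strictest privacy setting among those associated
    with the privacy-sensitive objects detected in an image.

    object_privacy: learned object-class -> setting table (assumed given
    by the object-privacy relatedness learning step).
    Objects without an entry are treated as non-sensitive.
    """
    settings = [object_privacy[o] for o in detected_objects if o in object_privacy]
    if not settings:
        return LEVELS[0]  # nothing sensitive detected
    return max(settings, key=LEVELS.index)
```

Taking the maximum over the ordered levels ensures that a single highly sensitive object (e.g. one mapped to "private") dominates the recommendation for the whole image.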
Existing research on information privacy has mostly relied on the privacy calculus model, which views privacy-related decision-making as a rational process in which individuals weigh the anticipated risks of disclosing personal data against the potential benefits. In this research, we develop an extension of the privacy calculus model, arguing that the situation-specific assessment of risks and benefits is bounded by (1) pre-existing attitudes or dispositions, such as general privacy concerns or general institutional trust, and (2) limited cognitive resources and heuristic thinking. An experimental study, employing two samples from the USA and Switzerland, examined consumer responses to a new smartphone application that collects driving behavior data and provided converging support for these predictions. Specifically, the results revealed that the situation-specific assessment of risks and benefits fully mediates the effect of dispositional factors on information disclosure. In addition, the results showed that privacy assessment is influenced by momentary affective states: consumers underestimate the risks of information disclosure when confronted with a user interface that elicits positive affect.
Ever since empirical studies found only a weak, if any, relationship between privacy concerns and privacy behavior, scholars have struggled to explain the so-called privacy paradox. Today, a number of theoretical arguments illuminate users’ privacy rationales, including the privacy calculus, privacy literacy, and contextual differentiations. A recent approach focuses on user resignation, apathy, or fatigue. In this piece, we concentrate on privacy cynicism, an attitude of uncertainty, powerlessness, mistrust, and resignation toward data handling by online services that renders privacy protection subjectively futile. We discuss privacy cynicism in the context of data capitalism, as a coping mechanism that addresses the tension between digital inclusion and the desire for privacy. Moreover, we introduce a measure of privacy cynicism and investigate the phenomenon based on a large-scale survey in Germany. The analysis highlights the multidimensionality of the construct, differentiating its relationships with privacy concerns, threat experience, Internet skills, and protection behavior.