Data from the Internet of Things (IoT) enables the design of new business models and services that improve user experience and satisfaction. These data serve as important information sources for many domains, including disaster management, biosurveillance, smart cities, and smart health. However, this scenario involves the collection of personal data, raising new challenges related to data privacy protection. Therefore, we aim to provide state-of-the-art information regarding privacy issues in the context of IoT, with a particular focus on findings that utilize the Personal Data Store (PDS) as a viable solution for these concerns. To achieve this, we conduct a systematic mapping review to identify, evaluate, and interpret the relevant literature on privacy issues and PDS-based solutions in the IoT context. Our analysis is guided by three well-defined research questions, and we systematically selected 49 studies published up to 2023 from an initial pool of 176 papers. We analyze and discuss the most common privacy issues highlighted by the authors and position the role of PDS technologies as a solution to privacy issues in the IoT context. Our findings reveal that only a small number of works (approximately 20%) were dedicated to presenting solutions for privacy issues. Most works (almost 82%) were published between 2018 and 2023, demonstrating increased interest in the theme in recent years. Additionally, only two works used PDS-based solutions to deal with privacy issues in the IoT context.
People’s encounters and entanglements with the personal digital data that they generate are a new and compelling area of research interest in this age of the ascendancy of digital data. Masses of personal information are constantly generated via people’s use of digital technologies and used for a variety of purposes by a range of actors. People are faced with the conundrum of how to interpret, control and make sense of their lively data. In this article, I explore how personal digital data and their circulations can be made more perceptible, and therefore interpretable, to people through the use of three-dimensional materialisations. These materialisations invite users to ‘feel your data’. As I show, ‘feeling your data’ has two meanings: the sensations of touching these three-dimensional objects and the visceral responses that are generated from these and other sensory encounters with data.
As regulators have begun prohibiting online platforms from collecting personal data on a “take-it-or-leave-it” basis, platform firms must adopt more refined user consent rules such as the pay-or-consent approach. Ensuring sufficient user options could increase the welfare of privacy-sensitive users but reduce the efficiency of data-driven business models. To balance the benefits and costs of enhanced privacy protection, regulators should understand the diversity of users' attitudes toward behavioral data collection on free online platforms. Tradeoffs among privacy, convenience, and free services, based on users' heterogeneous preferences, are considered in order to investigate users' different privacy attitudes on free online platforms. Three distinct user groups were found: the first reluctantly accepts the “take-it-or-leave-it” condition because of the lack of alternatives, the second accepts it in exchange for free services, and the third accepts it because it does not matter to them. These three user segments constituted 32.9%, 47.0%, and 20.1% of all respondents, respectively. The pay-or-consent approach can be justified in terms of balancing the benefits and costs of privacy regulations if it properly reflects privacy-sensitive users' willingness to pay for privacy.
•Ensuring sufficient user options can increase or decrease user welfare.
•Enforcing identical user consent conditions for all users is inappropriate.
•Three user segments in terms of privacy attitudes in free online platforms are identified.
•Three user segments constituted 32.9%, 47.0%, and 20.1% of all the respondents.
•Ensuring proper tradeoffs based on users' different privacy attitudes is suggested.
Decisions based on algorithmic, machine learning models can be unfair, reproducing biases in the historical data used to train them. While computational techniques are emerging to address aspects of these concerns through communities such as discrimination-aware data mining (DADM) and fairness, accountability and transparency machine learning (FATML), their practical implementation faces real-world challenges. For legal, institutional or commercial reasons, organisations might not hold the data on sensitive attributes such as gender, ethnicity, sexuality or disability needed to diagnose and mitigate emergent indirect discrimination-by-proxy, such as redlining. Such organisations might also lack the knowledge and capacity to identify and manage fairness issues that are emergent properties of complex sociotechnical systems. This paper presents and discusses three potential approaches to deal with such knowledge and information deficits in the context of fairer machine learning. Trusted third parties could selectively store the data necessary for performing discrimination discovery and incorporating fairness constraints into model-building in a privacy-preserving manner. Collaborative online platforms would allow diverse organisations to record, share and access contextual and experiential knowledge to promote fairness in machine learning systems. Finally, unsupervised learning and pedagogically interpretable algorithms might allow fairness hypotheses to be built for further selective testing and exploration. Real-world fairness challenges in machine learning are not abstract, constrained optimisation problems, but are institutionally and contextually grounded. Computational fairness tools are useful, but must be researched and developed in and with the messy contexts that will shape their deployment, rather than just for imagined situations. Not doing so risks real, near-term algorithmic harm.
This article analyses, defines, and refines the concepts of ownership and personal data to explore their compatibility in the context of EU law. It critically examines the traditional dividing line between personal and non-personal data and argues for a strict conceptual separation of personal data from personal information. The article also considers whether, and to what extent, the concept of ownership can be applied to personal data in the context of the Internet of Things (IoT). This consideration is framed around two main approaches shaping all ownership theories: a bottom-up and a top-down approach. Through these dual lenses, the article reviews existing debates relating to four elements supporting the introduction of ownership of personal data, namely the elements of control, protection, valuation, and allocation of personal data. It then explores the explanatory advantages and disadvantages of the two approaches in relation to each of these elements, as well as to ownership of personal data in IoT at large. Lastly, this article outlines a revised approach to ownership of personal data in IoT that may serve as a blueprint for future work in this area and inform regulatory and policy debates.
In the era of data-driven communication, managing one’s online privacy is a necessary yet burdensome challenge. While individuals have concerns about firms’ data collection practices, they sometimes appear to disclose personal information for relatively small rewards. We demonstrate that privacy cynicism, an attitude toward privacy protection characterized by frustration, hopelessness, and disillusionment, explains this paradox by moderating the relationship between the appraisal of privacy threats and privacy coping behaviors on the one hand, and privacy protection behaviors on the other. Results of a U.S. national survey (N = 993) show that privacy cynicism is negatively related to privacy protection behaviors and significantly moderates the effects of perceived vulnerability, response efficacy, disclosure benefits, and response costs on protection behaviors. Hence, this work has important implications for communication theory, by extending existing models of privacy management behaviors, as well as for communication practice, by stressing the importance of creating awareness about privacy cynicism.
The Constitution of the Republic of Poland situates the obligations resting on public authorities within the limits in which they may process personal data, which undoubtedly strengthens the rights of the individual. This constitutional limitation is determined both by the principle of legalism and by the principle of the democratic state governed by the rule of law. Identifying the personal data controller, as well as the data protection officer, in connection with proceedings before the Tribunal becomes significant from the perspective of the person whose data are processed. Designating the Tribunal of State (Trybunał Stanu) as a personal data controller through the amendment to the Act on the Tribunal of State entails assigning to this body numerous obligations arising from Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Acting as a body authorized to process personal data, the Tribunal is also responsible for the lawfulness of such processing. The Tribunal of State is the guarantor of the rights and freedoms of the individuals whose data are processed in the course of its activities.