The EU’s General Data Protection Regulation (GDPR) has recently come into effect and insofar as Internet of Things (IoT) applications touch EU citizens or their data, developers are obliged to exercise due diligence and ensure they undertake Data Protection by Design and Default (DPbD). GDPR mandates the use of Data Protection Impact Assessments (DPIAs) as a key heuristic enabling DPbD. However, research has shown that developers generally lack the competence needed to deal effectively with legal aspects of privacy management and that the difficulties of complying with regulation are likely to grow considerably. Privacy engineering seeks to shift the focus from interpreting texts and guidelines or consulting legal experts to embedding data protection within the development process itself. There are, however, few examples in practice. We present a privacy-oriented, flow-based integrated development environment (IDE) for building domestic IoT applications. The IDE enables due diligence in (a) helping developers reason about personal data during the actual in vivo construction of IoT applications; (b) advising developers as to whether or not the design choices they are making occasion the need for a DPIA; and (c) attaching and making available to others (including data processors, data controllers, data protection officers, users and supervisory authorities) specific privacy-related information that has arisen during an application’s development.
In this paper we present findings of research into Trust, specifically within the context of Autonomous Systems. The research is based upon an exploratory workshop attended by domain experts from academia and industry. The aim of the work is to synthesise interdisciplinary and high-level understandings of pertinent issues into a singular and cohesive Master Narrative relating to Trust and Autonomous Systems. The inquiry constructs a Master Narrative that casts Trust as a notion that is necessarily constructed by complex relationships, disciplinary lenses, and multiple concurrent stakeholders. We term this ‘Trust as a Distributed Concern’. The paper describes the research and analysis which underpins the concept of Trust as a Distributed Concern and discusses how the concept may be operationalised in research and innovation contexts.
The focus of this paper is upon how people handle the sharing of personal data as an interactional concern. A number of ethnographic studies of domestic environments are drawn upon in order to articulate a range of circumstances under which data may be shared. In particular, a distinction is made between the in situ sharing of data with others around you and the sharing of data with remote parties online. A distinction is also drawn between circumstances of purposefully sharing data in some way and circumstances where the sharing of data is incidental or even unwitting. On the basis of these studies, a number of the organisational features of how people seek to manage the ways in which their data is shared are teased out. The paper then reflects upon how data sharing practices have evolved to handle the increasing presence of digital systems in people’s environments and how these relate to the ways in which people traditionally orient to the sharing of information. In conclusion, a number of ways are pointed out in which the sharing of data remains problematic and there is a discussion of how systems may need to adapt to better support people’s data sharing practices in the future.
This series of studies makes it clear that a wide range of both physical and digital resources are involved in domestic music consumption. The selection of digital resources is particularly evident, and it can be observed that domestic music consumption is a fragmented business, taking advantage of many different “channels” for getting, using and preparing music. While there is no common set of channels, each home displayed a variety of methods with respect to using metadata in multiple different modalities; regardless, the activities involved in getting, using and preparing music cohere through a noticeable, emergent set of workflows. We find that not only does metadata support searching, as one might expect, but that it also pervades all parts of the workflow and is used in real time as a reflexive artifact and in terms of its perceived/prescribed future use. The findings of the research raise a series of possibilities and issues that form the basis for understanding and designing for metadata use.
Repacking ‘Privacy’ for a Networked World
Crabtree, Andy; Tolmie, Peter; Knight, Will
Computer Supported Cooperative Work, 2017/12, Volume 26, Issue 4-6
Journal article · Peer reviewed · Open access
In this paper we examine the notion of privacy as promoted in the digital economy and how it has been taken up as a design challenge in the fields of CSCW, HCI and Ubiquitous Computing. Against these prevalent views we present an ethnomethodological study of digital privacy practices in 20 homes in the UK and France, concentrating in particular upon people’s use of passwords, their management of digital content, and the controls they exercise over the extent to which the online world at large can penetrate their everyday lives. In explicating digital privacy practices in the home we find an abiding methodological concern amongst members to manage the potential ‘attack surface’ of the digital on everyday life occasioned by interaction in and with the networked world. We also find, as a feature of this methodological preoccupation, that privacy dissolves into a heterogeneous array of relationship management practices. Accordingly we propose that ‘privacy’ has little utility as a focus for design, and suggest instead that a more productive way forward would be to concentrate on supporting people’s evident interest in managing their relationships in and with the networked world.
The risks AI presents to society are broadly understood to be manageable through ‘general calculus’, i.e., general frameworks designed to enable those involved in the development of AI to apprehend and manage risk, such as AI impact assessments, ethical frameworks, emerging international standards, and regulations. This paper elaborates how risk is apprehended and managed by a regulator, developer and cyber-security expert. It reveals that risk and risk management are dependent on mundane situated practices not encapsulated in general calculus. Situated practice surfaces ‘iterable epistopics’, revealing how those involved in the development of AI know and subsequently respond to risk and uncover major challenges in their work. The ongoing discovery and elaboration of epistopics of risk in AI (a) furnishes a potential program of interdisciplinary inquiry, (b) provides AI developers with a means of apprehending risk, and (c) informs the ongoing evolution of general calculus.
The home is a site marked by the increasing collection and use of personal data, whether online or from connected devices. This trend is accompanied by new data protection regulation and the development of privacy enhancing technologies (PETs) that seek to enable individual control over the processing of personal data. However, a great deal of the data generated within the connected home is interpersonal in nature and cannot therefore be attributed to an individual. The cardboard box study adapts the technology probe approach to explore with potential end users the salience of a PET called the Databox and to understand the challenge of collaborative rather than individual data management in the home. The cardboard box study was designed as an ideation card game and conducted with 22 households distributed around the UK, providing us with 38 participants. Demographically, our participants were of varying ages and had a variety of occupational backgrounds and differing household situations. The study makes it perspicuous that privacy is not a ubiquitous concern within the home, as a great deal of data is shared by default of people living together; that when privacy is occasioned it performs a distinct social function that is concerned with human security and the safety and integrity of people rather than devices and data; and that current ‘interdependent privacy’ solutions that seek to support collaborative data management are not well aligned with the ways access control is negotiated and managed within the home.
The domestic environment is a key area for the design and deployment of autonomous systems. Yet research indicates their adoption is already being hampered by a variety of critical issues including trust, privacy and security. This paper explores how potential users relate to the concept of autonomous systems in the home and elaborates further points of friction. It makes two contributions. The first is of a methodological nature and focuses on the use of provocative utopian and dystopian scenarios of future autonomous systems in the home. These are used to drive an innovative workshop-based approach to breaching experiments, which surfaces the usually tacit and unspoken background expectancies implicated in the organisation of everyday life that have a powerful impact on the acceptability of future and emerging technologies. The second contribution is substantive, produced through participants’ efforts to repair the incongruity or “reality disjuncture” created by utopian and dystopian visions, and highlights the need to build social as well as computational accountability into autonomous systems, and to enable coordination and control.
This article explores the importance of accountability to data protection (DP), and how it can be built into the Internet of Things (IoT). The need to build accountability into the IoT is motivated by the opaque nature of distributed data flows, inadequate consent mechanisms and lack of interfaces enabling end-user control over the behaviours of Internet-enabled devices. The lack of accountability precludes meaningful engagement by end users with their personal data and poses a key challenge to creating user trust in the IoT and the reciprocal development of the digital economy. The European Union General Data Protection Regulation 2016 (EU GDPR) seeks to remedy this particular problem by mandating that a rapidly developing technological ecosystem be made accountable. In doing so, it foregrounds new responsibilities for data controllers, including DP by design and default, and new data subject rights such as the right to data portability. While GDPR is ‘technologically neutral’, it is nevertheless anticipated that realizing the vision will turn upon effective technological development. Accordingly, this article examines the notion of accountability, how it has been translated into systems design recommendations for the IoT and how the IoT Databox puts key DP principles into practice.