New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the adoption by the UN General Assembly’s Third Committee on 21 November of a new resolution on the right to privacy in the digital age is both timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

The ethical impact of data science


Theme issue of Phil. Trans. R. Soc. A compiled and edited by Mariarosaria Taddeo and Luciano Floridi: “This theme issue has the founding ambition of landscaping data ethics as a new branch of ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values). Data ethics builds on the foundation provided by computer and information ethics but, at the same time, it refines the approach endorsed so far in this research field, by shifting the level of abstraction of ethical enquiries, from being information-centric to being data-centric. This shift brings into focus the different moral dimensions of all kinds of data, even data that never translate directly into information but can be used to support actions or generate behaviours, for example. It highlights the need for ethical analyses to concentrate on the content and nature of computational operations—the interactions among hardware, software and data—rather than on the variety of digital technologies that enable them. And it emphasizes the complexity of the ethical challenges posed by data science. Because of such complexity, data ethics should be developed from the start as a macroethics, that is, as an overall framework that avoids narrow, ad hoc approaches and addresses the ethical impact and implications of data science and its applications within a consistent, holistic and inclusive framework. Only as a macroethics will data ethics provide solutions that can maximize the value of data science for our societies, for all of us and for our environments….(More)”

Table of Contents:

  • The dynamics of big data and human rights: the case of scientific research; Effy Vayena, John Tasioulas
  • Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue; Sebastian Porsdam Mann, Julian Savulescu, Barbara J. Sahakian
  • Faultless responsibility: on the nature and allocation of moral responsibility for distributed moral actions; Luciano Floridi
  • Compelling truth: legal protection of the infosphere against big data spills; Burkhard Schafer
  • Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems; Sabina Leonelli
  • Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy; Deirdre K. Mulligan, Colin Koopman, Nick Doty
  • Beyond privacy and exposure: ethical issues within citizen-facing analytics; Peter Grindrod
  • The ethics of smart cities and urban science; Rob Kitchin
  • The ethics of big data as a public good: which public? Whose good?; Linnet Taylor
  • Data philanthropy and the design of the infraethics for information societies; Mariarosaria Taddeo
  • The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics; Olivia Varley-Winter, Hetan Shah
  • Data science ethics in government; Cat Drew
  • The ethics of data and of data science: an economist’s perspective; Jonathan Cave
  • What’s the good of a science platform?; John Gallacher


From Tech-Driven to Human-Centred: Opengov has a Bright Future Ahead


Essay by Martin Tisné: “The anti-corruption and transparency field ten years ago was in pre-iPhone mode. Few if any of us spoke of the impact or relevance of technology to what would become known as the open government movement. When the wave of smartphones and other technologies hit from the late 2000s onwards, it hit hard, and scaled fast. The ability of technology to create ‘impact at scale’ became the obvious truism of our sector, so much so that pointing out the failures of techno-utopianism became a favorite pastime for pundits and academics. The technological developments of the next ten years will be more human-centered — less ‘build it and they will come’ — and more aware of the unintended consequences of technology (e.g. the fairness of Artificial Intelligence decision making) whilst still being deeply steeped in the technology itself.

By 2010, two major open data initiatives had launched and were already seen as successful in the US and UK, one of President Obama’s first memorandums was on openness and transparency, and an international research project had tracked 63 different instances of uses of technology for transparency around the world (from Reclamos in Chile, to I Paid a Bribe in India, via Maji Matone in Tanzania). Open data projects numbered over 200 worldwide within barely a year of data.gov.uk launching and, to everyone’s surprise, topped the list of Open Government Partnership commitments a few years later.

The technology genie won’t go back into the bottle: the field will continue to grow alongside technological developments. But it would take a bold or foolish pundit to guess whether blockchain or some other development will have radically changed the field by 2025.

What is clearer is that the sector is more questioning of technology, more human-centered both in the design of those technologies and in seeking to understand and pre-empt their impact….

We’ve moved from cyber-utopianism less than ten years ago to born-digital organisations taking a much more critical look at the deployment of technology. The evangelical phase of the open data movement is coming to an end. The movement no longer needs to preach the virtues of unfettered openness to get a foot in the door. It seeks to frame the debate as to whether, when and how data might legitimately be shared or closed, and what impacts those releases may have on privacy, surveillance and discrimination. An open government movement that is more human-centered and aware of the unintended consequences of technology has a bright and impactful future ahead….(More)”

Federal Privacy Council’s Law Library


Federal Privacy Council: “The Law Library is a compilation of information about and links to select Federal laws related to the creation, collection, use, processing, storage, maintenance, dissemination, disclosure, and disposal of personally identifiable information (PII) by departments and agencies within the Federal Government. The Law Library does not include all laws that are relevant to privacy or the management of PII in the Federal Government.

The Law Library only includes laws applicable to the Federal Government. Although some of the laws included may also be applicable to entities outside of the Federal Government, the information provided on the Law Library pages is strictly limited to the application of those laws to the Federal Government; the information provided does not in any way address the application of any law to the private sector or other non-Federal entities.

The Law Library pages have been prepared by members of the Federal Privacy Council and consist of information from and links to other Federal Government websites. The Federal Privacy Council is not responsible for the content of any third-party website, and links to other websites do not constitute or imply endorsement or recommendation of those sites or the information they provide.

The material in the Law Library is provided for informational purposes only. The information provided may not reflect current legal developments or agency-specific requirements, and it may not be correct or complete. The Federal Privacy Council does not have authority to provide legal advice, to set policies for the Federal Government, or to represent the views of the Federal Government or the views of any agency within the Federal Government; accordingly, the information on this website in no way constitutes policy or legal advice, nor does it in any way reflect Federal Government views or opinions.  Agencies shall consult law, regulation, and policy, including OMB guidance, to understand applicable requirements….(More)”

Data Ethics – The New Competitive Advantage


Book by Gry Hasselbalch and Pernille Tranberg: “…describes over 50 cases of mainly private companies working with data ethics to varying degrees

Respect for privacy and the right to control one’s own data are becoming key parameters to gain a competitive edge in today’s business world. Companies, organisations and authorities which view data ethics as a social responsibility, giving it the same importance as environmental awareness and respect for human rights, are tomorrow’s winners. Digital trust is paramount to digital growth and prosperity.
This book combines broad trend analyses with case studies to examine companies which use data ethics to varying degrees. The authors make the case that citizens and consumers are no longer just concerned about a lack of control over their data, but they also have begun to act. In addition, they describe alternative business models, advances in technology and a new European data protection regulation, all of which combine to foster a growing market for data-ethical products and services….(More)”.

Learning Privacy Expectations by Crowdsourcing Contextual Informational Norms


at Freedom to Tinker: “The advent of social apps, smart phones and ubiquitous computing has brought a great transformation to our day-to-day life. The incredible pace with which new and disruptive services continue to emerge challenges our perception of privacy. To keep pace with this rapidly evolving cyber reality, we need to devise agile methods and frameworks for developing privacy-preserving systems that align with evolving users’ privacy expectations.

Previous efforts have tackled this with the assumption that privacy norms are provided through existing sources such as law, privacy regulations and legal precedents. They have focused on formally expressing privacy norms and devising a corresponding logic to enable automatic inconsistency checks and efficient enforcement of the logic.

However, because many of the existing regulations and privacy handbooks were enacted well before the Internet revolution took place, they often lag behind and do not adequately reflect the application of logic in modern systems. For example, the Family Educational Rights and Privacy Act (FERPA) was enacted in 1974, long before Facebook, Google and many other online applications were used in an educational context. More recent legislation faces similar challenges as novel services introduce new ways to exchange information, and consequently shape new, unconsidered information flows that can change our collective perception of privacy.

Crowdsourcing Contextual Privacy Norms

Armed with the theory of Contextual Integrity (CI), in our work we are exploring ways to uncover societal norms by leveraging advances in crowdsourcing technology.

In our recent paper, we present the methodology that we believe can be used to extract a societal notion of privacy expectations. The results can be used to fine-tune the existing privacy guidelines as well as to gain a better perspective on users’ expectations of privacy.

CI defines privacy as a collection of norms (privacy rules) that reflect appropriate information flows between different actors. Norms capture who shares what, with whom, in what role, and under which conditions. For example, while you are comfortable sharing your medical information with your doctor, you might be less inclined to do so with your colleagues.

We use CI as a proxy to reason about privacy in the digital world and a gateway to understanding how people perceive privacy in a systematic way. Crowdsourcing is a great tool for this method. We are able to ask hundreds of people how they feel about a particular information flow, and then we can capture their input and map it directly onto the CI parameters. We used a simple template to write Yes-or-No questions to ask our crowdsourcing participants:

“Is it acceptable for the [sender] to share the [subject’s] [attribute] with [recipient] [transmission principle]?”

For example:

“Is it acceptable for the student’s professor to share the student’s record of attendance with the department chair if the student is performing poorly?”

In our experiments, we leveraged Amazon’s Mechanical Turk (AMT) to ask 450 turkers over 1400 such questions. Each question represents a specific contextual information flow that users can approve, disapprove or mark under the Doesn’t Make Sense category; the last category could be used when 1) the sender is unlikely to have the information, 2) the receiver would already have the information, or 3) the question is ambiguous….(More)”
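As a rough illustration of the method described above, the CI template lends itself to programmatic question generation. The sketch below is a hypothetical Python example, not the authors’ actual survey code: the parameter vocabularies and function names are illustrative assumptions. It enumerates Yes-or-No questions from small sets of CI parameter values and tallies crowd responses into the three answer categories.

```python
from itertools import product

# Hypothetical CI parameter values for an educational context.
# The paper's actual vocabularies and survey pipeline may differ.
senders = ["the student's professor"]
subjects = ["the student"]
attributes = ["record of attendance", "grades"]
recipients = ["the department chair", "the student's parents"]
transmission_principles = [
    "if the student is performing poorly",
    "if the student consents",
]

# The question template from the post, with CI slots as format fields.
TEMPLATE = ("Is it acceptable for {sender} to share {subject}'s "
            "{attribute} with {recipient} {principle}?")

def generate_questions():
    """Yield one Yes-or-No question per combination of CI parameter values."""
    for sender, subject, attribute, recipient, principle in product(
            senders, subjects, attributes, recipients, transmission_principles):
        yield TEMPLATE.format(sender=sender, subject=subject,
                              attribute=attribute, recipient=recipient,
                              principle=principle)

def tally(responses):
    """Count votes per question.

    `responses` is an iterable of (question, answer) pairs, where answer is
    one of "approve", "disapprove" or "dms" (Doesn't Make Sense).
    """
    counts = {}
    for question, answer in responses:
        counts.setdefault(question, {"approve": 0, "disapprove": 0, "dms": 0})
        counts[question][answer] += 1
    return counts

if __name__ == "__main__":
    for question in generate_questions():
        print(question)
```

Each generated question could then be posted to a crowdsourcing platform such as Mechanical Turk, and the tallied responses mapped back onto the corresponding CI parameters.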

Big Data Is Not a Monolith


Book edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli: “Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies.

The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data’s ramifications. The contributors look at big data’s effect on individuals as it exerts social control through monitoring, mining, and manipulation; big data and society, examining both its empowering and its constraining effects; big data and science, considering issues of data governance, provenance, reuse, and trust; and big data and organizations, discussing data responsibility, “data harm,” and decision making….(More)”

Nudges for Privacy and Security: Understanding and Assisting Users’ Choices Online


Paper by Alessandro Acquisti et al: “Advancements in information technology often task users with complex and consequential privacy and security decisions. A growing body of research has investigated individuals’ choices in the presence of privacy and information security trade-offs, the decision-making hurdles affecting those choices, and ways to mitigate those hurdles. This article provides a multi-disciplinary assessment of the literature pertaining to privacy and security decision making. It focuses on research on assisting individuals’ privacy and security choices with soft paternalistic interventions that nudge users towards more beneficial choices. The article discusses potential benefits of those interventions, highlights their shortcomings, and identifies key ethical, design, and research challenges….(More)”

The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives


Book by Bart Custers: “Given the popularity of drones and the fact that they are easy and cheap to buy, it is generally expected that the ubiquity of drones will significantly increase within the next few years. This raises questions as to what is technologically feasible (now and in the future), what is acceptable from an ethical point of view and what is allowed from a legal point of view. Drone technology is to some extent already available and to some extent still in development. The aim and scope of this book are to map the opportunities and threats associated with the use of drones and to discuss the ethical and legal issues that their use raises.
This book provides an overview of current drone technologies and applications and of what to expect in the next few years. The question of how to regulate the use of drones in the future is addressed by considering the conditions and contents of future drone legislation and by analyzing issues surrounding privacy and the safeguards that can be taken. As such, this book is valuable to scholars in several disciplines, such as law, ethics, sociology, politics and public administration, as well as to practitioners and others who may be confronted with the use of drones in their work, such as professionals working in the military, law enforcement, disaster management and infrastructure management. Individuals and businesses with a specific interest in drone use may also find unexpected perspectives on this new field of research and innovation in the nineteen contributions contained in this volume….(More)”

Reframing Data Transparency


“Recently, the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP, a privacy and information policy think tank based in Brussels, London and Washington, D.C., and Telefónica, one of the largest telecommunications companies in the world, issued a joint white paper on Reframing Data Transparency (the “white paper”). The white paper was the outcome of a June 2016 roundtable held by the two organizations in London, in which senior business leaders, Data Privacy Officers, lawyers and academics discussed the importance of user-centric transparency to the data-driven economy….The issues explored during the roundtable and in the white paper include the following:

  • The transparency deficit in the digital age. There is a growing gap between traditional, legal privacy notices and user-centric transparency that is capable of delivering understandable and actionable information concerning an organization’s data use policies and practices, including why it processes data, what the benefits are to individuals and society, how it protects the data and how users can manage and control the use of their data.
  • The impact of the transparency deficit. The transparency deficit undermines customer trust and customers’ ability to participate more effectively in the digital economy.
  • Challenges of delivering user-centric transparency. In a connected world where there may be no direct relationship between companies and their end users, both transparency and consent as a basis for processing are particularly challenging.
  • Transparency as a multistakeholder challenge. Transparency is not solely a legal issue, but a multistakeholder challenge, which requires engagement of regulators, companies, individuals, behavioral economists, social scientists, psychologists and user experience specialists.
  • The role of data protection authorities (“DPAs”). DPAs play a key role in promoting and incentivizing effective data transparency approaches and tools.
  • The role of companies. Data transparency is a critical business issue because transparency drives digital trust as well as business opportunities. Organizations must innovate on how to deliver user-centric transparency. Data-driven companies must research and develop new approaches to transparency that explain the value exchange between customers and companies and the companies’ data practices, and create tools that enable their customers to exercise effective engagement and control.
  • The importance of empowering individuals. It is crucial to support and enhance individuals’ digital literacy, which includes an understanding of the uses of personal data and the benefits of data processing, as well as knowledge of relevant privacy rights and the data management tools that are available to them. Government bodies, regulators and industry should be involved in educating the public regarding digital literacy. Such education should take place in schools and universities, and through consumer education campaigns. Transparency is the foundation and sine qua non of individual empowerment.
  • The role of behavioral economists, social scientists, psychologists and user experience specialists. Experts from these disciplines will be crucial in developing user-centric transparency and controls….(More)”.