Privacy, security and data protection in smart cities: a critical EU law perspective


CREATe Working Paper by Lilian Edwards: “Smart cities” are a buzzword of the moment. Although legal interest is growing, most academic responses, at least in the EU, still come from the technological, urban studies, environmental and sociological sectors rather than the legal one, and have primarily laid emphasis on the social, urban, policing and environmental benefits of smart cities rather than their challenges, often in a rather uncritical fashion. However, a growing backlash from the privacy and surveillance sectors warns of the potential threat to personal privacy posed by smart cities. A key issue is the lack of opportunity in an ambient or smart city environment for the giving of meaningful consent to processing of personal data; other crucial issues include the degree to which smart cities collect private data from inevitable public interactions, the “privatisation” of ownership of both infrastructure and data, the repurposing of “big data” drawn from the IoT in smart cities, and the storage of that data in the Cloud.

This paper, drawing on author engagement with smart city development in Glasgow as well as the results of an international conference in the area curated by the author, argues that smart cities combine the three greatest current threats to personal privacy, with which regulation has so far failed to deal effectively: the Internet of Things (IoT) or “ubiquitous computing”; “Big Data”; and the Cloud. While these three phenomena have been examined extensively in much privacy literature (particularly the last two), both in the US and EU, the combination is under-explored. Furthermore, US legal literature and solutions (if any) are not simply transferable to the EU because of the US’s lack of an omnibus data protection (DP) law. I will discuss how and if EU DP law controls possible threats to personal privacy from smart cities and suggest further research on two possible solutions: first, a mandatory holistic privacy impact assessment (PIA) exercise for smart cities; second, code solutions for flagging the need for, and consequences of, giving consent to collection of data in ambient environments…(More)

Toward WSIS 3.0: Adopting Next-Gen Governance Solutions for Tomorrow’s Information Society


Lea Kaspar & Stefaan G. Verhulst at CircleID: “… Collectively, this process has been known as the “World Summit on the Information Society” (WSIS). During December 2015 in New York, twelve years after that first meeting in Geneva and with more than 3 billion people now online, member states of the United Nations unanimously adopted the final outcome document of the WSIS ten-year Review process.

The document (known as the WSIS+10 document) reflects on the progress made over the past decade and outlines a set of recommendations for shaping the information society in coming years. Among other things, it acknowledges the role of different stakeholders in achieving the WSIS vision, reaffirms the centrality of human rights, and calls for a number of measures to ensure effective follow-up.

For many, these represent significant achievements, leading observers to proclaim the outcome a diplomatic victory. However, as is the case with most non-binding international agreements, the WSIS+10 document will remain little more than a hollow guidepost until it is translated into practice. Ultimately, it is up to the national policy-makers, relevant international agencies, and the WSIS community as a whole to deliver meaningful progress towards achieving the WSIS vision.

Unfortunately, the WSIS+10 document provides little actual guidance for practitioners. What is even more striking, it reveals little progress in its understanding of emerging governance trends and methods since Geneva and Tunis, or how these could be leveraged in our efforts to harness the benefits of information and communication technologies (ICT).

As such, the WSIS remains a 20th-century approach to 21st-century challenges. In particular, the document fails to seek ways to make WSIS post-2015:

  • evidence-based in how to make decisions;
  • collaborative in how to measure progress; and
  • innovative in how to solve challenges.

Three approaches toward WSIS 3.0

Drawing on lessons in the field of governance innovation, we suggest in what follows three approaches, accompanied by practical recommendations, that could allow the WSIS to address the challenges raised by the information society in a more evidence-based, innovative and participatory way:

1. Adopt an evidence-based approach to WSIS policy making and implementation.

Since 2003, we have had massive experimentation in both developed and developing countries in a number of efforts to increase access to the Internet. We have seen some failures and some successes; above all, we have gained insight into what works, what doesn’t, and why. Unfortunately, much of the evidence remains scattered and ad-hoc, poorly translated into actionable guidance that would be effective across regions; nor is there any reflection on what we don’t know, and how we can galvanize the research and funding community to address information gaps. A few practical steps we could take to address this:….

2. Measure progress towards WSIS goals in a more open, collaborative way, founded on metrics and data developed through a bottom-up approach

The current WSIS+10 document has many lofty goals, many of which will remain effectively meaningless unless we are able to measure progress in concrete and specific terms. This requires the development of clear metrics, a process which is inevitably subjective and value-laden. Metrics and indicators must therefore be chosen with great care, particularly as they become points of reference for important decisions and policies. Having legitimate, widely-accepted indicators is critical. The best way to do this is to develop a participatory process that engages those actors who will be affected by WSIS-related actions and decisions. …These could include:…

3. Experiment with governance innovations to achieve WSIS objectives.

Over the last few years, we have seen a variety of innovations in governance that have provided new and often improved ways to solve problems and make decisions. They include, for instance:

  • The use of open and big data to generate new insights in both the problem and the solution space. We live in the age of abundant data — why aren’t we using it to inform our decision making? Data on the current landscape and the potential implications of policies could make our predictions and correlations more accurate.
  • The adoption of design thinking, agile development and user-focused research in developing more targeted and effective interventions. A linear approach to policy making with a fixed set of objectives and milestones allows little room for dealing with unforeseen or changing circumstances, making it difficult to adapt and change course. Applying lessons from software engineering — including the importance of feedback loops, continuous learning, and an agile approach to project design — would allow policies to become more flexible and solutions more robust.
  • The application of behavioral sciences — for example, the concept of ‘nudging’ individuals to act in their own best interest or adopt behaviors that benefit society. How choices (e.g. to use new technologies) are presented and designed can be more powerful in informing adoption than laws, rules or technical standards.
  • The use of prizes and challenges to tap into the wisdom of the crowd to solve complex problems and identify new ideas. Resource constraints can be addressed by creating avenues for people and volunteers to act as a resource in creating solutions, rather than being only their passive beneficiaries…(More)

Daedalus Issue on “The Internet”


Press release: “Thirty years ago, the Internet was a network that primarily delivered email among academic and government employees. Today, it is rapidly evolving into a control system for our physical environment through the Internet of Things, as mobile and wearable technology more tightly integrate the Internet into our everyday lives.

How will the future Internet be shaped by the design choices that we are making today? Could the Internet evolve into a fundamentally different platform than the one to which we have grown accustomed? As an alternative to big data, what would it mean to make ubiquitously collected data safely available to individuals as small data? How could we attain both security and privacy in the face of trends that seem to offer neither? And what role do public institutions, such as libraries, have in an environment that becomes more privatized by the day?

These are some of the questions addressed in the Winter 2016 issue of Daedalus on “The Internet.”  As guest editors David D. Clark (Senior Research Scientist at the MIT Computer Science and Artificial Intelligence Laboratory) and Yochai Benkler (Berkman Professor of Entrepreneurial Legal Studies at Harvard Law School and Faculty Co-Director of the Berkman Center for Internet and Society at Harvard University) have observed, the Internet “has become increasingly privately owned, commercial, productive, creative, and dangerous.”

Some of the themes explored in the issue include:

  • The conflicts that emerge among governments, corporate stakeholders, and Internet users through choices that are made in the design of the Internet
  • The challenges—including those of privacy and security—that materialize in the evolution from fixed terminals to ubiquitous computing
  • The role of public institutions in shaping the Internet’s privately owned open spaces
  • The ownership and security of data used for automatic control of connected devices, and
  • Consumer demand for “free” services—developed and supported through the sale of user data to advertisers….

Essays in the Winter 2016 issue of Daedalus include:

  • The Contingent Internet by David D. Clark (MIT)
  • Degrees of Freedom, Dimensions of Power by Yochai Benkler (Harvard Law School)
  • Edge Networks and Devices for the Internet of Things by Peter T. Kirstein (University College London)
  • Reassembling Our Digital Selves by Deborah Estrin (Cornell Tech and Weill Cornell Medical College) and Ari Juels (Cornell Tech)
  • Choices: Privacy and Surveillance in a Once and Future Internet by Susan Landau (Worcester Polytechnic Institute)
  • As Pirates Become CEOs: The Closing of the Open Internet by Zeynep Tufekci (University of North Carolina at Chapel Hill)
  • Design Choices for Libraries in the Digital-Plus Era by John Palfrey (Phillips Academy)…(More)

See also: Introduction

Digital Weberianism: Towards a reconceptualization of bureaucratic social order in the digital age


Working Paper by Chris Muellerleile & Susan Robertson: “The social infrastructures that the global economy relies upon are becoming dependent on digital code, big data, and algorithms. At the same time the digital is also changing the very nature of economic and social institutions. In this paper we attempt to make sense of the relationships between the emergence of digitalism, and transformations in both capitalism, and the ways that capitalism is regulated by digitized social relations. We speculate that the logic, rationalities, and techniques of Max Weber’s bureau, a foundational concept in his theory of modernity, help explain the purported efficiency, objectivity, and rationality of digital technologies. We argue that digital rationality constitutes a common thread of social infrastructure that is increasingly overdetermining the nature of sociality. We employ the example of the smart city and the digitizing university to expose some of the contradictions of digital order, and we end by questioning what digital order might mean after the end of modernity…(More)”

Big Data Analysis: New Algorithms for a New Society


Book edited by Nathalie Japkowicz and Jerzy Stefanowski: “This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area.

It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued concerning the potential dangers of Big Data Analysis along with its pitfalls and challenges…(More)”

Privacy by design in big data


An overview of privacy enhancing technologies in the era of big data analytics by the European Union Agency for Network and Information Security (ENISA): “The extensive collection and further processing of personal information in the context of big data analytics has given rise to serious privacy concerns, especially relating to wide scale electronic surveillance, profiling, and disclosure of private data. In order to allow for all the benefits of analytics without invading individuals’ private sphere, it is of utmost importance to draw the limits of big data processing and integrate the appropriate data protection safeguards in the core of the analytics value chain. ENISA, with the current report, aims at supporting this approach, taking the position that, with respect to the underlying legal obligations, the challenges of technology (for big data) should be addressed by the opportunities of technology (for privacy). To this end, in the present study we first explain the need to shift the discussion from “big data versus privacy” to “big data with privacy”, adopting the privacy and data protection principles as an essential value of big data, not only for the benefit of the individuals, but also for the very prosperity of big data analytics. In this respect, the concept of privacy by design is key in identifying the privacy requirements early in the big data analytics value chain and in subsequently implementing the necessary technical and organizational measures. Therefore, after an analysis of the proposed privacy by design strategies in the different phases of the big data value chain, we provide an overview of specific identified privacy enhancing technologies that we find of special interest for the current and future big data landscape.
In particular, we discuss anonymization, the “traditional” analytics technique, the emerging area of encrypted search and privacy preserving computations, granular access control mechanisms, policy enforcement and accountability, as well as data provenance issues. Moreover, new transparency and access tools in big data are explored, together with techniques for user empowerment and control. Following the aforementioned work, one immediate conclusion that can be derived is that achieving “big data with privacy” is not an easy task and a lot of research and implementation is still needed. Yet, we find that this task can be possible, as long as all the involved stakeholders take the necessary steps to integrate privacy and data protection safeguards in the heart of big data, by design and by default. To this end, ENISA makes the following recommendations:

  • Privacy by design applied …
  • Decentralised versus centralised data analytics …
  • Support and automation of policy enforcement
  • Transparency and control….
  • User awareness and promotion of PETs …
  • A coherent approach towards privacy and big data ….(More)”
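The anonymization techniques the ENISA report surveys often rest on ideas like k-anonymity: generalize quasi-identifiers (age, postcode, etc.) until every released record is indistinguishable from at least k−1 others. As a rough illustration only (the dataset, field names, bin size, and k below are illustrative assumptions, not drawn from the report), a minimal sketch might look like this:

```python
from collections import Counter

def generalize_age(age: int, bin_size: int = 10) -> str:
    """Coarsen an exact age into a range, e.g. 34 -> '30-39'."""
    low = (age // bin_size) * bin_size
    return f"{low}-{low + bin_size - 1}"

def is_k_anonymous(records, quasi_identifiers, k=3):
    """True if every combination of quasi-identifier values
    is shared by at least k records in the release."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical records: generalize the age field before release.
raw = [
    {"age": 34, "postcode": "G12"},
    {"age": 36, "postcode": "G12"},
    {"age": 31, "postcode": "G12"},
    {"age": 52, "postcode": "G41"},
]
released = [{"age": generalize_age(r["age"]), "postcode": r["postcode"]}
            for r in raw]

print(is_k_anonymous(released, ["age", "postcode"], k=3))  # → False (the lone G41 record is unique)
```

The failing check shows why "anonymization" is harder than dropping names: the outlier record would need further generalization or suppression before release, which is exactly the utility-versus-privacy trade-off the report discusses.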

Big Data for Development: A Review of Promises and Challenges


Martin Hilbert in the Development Policy Review: “The article uses a conceptual framework to review empirical evidence and some 180 articles related to the opportunities and threats of Big Data Analytics for international development. The advent of Big Data delivers a cost-effective prospect for improved decision-making in critical development areas such as healthcare, economic productivity and security. At the same time, the well-known caveats of the Big Data debate, such as privacy concerns and human resource scarcity, are aggravated in developing countries by long-standing structural shortages in the areas of infrastructure, economic resources and institutions. The result is a new kind of digital divide: a divide in the use of data-based knowledge to inform intelligent decision-making. The article systematically reviews several available policy options in terms of fostering opportunities and minimising risks…..(More)”

Big Data Before the Web


Evan Hepler-Smith in the Wall Street Journal: “Sometime in the early 1950s, on a reservation in Wisconsin, a Menominee Indian man looked at an ink blot. An anthropologist recorded the man’s reaction according to a standard Rorschach-test protocol. The researcher submitted a copy of these notes to an enormous cache of records collected over the course of decades by American social scientists working among various “societies ‘other than our own.’” This entire collection of social-scientific data was photographed and printed in arrays of microscopic images on 3-by-5-inch cards. Sets of these cards were shipped to research libraries around the world. They gathered dust.

In the results of this Rorschach test, the anthropologist saw evidence of a culture eroded by modernity. Sixty years later, these documents also testify to the aspirations and fate of the social-scientific project for which they were generated. Deep within this forgotten Ozymandian card file sits the Menominee man’s reaction to Rorschach card VI: “It is like a dead planet. It seems to tell the story of a people once great who have lost . . . like something happened. All that’s left is the symbol.”

In “Database of Dreams: The Lost Quest to Catalog Humanity,” Rebecca Lemov delves into the ambitious efforts of mid-20th-century social scientists to build a “capacious and reliable science of the varieties of the human being” by generating an archive of human experience through interviews and tests and by storing the information on the high-tech media of the day.

For these psychologists and anthropologists, the key to a universal human science lay in studying members of cultures in transition between traditional and modern ways of life and in rendering their individuality as data. Interweaving stories of social scientists, Native American research subjects and information technologies, Ms. Lemov presents a compelling account of “what ‘humanness’ came to mean in an age of rapid change in technological and social conditions.” Ms. Lemov, an associate professor of the history of science at Harvard University, follows two contrasting threads through a story that she calls “a parable for our time.” She shows, first, how collecting data about human experience shapes human experience and, second, how a high-tech data repository of the 1950s became, as she puts it, a “data ruin.”…(More) – See also: Database of Dreams: The Lost Quest to Catalog Humanity

OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data


Paper by Taha A Kass-Hout et al in JAMIA: “The objective of openFDA is to facilitate access and use of large, important Food and Drug Administration (FDA) public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs).

Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges.

Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event.

Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products…(More)”
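The access pattern the paper describes can be sketched with a small query builder. The endpoint and parameter names below (`drug/event.json`, `search`, `count`, `limit`) follow openFDA's public API documentation, but the specific drug and aggregation field are illustrative choices, not examples taken from the paper:

```python
from urllib.parse import urlencode

BASE = "https://api.fda.gov"  # openFDA public API root

def build_event_query(drug: str, count_field: str, limit: int = 10) -> str:
    """Build an openFDA drug adverse-event query URL.

    `search` filters reports mentioning the drug; `count` asks the API
    to aggregate over a field instead of returning individual reports.
    """
    params = {
        "search": f'patient.drug.medicinalproduct:"{drug}"',
        "count": count_field,
        "limit": limit,
    }
    return f"{BASE}/drug/event.json?{urlencode(params)}"

# Hypothetical query: most frequently reported reactions alongside aspirin.
# Fetching this URL with any HTTP client returns JSON whose "results"
# list holds term/count pairs.
url = build_event_query("aspirin", "patient.reaction.reactionmeddrapt.exact")
print(url)
```

The same pattern applies to the other three APIs the paper mentions (device adverse events, recalls, drug labeling) by swapping the endpoint path; no API key is required for light use, which is part of what the "easier and faster access" claim refers to.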

Big Data in the Policy Cycle: Policy Decision Making in the Digital Era


Paper by Johann Höchtl et al in the Journal of Organizational Computing and Electronic Commerce: “Although of high relevance to political science, the interaction between technological change and political change in the era of Big Data remains somewhat of a neglected topic. Most studies focus on the concept of e-government and e-governance, and on how already existing government activities performed through the bureaucratic body of public administration could be improved by technology. This paper attempts to build a bridge between the field of e-governance and theories of public administration that goes beyond the service delivery approach that dominates a large part of e-government research. Using the policy cycle as a generic model for policy processes and policy development, a new look on how policy decision making could be conducted on the basis of ICT and Big Data is presented in this paper….(More)”