The Value of Data: Towards a Framework to Redistribute It


Paper by Maria Savona: “This note attempts a systematisation of different pieces of literature that underpin the recent policy and academic debate on the value of data. It mainly poses foundational questions around the definition, economic nature and measurement of data value, and discusses the opportunity to redistribute it. It then articulates a framework to compare ways of implementing redistribution, distinguishing between data as capital, data as labour, or data as intellectual property. Each of these raises challenges, revolving around the notions of data property and data rights, that are also briefly discussed. The note concludes by indicating areas for policy considerations and a research agenda to shape the future structure of data governance more at large….(More)”.

Algorithmic futures: The life and death of Google Flu Trends


Vincent Duclos in Medicine Anthropology Theory: “In the last few years, tracking systems that harvest web data to identify trends, calculate predictions, and warn about potential epidemic outbreaks have proliferated. These systems integrate crowdsourced data and digital traces, collecting information from a variety of online sources, and they promise to change the way governments, institutions, and individuals understand and respond to health concerns. This article examines some of the conceptual and practical challenges raised by the online algorithmic tracking of disease by focusing on the case of Google Flu Trends (GFT). Launched in 2008, GFT was Google’s flagship syndromic surveillance system, specializing in ‘real-time’ tracking of outbreaks of influenza. GFT mined massive amounts of data about online search behavior to extract patterns and anticipate the future of viral activity. But it did a poor job, and Google shut the system down in 2015. This paper focuses on GFT’s shortcomings, which were particularly severe during flu epidemics, when GFT struggled to make sense of the unexpected surges in the number of search queries. I suggest two reasons for GFT’s difficulties. First, it failed to keep track of the dynamics of contagion, at once biological and digital, as it affected what I call here the ‘googling crowds’. Search behavior during epidemics in part stems from a sort of viral anxiety not easily amenable to algorithmic anticipation, to the extent that the algorithm’s predictive capacity remains dependent on past data and patterns. Second, I suggest that GFT’s troubles were the result of how it collected data and performed what I call ‘epidemic reality’. GFT’s data became severed from the processes Google aimed to track, and the data took on a life of their own: a trackable life, in which there was little flu left. The story of GFT, I suggest, offers insight into contemporary tensions between the indomitable intensity of collective life and stubborn attempts at its algorithmic formalization….(More)”.

Innovation Partnerships: An effective but under-used tool for buying innovation


Claire Gamage at Challenging Procurement: “…in an era where demand for public sector services increases as budgets decrease, the public sector should start to consider alternative routes to procurement. …

What is the Innovation Partnership procedure?

In a nutshell, it is essentially a procurement process combined with an R&D contract. Authorities are then able to purchase the ‘end result’ of the R&D exercise, without having to undergo a new procurement procedure. Authorities may choose to appoint a number of partners to participate in the R&D phase, but may subsequently only purchase one/some of those solutions.

Why does this procedure result in more innovative solutions?

The procedure was designed to drive innovation. Indeed, it may only be used in circumstances where a solution is not already available on the open market. Therefore, participants in the Innovation Partnership will be asked to create something which does not already exist and should be tailored towards solving a particular problem or ‘challenge’ set by the authority.

This procedure may also be particularly attractive to SMEs/start-ups, who often find it easier to innovate than their larger competitors; the purchasing authority is therefore perhaps more likely to obtain an innovative product or service.

One of the key advantages of an Innovation Partnership is that the R&D phase is separate to the subsequent purchase of the solution. In other words, the authority is not (usually) under any obligation to purchase the ‘end result’ of the R&D exercise, but has the option to do so if it wishes. Therefore, it may be easier to discourage internal stakeholders from imposing selection criteria which inadvertently exclude SMEs/start-ups (e.g. minimum turnover requirements, parent company guarantees etc.), as the authority is not committed to actually purchasing at the end of the procurement process which will select the innovation partner(s)….(More)”.

The Urban Computing Foundation


About: “The Urban Computing Foundation is a neutral forum for accelerating open source and community development that improves mobility, safety, and road infrastructure and reduces traffic congestion and energy consumption in connected cities.

As cities and transportation networks evolve into ever-more complicated systems, urban computing is emerging as an important field to bridge the divide between engineering, visualization, and traditional transportation systems analysis. These advancements are dependent on compatibility among many technologies across different public and private organizations. The Foundation provides the forum to collaborate on a common set of open source tools for developers building autonomous vehicles and smart infrastructure.

The Urban Computing Foundation’s mission is to enable developers, data scientists, visualization specialists and engineers to improve urban environments, human life quality, and city operation systems, and to build connected urban infrastructure. We do this through an open governance model that encourages participation and technical contribution, and by providing a framework for long term stewardship by companies and individuals invested in open urban computing’s success….(More)”.

Restrictions on Privacy and Exploitation in the Digital Economy: A Competition Law Perspective


Paper by Nicholas Economides and Ioannis Lianos: “The recent controversy on the intersection of competition law with the protection of privacy, following the emergence of big data and social media, is a major challenge for competition authorities worldwide. Recent technological progress in data analytics may greatly facilitate the prediction of personality traits and attributes from even a few digital records of human behaviour.


There are different perspectives globally as to the level of personal data protection and the role competition law may play in this context, hence the discussion of integrating such concerns in competition law enforcement may be premature for some jurisdictions. However, a market failure approach may provide common intellectual foundations for the assessment of harms associated with the exploitation of personal data, even when the specific legal system does not formally recognize a fundamental right to privacy.


The paper presents a model of market failure based on a requirement provision in the acquisition of personal information from users of other products/services. We establish the economic harm from the market failure and the requirement using the traditional competition law toolbox and focusing more on situations in which the restriction on privacy may be analysed as a form of exploitation. Eliminating the requirement and the market failure by creating a functioning market for the sale of personal information is imperative. This emphasis on exploitation does not mean that restrictions on privacy may not result from exclusionary practices. However, we analyse this issue in a separate study.


Besides the traditional analysis of the requirement and market failure, we note that there are typically informational asymmetries between the data controller and the data subject. The latter may not be aware that his data was harvested in the first place, or that the data will be processed by the data controller for a different purpose or shared and sold to third parties. The exploitation of personal data may also result from economic coercion, based on resource dependence or lock-in of the user: in particular in the presence of dominance, the user has no choice, in order to enjoy a specific service provided by the data controller or its ecosystem, but to consent to the harvesting and use of his data. A behavioural approach would also emphasise the possible internalities (demand-side market failures) arising from bounded rationality, or the fact that people do not internalise all consequences of their actions and face limits in their cognitive capacities.
The paper also addresses the way competition law could engage with exploitative conduct leading to privacy harm, both for ex ante and ex post enforcement.


With regard to ex ante enforcement, the paper explores how privacy concerns may be integrated in merger control as part of the definition of product quality, the harm in question being merely exploitative (the possibility the data aggregation provides to the merged entity to exploit (personal) data in ways that harm directly consumers), rather than exclusionary (harming consumers by enabling the merged entity to marginalise a rival with better privacy policies), which is examined in a separate paper.


With regard to ex post enforcement, the paper explores different theories of harm that may give rise to competition law concerns and suggests specific tests for their assessment. In particular, we analyse old and new exploitative theories of harm relating to excessive data extraction, personalised pricing, unfair commercial practices and trading conditions, exploitative requirement contracts, and behavioural manipulation.
We are in favour of collective action to restore the conditions of a well-functioning data market and the paper makes several policy recommendations….(More)”.

Leveraging Private Data for Public Good: A Descriptive Analysis and Typology of Existing Practices


New report by Stefaan Verhulst, Andrew Young, Michelle Winowatan, and Andrew J. Zahuranec: “To address the challenges of our times, we need both new solutions and new ways to develop those solutions. The responsible use of data will be key toward that end. Since pioneering the concept of “data collaboratives” in 2015, The GovLab has studied and experimented with innovative ways to leverage private-sector data to tackle various societal challenges, such as urban mobility, public health, and climate change.

While we have seen an uptake in normative discussions on how data should be shared, little analysis exists of the actual practice. This paper seeks to address that gap by answering the following question: What are the variables and models that determine functional access to private sector data for public good? In Leveraging Private Data for Public Good: A Descriptive Analysis and Typology of Existing Practices, we describe the emerging universe of data collaboratives and develop a typology of six practice areas. Our goal is to provide insight into current applications to accelerate the creation of new data collaboratives. The report outlines dozens of examples, as well as a set of recommendations to enable more systematic, sustainable, and responsible data collaboration….(More)”

City Innovation


Report and interactive map by CityLab, Bloomberg Philanthropies and the OECD: “Innovation helps local governments create an ecosystem that promotes experimentation and creativity to improve the public welfare of residents in cities around the world.

City governments are ushering in a new era of local public sector innovation that promotes experimentation and flexibility, and also takes into account the social needs of citizens to manage evolving urban systems. The goal of this report is to understand how municipalities can enhance their ability to use innovation to deliver better results for their residents….

This site identifies and shares how cities around the world are investing in innovation, to ensure they’re constantly assessing and improving how they’re tackling problems and improving the lives of residents. This map is based on an initial survey of cities in OECD and non-OECD countries. The city information reflects data gathered from the city administration at the time of the survey….(More)”

Beyond the Valley


Book by Ramesh Srinivasan: “How to repair the disconnect between designers and users, producers and consumers, and tech elites and the rest of us: toward a more democratic internet.

In this provocative book, Ramesh Srinivasan describes the internet as both an enabler of frictionless efficiency and a dirty tangle of politics, economics, and other inefficient, inharmonious human activities. We may love the immediacy of Google search results, the convenience of buying from Amazon, and the elegance and power of our Apple devices, but it’s a one-way, top-down process. We’re not asked for our input, or our opinions—only for our data. The internet is brought to us by wealthy technologists in Silicon Valley and China. It’s time, Srinivasan argues, that we think in terms beyond the Valley.

Srinivasan focuses on the disconnection he sees between designers and users, producers and consumers, and tech elites and the rest of us. The recent Cambridge Analytica and Russian misinformation scandals exemplify the imbalance of a digital world that puts profits before inclusivity and democracy. In search of a more democratic internet, Srinivasan takes us to the mountains of Oaxaca, East and West Africa, China, Scandinavia, North America, and elsewhere, visiting the “design labs” of rural, low-income, and indigenous people around the world. He talks to a range of high-profile public figures—including Elizabeth Warren, David Axelrod, Eric Holder, Noam Chomsky, Lawrence Lessig, and the founders of Reddit, as well as community organizers, labor leaders, and human rights activists. To make a better internet, Srinivasan says, we need a new ethic of diversity, openness, and inclusivity, empowering those now excluded from decisions about how technologies are designed, who profits from them, and who are surveilled and exploited by them….(More)”

Could AI Drive Transformative Social Progress? What Would This Require?


Paper by Edward (Ted) A. Parson et al: “In contrast to popular dystopian speculation about the societal impacts of widespread AI deployment, we consider AI’s potential to drive a social transformation toward greater human liberty, agency, and equality. The impact of AI, like all technology, will depend on both properties of the technology and the economic, social, and political conditions of its deployment and use. We identify conditions of each type – technical characteristics and socio-political context – likely to be conducive to such large-scale beneficial impacts.

Promising technical characteristics include decision-making structures that are tentative and pluralistic, rather than optimizing a single-valued objective function under a single characterization of world conditions; and configuring the decision-making of AI-enabled products and services exclusively to advance the interests of their users, subject to relevant social values, not those of their developers or vendors. We explore various strategies and business models for developing and deploying AI-enabled products that incorporate these characteristics, including philanthropic seed capital, crowd-sourcing, and open-source development, and we sketch various possible ways to scale deployment thereafter….(More)”.

Type, tweet, tap, and pass: How smart city technology is creating a transactional citizen


Paper by Peter A. Johnson, Pamela J. Robinson, and Simone Philpot: “In the current push for smart city programs around the world, there is a significant focus on enabling transactions between citizen and government. Though traditionally there have always been transactional elements between government and citizen, for example payment of taxes in exchange for services, or voting in exchange for representation, the rise of modern smartphone and smart city technologies have further enabled micro-transactions between citizen, government, and information broker. We conceptualize how the modern smart city, as both envisaged and enacted, incorporates the citizen not necessarily as a whole actor, but as a series of micro-transactions encoded on the real-time landscape of the city. This transactional citizen becomes counted by smart city sensors and integrated into smart city decision-making through the use of certain preferred platforms.

To approach this shift from traditional forms of citizen/city interaction towards micro-transactions, we conceptualize four broad modes of transaction: type (intentional contribution), tweet (intermediated by third party), tap (convened or requested transaction), and pass (ambient transaction based on movement). These four modes are used to frame critical questions of how citizens interact with government in the emerging age of the smart city, and how these interactions impact the relationship between citizen and government, introducing new avenues for private sector influence….(More)”