The economics of Business-to-Government data sharing


Paper by Bertin Martens and Néstor Duch-Brown: “Data and information are fundamental pieces for effective evidence-based policy making and provision of public services. In recent years, some private firms have been collecting large amounts of data, which, were they available to governments, could greatly improve their capacity to take better policy decisions and to increase social welfare. Business-to-Government (B2G) data sharing can result in substantial benefits for society. It can save costs to governments by allowing them to benefit from the use of data collected by businesses without having to collect the same data again. Moreover, it can support the production of new and innovative outputs based on the shared data by different users. Finally, the data available to government may give only an incomplete or even biased picture, while aggregating complementary datasets shared by different parties (including businesses) may result in improved policies with strong social welfare benefits.


The examples assembled by the High Level Expert Group on B2G data sharing show that most of the current B2G data transactions remain one-off experimental pilot projects that do not seem to be sustainable over time. Overall, the volume of B2G operations still seems to be relatively small and clearly sub-optimal from a social welfare perspective. The market does not seem to be scaling in line with the economic potential for welfare gains in society. There are likely to be significant potential economic benefits from additional B2G data sharing operations. These could be enabled by measures that improve their governance conditions and thereby increase the overall number of transactions. To design such measures, it is important to understand the nature of the current barriers to B2G data sharing operations. In this paper, we focus on the most important barriers from an economic perspective: (a) monopolistic data markets, (b) high transaction costs and perceived risks in data sharing and (c) a lack of incentives for private firms to contribute to the production of public benefits. The following reflections are mainly conceptual, since there is currently little quantitative empirical evidence on the different aspects of B2G transactions.

  • Monopolistic data markets. Some firms, such as big tech companies, may be in a privileged position as the exclusive providers of the type of data that a public body seeks to access. This position enables these firms to charge a high price for the data, beyond a reasonable rate of return on costs. While a monopolistic market is still a functioning market, the resulting price may lead to some governments not being able or willing to purchase the data, and may therefore cause social welfare losses. Nonetheless, monopolistic pricing may still be justified from an innovation perspective: it strengthens incentives to invest in more and better data collection systems and thereby increases the supply of data in the long run. In some cases, the data seller may be in a position to price-discriminate between commercial buyers and a public body, charging a lower price to the latter since the data would not be used for commercial purposes.
  • High transaction costs and perceived risks. An important barrier to data sharing comes from the ex-ante costs of finding a suitable data sharing partner, negotiating a contractual arrangement, and re-formatting and cleaning the data, among others. Potentially interested public bodies may not be aware of available datasets, or may not be in a position to handle them or understand their advantages and disadvantages. There may also be ex-post risks related to uncertainties in the quality and/or usefulness of the data, the technical implementation of the data sharing deal, ensuring compliance with the agreed conditions, the risk of data leaks to unauthorized third parties and the exposure of personal and confidential data.
  • Lack of incentives. Firms may be reluctant to share data with governments because doing so might have a negative impact on them. This could be due to suspicions that the data delivered might be used to implement market regulations and to enforce competition rules that could negatively affect firms’ profits. Moreover, if firms share data with government under preferential conditions, they may have difficulties justifying the foregone profit to shareholders, since the benefits generated by better policies or public services fuelled by the private data will accrue to society as a whole and are often difficult to express in monetary terms. Finally, firms might be afraid of being put at a competitive disadvantage if they provide data to public bodies – perhaps under preferential conditions – and their competitors do not.

Several mechanisms could be designed to address the barriers that may be holding back B2G data sharing initiatives. One would be to provide stronger incentives for the data supplier firm to engage in this type of transaction. These incentives can be direct, i.e. monetary, or indirect, i.e. reputational (e.g. as part of corporate social responsibility programmes). Another would be to ensure the data transfer by making the transaction mandatory, with fair cost compensation. An intermediate way would be based on solutions that seek to facilitate voluntary B2G operations without mandating them, for example by reducing the transaction costs and perceived risks for the data supplier, e.g. by setting up trusted data intermediary platforms or appropriate contractual provisions. A possible EU governance framework for B2G data sharing operations could cover these options….(More)”.
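
The welfare argument behind the monopolistic-data-markets barrier above can be made concrete with a toy calculation. The sketch below is purely illustrative and uses hypothetical demand and cost figures that are not drawn from the paper: with linear demand and a constant marginal cost, pricing at the monopoly level excludes buyers who value the data above its cost but below that price, which is the deadweight loss the authors point to.

```python
# Illustrative sketch only: hypothetical linear demand and constant marginal cost,
# showing how monopoly pricing of data can exclude some public-sector buyers and
# create a deadweight loss relative to pricing at cost.

a, b = 100.0, 1.0   # inverse demand P = a - b*Q (hypothetical willingness to pay)
c = 10.0            # marginal cost of supplying one more data licence (hypothetical)

# Monopoly outcome: marginal revenue (a - 2*b*Q) equals marginal cost.
q_monopoly = (a - c) / (2 * b)
p_monopoly = a - b * q_monopoly

# Welfare-maximising benchmark: price at marginal cost.
q_at_cost = (a - c) / b

# Buyers valuing the data between c and p_monopoly are priced out;
# the lost surplus is the classic deadweight-loss triangle.
deadweight_loss = 0.5 * (p_monopoly - c) * (q_at_cost - q_monopoly)

print(f"monopoly price: {p_monopoly:.1f}, quantity sold: {q_monopoly:.1f}")
print(f"quantity at cost-based pricing: {q_at_cost:.1f}")
print(f"deadweight loss: {deadweight_loss:.1f}")
```

With these made-up numbers the monopoly price is 55, roughly half of the potential buyers are priced out, and the forgone surplus is the kind of welfare loss the paper argues could justify intervention.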

Building Capacity for Evidence-Informed Policy-Making


OECD Report: “This report analyses the skills and capacities governments need to strengthen evidence-informed policy-making (EIPM) and identifies a range of possible interventions that are available to foster greater uptake of evidence. Increasing governments’ capacity for evidence-informed policy-making is a critical part of good public governance. However, an effective connection between the supply and the demand for evidence in the policy-making process remains elusive. This report offers concrete tools and a set of good practices for how the public sector can support senior officials, experts and advisors working at the political/administrative interface. This support entails investing in capability, opportunity and motivation, and working through behavioural changes. The report identifies a core skillset for EIPM at the individual level, including the capacity for understanding, obtaining, assessing, using and applying evidence, and for engaging with stakeholders, which was developed in collaboration with the European Commission Joint Research Centre. It also identifies a set of capacities at the organisational level that can be put in place across the machinery of government, as well as the role of interventions, strategies and tools to strengthen these capacities. The report concludes with a set of recommendations to assist governments in building their capacities…(More)”.

The Razor’s Edge: Liberalizing the Digital Surveillance Ecosystem


Report by CNAS: “The COVID-19 pandemic is accelerating global trends in digital surveillance. Public health imperatives, combined with opportunism by autocratic regimes and authoritarian-leaning leaders, are expanding personal data collection and surveillance. This tendency toward increased surveillance is taking shape differently in repressive regimes, open societies, and the nation-states in between.

China, run by the Chinese Communist Party (CCP), is leading the world in using technology to enforce social control, monitor populations, and influence behavior. Part of maximizing this control depends on data aggregation and a growing capacity to link the digital and physical world in real time, where online offenses result in brisk repercussions. Further, China is increasing investments in surveillance technology and attempting to influence the patterns of technology’s global use through the export of authoritarian norms, values, and governance practices. For example, China champions its own technology standards to the rest of the world, while simultaneously peddling legislative models abroad that facilitate access to personal data by the state. Today, the COVID-19 pandemic offers China and other authoritarian nations the opportunity to test and expand their existing surveillance powers internally, as well as make these more extensive measures permanent.

Global swing states are already exhibiting troubling trends in their use of digital surveillance, including establishing centralized, government-held databases and trading surveillance practices with authoritarian regimes. Amid the pandemic, swing states like India seem to be taking cues from autocratic regimes by mandating the download of government-enabled contact-tracing applications. Yet, for now, these swing states appear responsive to their citizenry and sensitive to public agitation over privacy concerns.

Open societies and democracies can exhibit global surveillance trends similar to those seen in authoritarian regimes and swing states, including the expansion of digital surveillance in the name of public safety and growing private sector capabilities to collect and analyze data on individuals. Yet these trends toward greater surveillance still occur within the context of pluralistic, open societies that feature ongoing debates about the limits of surveillance. However, the pandemic stands to shift the debate in these countries from skepticism over personal data collection to wider acceptance. Thus far, the spectrum of responses to public surveillance reflects the diversity of democracies’ citizenry and processes….(More)”.

Bringing Structure and Design to Data Governance


Report by John Wilbanks et al: “Before COVID-19 took over the world, the Governance team at Sage Bionetworks had started working on an analysis of data governance structures and systems to be published as a “green paper” in late 2020. Today we’re happy to publicly release that paper, Mechanisms to Govern Responsible Sharing of Open Data: A Progress Report.

In the paper, we provide a landscape analysis of models of governance for open data sharing based on our observations in the biomedical sciences. We offer an overview of those observations and show areas where we think this work can expand to supply further support for open data sharing outside the sciences.

The central argument of this paper is that the “right” system of governance is determined by first understanding the nature of the collaborative activities intended. These activities map to types of governance structures, which in turn can be built out of standardized parts — what we call governance design patterns. In this way, governance for data science can be easy to build, follow key laws and ethics regimes, and enable innovative models of collaboration. We provide an initial survey of structures and design patterns, as well as examples of how we leverage this approach to rapidly build out ethics-centered governance in biomedical research.

While there is no one-size-fits-all solution, we argue for learning from ongoing data science collaborations and building on existing standards and tools. And in so doing, we argue for data governance as a discipline worthy of expertise, attention, standards, and innovation.

We chose to call this report a “green paper” in recognition of its maturity and coverage: it’s a snapshot of our data governance ecosystem in biomedical research, not the world of all data governance, and the entire field of data governance is in its infancy. We have licensed the paper under CC-BY 4.0 and published it on GitHub via Manubot in hopes that the broader data governance community might fill in holes we left, correct mistakes we made, add references and toolkits and reference implementations, and generally treat this as a framework for talking about how we share data…(More)”.

Regulating Social Media: The Fight Over Section 230 and Beyond


Report by Paul M. Barrett: “Recently, Section 230 of the Communications Decency Act of 1996 has come under sharp attack from members of both political parties, including presidential candidates Donald Trump and Joe Biden. The foundational law of the commercial internet, Section 230 does two things: It protects platforms and websites from most lawsuits related to content posted by third parties. And it guarantees this shield from liability even if the platforms and sites actively police the content they host. This protection has encouraged internet companies to innovate and grow, even as it has raised serious questions about whether social media platforms adequately self-regulate harmful content. In addition to the assaults by Trump and Biden, members of Congress have introduced a number of bills designed to limit the reach of Section 230. Some critics have asserted unrealistically that repealing or curbing Section 230 would solve a wide range of problems relating to internet governance. These critics also have played down the potentially dire consequences that repeal would have for smaller internet companies. Academics, think tank researchers, and others outside of government have made a variety of more nuanced proposals for revising the law. We assess these ideas with an eye toward recommending and integrating the most promising ones. Our conclusion is that Section 230 ought to be preserved—but that it can be improved…(More)”

How to See What the World Is Teaching Us About COVID-19


Essay by Karabi Acharya: “For the global learning team at the Robert Wood Johnson Foundation, it has been exciting to see people looking at what the world can teach us, whether that be how China is handling COVID-19, South Korea’s drive-through testing, or New Zealand’s elimination of the virus under Jacinda Ardern’s leadership. Yet in a survey of US-based foundations conducted by Candid in early 2020, 73 percent of respondents reported that their domestic grantmaking was rarely or not at all informed or inspired by ideas and solutions from around the globe and beyond US borders.

These practices may be shifting. Those of us working in philanthropy, government, and social change are trying to learn as much about COVID-19 as possible, and that naturally includes looking abroad. Yet what will we actually see when we do? Too often, our vision is obscured by bias, and as we try to distinguish news from noise, good intentions are often not enough. We must ask ourselves critical questions, and train ourselves to overcome our biases.

Here are four ways to see the world in a new light, as we look to come out of the pandemic’s darkness:

Seeing Beyond the Familiar

COVID-19 knows no borders, and neither do good ideas. But too often, we are limited by what has been called the “country of origin effect,” a psychological effect in which people judge the quality and relevance of an object or idea by the country it comes from. In short, we tend to look for ideas from countries that are demographically, culturally, economically, or politically similar to us. In the US, this can mean we overvalue learning from Europe and undervalue learning from low- and middle-income countries.

Yet countries like Nigeria have much to teach us about contact tracing and mitigation from their experience eradicating the Ebola outbreak, just as Ghana’s innovative testing and taxation policies (including a three-month tax holiday for health care workers) are balancing the protection of health with the economy. In example after example of necessity being the mother of invention, African nations are leading the way in innovation: developing low-cost tests for under $1, using Zipline drones to transport the tests to testing sites, and leveraging their cashless digital payment infrastructure to facilitate social distancing. Another often-ignored source of inspiration is Indigenous cultural practices, where ideas and practices centered around collective well-being can be instructive for us as we tackle issues of inequity arising out of COVID…

In other words, we look to other countries with the hope that doing so will change how we see our own, opening our imaginations to new ideas, solutions, and futures. This is only possible if we can overcome our biases that impair our ability to see the solutions around us. 

COVID-19 will be studied for generations to come. But what the world will learn will depend on what we were able to see today. Did we seek out solutions from every corner of the world? Did we bring on the journey those who would most benefit from what the world had to offer? Did we recognize the underlying conditions that exacerbate inequity or help overcome it? Was our imagination strong enough to see how we can create the kind of society that allows everyone the opportunity to live healthy and happier lives?…(More)”

UK’s National Data Strategy


DCMS (UK): “…With the increasing ascendance of data, it has become ever more important that the government removes the unnecessary barriers that prevent businesses and organisations from accessing such information.

The importance of data sharing was demonstrated during the first few months of the coronavirus pandemic, when government departments, local authorities, charities and the private sector came together to provide essential services. One notable example is the Vulnerable Person Service, which in a very short space of time enabled secure data-sharing across the public and private sectors to provide millions of food deliveries and access to priority supermarket delivery slots for clinically extremely vulnerable people.

Aggregation of data from different sources can also lead to new insights that otherwise would not have been possible. For example, the Connected Health Cities project anonymises and links data from different health and social care services, providing new insights into the way services are used.

Vitally, data sharing can also fuel growth and innovation. For new and innovating organisations, increasing data availability will mean that they, too, will be able to gain better insights from their work and access new markets – from charities able to pool beneficiary data to better evaluate the effectiveness of interventions, to new entrants able to access new markets. Often this happens as part of commercial arrangements; in other instances government has sought to intervene where there are clear consumer benefits, such as in relation to Open Banking and Smart Data. Government has also invested in the research and development of new mechanisms for better data sharing, such as the Office for AI and Innovate UK’s partnership with the Open Data Institute to explore data trusts.

However, our call for evidence, along with engagement with stakeholders, has identified a range of barriers to data availability, including:

  • a culture of risk aversion
  • issues with current licensing regulations
  • market barriers to greater re-use, including data hoarding and differential market power
  • inconsistent formatting of public sector data
  • issues pertaining to the discoverability of data
  • privacy and security concerns
  • the benefits relating to increased data sharing not always being felt by the organisation incurring the cost of collection and maintenance

This is a complex environment, and heavy-handed intervention may have the unwanted effect of reducing incentives to collect, maintain and share data for the benefit of the UK. It is clear that any way forward must be carefully considered to avoid unintended negative consequences. There is a balance to be struck between maintaining appropriate commercial incentives to collect data and ensuring that data can be used widely for the benefit of the UK. For personal data, we must also take account of the balance between individual rights and public benefit.

This is a new issue for all digital economies that has come to the fore as data has become a significant modern economic asset. Our approach will take account of those incentives, and consider how innovation can overcome perceived barriers to availability. For example, access can be limited to users with specific characteristics, by licence or regulator accreditation; data can be shared within a collaborating group of organisations; and there may also be value in creating and sharing synthetic data to support research and innovation, as well as in other privacy-enhancing technologies and techniques….(More)”.
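
As a rough illustration of one option mentioned above, the minimal sketch below shows what “synthetic data” can mean in practice, under strong simplifying assumptions (the dataset, column names and distributions are all hypothetical): fit simple per-column summaries of a real table and sample artificial records that mimic its aggregate patterns without reproducing any real row. Real deployments would rely on far more careful methods, ideally with formal privacy guarantees.

```python
# Minimal synthetic-data sketch (illustrative only; hypothetical columns and values).
# Fits simple per-column models - a log-normal for a numeric column and category
# frequencies for a categorical one - then samples artificial records.
# Correlations between columns are deliberately ignored here for simplicity.
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a real data holder's table (e.g. journey lengths and regions).
real_journey_km = rng.lognormal(mean=1.5, sigma=0.6, size=1_000)
real_region = rng.choice(["north", "midlands", "south"], p=[0.3, 0.3, 0.4], size=1_000)

# "Fit": record only aggregate summaries, never the raw rows.
log_mu, log_sigma = np.log(real_journey_km).mean(), np.log(real_journey_km).std()
regions, counts = np.unique(real_region, return_counts=True)
region_probs = counts / counts.sum()

# "Sample": generate synthetic records from the summaries alone.
n_synth = 500
synth_journey_km = rng.lognormal(mean=log_mu, sigma=log_sigma, size=n_synth)
synth_region = rng.choice(regions, p=region_probs, size=n_synth)

print("real mean km:", round(real_journey_km.mean(), 2),
      "synthetic mean km:", round(synth_journey_km.mean(), 2))
```

The design choice here is deliberately naive: sampling each column independently ignores relationships between columns, which real synthetic-data tools try to preserve while still protecting individual records.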

Law and Technology Realism


Paper by Thibault Schrepel: “One may identify two current trends in the field of “Law and Technology.” The first trend concerns technological determinism. Some argue that technology is deterministic: the state of technological advancement is the determining factor of society. Others oppose that view, claiming that it is society that shapes technology. The second trend concerns technological neutrality. Some say that technology is neutral, meaning that the effects of technology depend entirely on the social context. Others defend the opposite: they view the effects of technology as being inevitable (regardless of the society in which it is used).

Figure 1

While it is commonly accepted that technology is deterministic, I am under the impression that a majority of “Law and Technology” scholars also believe that technology is non-neutral. It follows that, according to this dominant view, (1) technology drives society in good or bad directions (determinism), and that (2) certain uses of technology may lead to the reduction or enhancement of the common good (non-neutrality). Consequently, this leads to top-down tech policies where the regulator has the impossible burden of helping society control and orient technology to the best possible extent.

This article is deterministic and non-neutral.

But, here’s the catch. Most of today’s doctrine focuses almost exclusively on the negativity brought by technology (read Nick Bostrom, Frank Pasquale, Evgeny Morozov). Sure, these authors mention a few positive aspects, but still end up focusing on the negative ones. They’re asking to constrain technology on that sole basis. With this article, I want to raise another point: technology determinism can also drive society by providing solutions to centuries-old problems. In and of itself. This is not technological solutionism, as I am not arguing that technology can solve all of mankind’s problems, but it is not anti-solutionism either. I fear the extremes, anyway.

To make my point, I will discuss the issue addressed by Albert Hirschman in his famous book Exit, Voice, and Loyalty (Harvard University Press, 1970). Hirschman, at the time Professor of Economics at Harvard University, introduces the distinction between “exit” and “voice.” With exit, an individual exhibits her or his disagreement as a member of a group by leaving the group. With voice, the individual stays in the group but expresses her or his dissatisfaction in the hope of changing its functioning. Hirschman summarizes his theory on page 121, with the understanding that the optimal situation for any individual is to be capable of both “exit” and “voice“….(More)”.

How Competition Impacts Data Privacy


Paper by Aline Blankertz: “A small number of large digital platforms increasingly shape the space for most online interactions around the globe and they often act with hardly any constraint from competing services. The lack of competition puts those platforms in a powerful position that may allow them to exploit consumers and offer them limited choice. Privacy is increasingly considered one area in which the lack of competition may create harm. Because of these concerns, governments and other institutions are developing proposals to expand the scope for competition authorities to intervene to limit the power of the large platforms and to revive competition.  


The first case to explicitly address anticompetitive harm to privacy is the German Bundeskartellamt’s case against Facebook, in which the authority argues that imposing bad privacy terms can amount to an abuse of dominance. Since that case started in 2016, more cases have dealt with the link between competition and privacy. For example, the proposed Google/Fitbit merger has raised concerns about sensitive health data being merged with existing Google profiles, and Apple is under scrutiny for not sharing certain personal data while using it for its own services.

However, addressing bad privacy outcomes through competition policy is effective only if those outcomes are caused, at least partly, by a lack of competition. Six distinct mechanisms can be distinguished through which competition may affect privacy, as summarized in Table 1. These mechanisms constitute different hypotheses about how less competition may influence privacy outcomes, leading either to worse privacy in different ways (mechanisms 1-5) or even to better privacy (mechanism 6). The table also summarizes the available evidence on whether and to what extent the hypothesized effects are present in actual markets….(More)”.

The Potential of Open Digital Ecosystems


About: “Omidyar Network India (ONI), in partnership with Boston Consulting Group (BCG), has undertaken a study to reimagine digital platforms for the public good, with the aim of building a shared narrative around digital platforms and developing a holistic roadmap to foster their systematic adoption.

This study has especially benefited from collaboration with the Ministry of Electronics and Information Technology (MeitY), Government of India. It builds on the thinking presented in the public consultation whitepaper on ‘Strategy for National Open Digital Ecosystems (NODEs)’ published by MeitY in February 2020, to which ONI and BCG have contributed.

This website outlines the key findings of the study and introduces a new paradigm, Open Digital Ecosystems (ODEs), which recognizes the importance of a strong governance framework as well as the community of stakeholders that make them effective….(More)”.