The GDPR effect: How data privacy regulation shaped firm performance globally


Paper by Carl Benedikt Frey and Giorgio Presidente:  “…To measure companies’ exposure to GDPR, we exploit international input-output tables and compute the shares of output sold to EU markets for each country and 2-digit industry. We then construct a shift-share instrument interacting this share with a dummy variable taking the value one from 2018 onwards.
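
To make the construction concrete, here is a minimal sketch of how such an exposure measure and shift-share instrument could be computed from a long-format input-output table. The column names (origin_country, industry, destination_country, output_value, year) are hypothetical, and the paper's actual data construction may differ in detail.

```python
# Sketch only: compute each country-industry's share of output sold to EU
# markets, then interact it with a post-2018 dummy to form the instrument.
import pandas as pd

def eu_exposure_share(io_df, eu_countries):
    # io_df: one row per (origin_country, industry, destination_country)
    # with the value of output sold to that destination.
    total = io_df.groupby(["origin_country", "industry"])["output_value"].sum()
    to_eu = (
        io_df[io_df["destination_country"].isin(eu_countries)]
        .groupby(["origin_country", "industry"])["output_value"]
        .sum()
    )
    share = (to_eu / total).fillna(0.0).rename("eu_share")
    return share.reset_index()

def shift_share_instrument(panel, shares):
    # panel: country-industry-year (or firm-year) observations.
    out = panel.merge(shares, on=["origin_country", "industry"], how="left")
    out["post_gdpr"] = (out["year"] >= 2018).astype(int)       # dummy = 1 from 2018 onwards
    out["gdpr_exposure"] = out["eu_share"] * out["post_gdpr"]  # shift-share instrument
    return out
```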

Based on this approach, we find both channels discussed above to be quantitatively important, though the cost channel consistently dominates. On average, across our full sample, companies targeting EU markets saw an 8% reduction in profits and a relatively modest 2% decrease in sales (Figure 1). This suggests that earlier studies, which have focused on online outcomes or proxies of sales, provide an incomplete picture since companies have primarily been adversely affected through surging compliance costs. 

While systematic data on firms’ IT purchases are hard to come by, we can explore how companies developing digital technologies have responded to GDPR. Indeed, taking a closer look at some recent patent documents, we note that these include applications for technologies like a “system and method for providing general data protection regulation (GDPR) compliant hashing in blockchain ledgers”, which guarantees a user’s right to be forgotten. Another example is a ‘Data Consent Manager’, a computer-implemented method for managing consent for sharing data….
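
The patent titles hint at a familiar design pattern for reconciling append-only ledgers with erasure rights: keep personal data off-chain and commit only a salted hash to the ledger. The sketch below illustrates that general pattern; it is not the patented method, and all names are invented for illustration.

```python
# Illustrative pattern only: the ledger stores a salted hash commitment, while
# the personal data sits in an erasable off-chain store. Deleting the off-chain
# record (and its salt) leaves an on-chain hash that can no longer be linked
# back to the individual, approximating the "right to be forgotten".
import hashlib
import os

offchain_store = {}  # record_id -> (salt, personal_data), erasable storage
ledger = []          # append-only list of hash commitments ("on-chain")

def record_personal_data(record_id, personal_data):
    salt = os.urandom(16)
    offchain_store[record_id] = (salt, personal_data)
    commitment = hashlib.sha256(salt + personal_data).hexdigest()
    ledger.append(commitment)  # only the commitment touches the immutable ledger
    return commitment

def forget(record_id):
    # Erase the salt and the data; the remaining ledger entry is unlinkable.
    offchain_store.pop(record_id, None)
```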

While the results reported above show that GDPR has reduced firm performance on average, they do not reveal how different types of firms have been affected. As is well-known, large companies have more technical and financial resources to comply with regulations (Brill 2011), invest more in lobbying (Bombardini 2008), and might be better placed to obtain consent for personal data processing from individual consumers (Goldfarb and Tucker 2011). For example, Facebook has reportedly hired some 1,000 engineers, managers, and lawyers globally in response to the new regulation. It also doubled its EU lobbying budget in 2017 relative to the previous year, when GDPR was announced. Indeed, according to LobbyFacts.eu, Google, Facebook and Apple now rank among the five biggest corporate spenders on lobbying in the EU, with annual budgets in excess of €3.5 million.

While these are significant costs that might reduce profits, the impact of the GDPR on the fortunes of big tech is ambiguous. As The New York Times writes, “Whether Europe’s tough approach is actually crimping the global tech giants is unclear… Amazon, Apple, Google and Facebook have continued to grow and add customers”. Indeed, by being better able to cope with the burdens of the regulation, these companies may have increased their market share at the expense of smaller companies (Johnson et al. 2020, Peukert et al. 2020). …(More)”.

Bringing Open Source to the Global Lab Bench


Article by Julieta Arancio and Shannon Dosemagen: “In 2015, Richard Bowman, an optics scientist, began experimenting with 3D printing a microscope as a single piece in order to reduce the time and effort of reproducing the design. Soon after, he started the OpenFlexure project, an open-license 3D-printed microscope. The project quickly took over his research agenda and grew into a global community of hundreds of users and developers, including professional scientists, hobbyists, community scientists, clinical researchers, and teachers. Anyone with access to a 3D printer can download open-source files from the internet to create microscopes that can be used for doing soil science research, detecting diseases such as malaria, or teaching microbiology, among other things. Today, the project is supported by a core team at the Universities of Bath and Cambridge in the United Kingdom, as well as in Tanzania by the Ifakara Health Institute and Bongo Tech & Research Labs, an engineering company. 

OpenFlexure is one of many open science hardware projects that are championed by the Gathering for Open Science Hardware (GOSH), a transnational network of open science hardware advocates. Although there are differences in practice, open hardware projects operate on similar principles to open-source software, and they span disciplines ranging from nanotechnology to environmental monitoring. GOSH defines the field as “any piece of hardware used for scientific investigations that can be obtained, assembled, used, studied, modified, shared, and sold by anyone. It includes standard lab equipment as well as auxiliary materials, such as sensors, biological reagents, analog and digital electronic components.” Compared to an off-the-shelf microscope, which may cost thousands of dollars, an OpenFlexure microscope may cost a few hundred. By being significantly cheaper and easier to maintain, open hardware enables more people in more places to do science….(More)”.

An EU Strategy on Standardisation


Press Release: “Today, the Commission is presenting a new Standardisation Strategy outlining our approach to standards within the Single Market as well as globally. The Strategy is accompanied by a proposal for an amendment to the Regulation on standardisation, a report on its implementation, and the 2022 annual Union work programme for European standardisation. This new Strategy aims to strengthen the EU’s global competitiveness, to enable a resilient, green and digital economy and to enshrine democratic values in technology applications.

Standards are the silent foundation of the EU Single Market and global competitiveness. They help manufacturers ensure the interoperability of products and services, reduce costs, improve safety and foster innovation. Standards are an invisible but fundamental part of our daily life: from Wi-Fi frequencies, to connected toys or ski bindings, just to mention a few. Standards give confidence that a product or a service is fit for purpose, is safe and will not harm people or the environment. Compliance with harmonised standards guarantees that products are in line with EU law.

The fast pace of innovation, our green and digital ambitions and the implications of technological standards for our EU democratic values require an increasingly strategic approach to standardisation. The EU’s ambitions towards a climate neutral, resilient and circular economy cannot be delivered without European standards. Having a strong global footprint in standardisation activities and leading the work in key international fora and institutions will be essential for the EU to remain a global standard-setter. By setting global standards, the EU exports its values while providing EU companies with an important first-mover advantage.

Executive Vice-President for a Europe Fit for the Digital Age, Margrethe Vestager, said: “Ensuring that data is protected in artificial intelligence or ensuring that mobile devices are secure from hacking, rely on standards and must be in line with EU democratic values. In the same way, we need standards for the roll-out of important investment projects, like hydrogen or batteries, and to valorise innovation investment by providing EU companies with an important first-mover advantage.”…(More)”.

Leveraging Non-Traditional Data For The Covid-19 Socioeconomic Recovery Strategy


Article by Deepali Khanna: “To this end, it is opportune to ask the following questions: Can we harness the power of data routinely collected by companies—including transportation providers, mobile network operators, social media networks and others—for the public good? Can we bridge the data gap to give governments access to data, insights and tools that can inform national and local response and recovery strategies?

There is increasing recognition that traditional and non-traditional data should be seen as complementary resources. Non-traditional data can bring significant benefits in bridging existing data gaps but must still be calibrated against benchmarks based on established traditional data sources. These traditional datasets are widely seen as reliable because they are subject to stringent, well-established international and national standards. However, they are often limited in frequency and granularity, especially in low- and middle-income countries, given the cost and time required to collect such data. For example, official economic indicators such as GDP, household consumption and consumer confidence may be available only at the national or regional level, with quarterly updates…

In the Philippines, UNDP, with support from The Rockefeller Foundation and the government of Japan, recently set up the Pintig Lab: a multidisciplinary network of data scientists, economists, epidemiologists, mathematicians and political scientists, tasked with supporting data-driven crisis response and development strategies. In early 2021, the Lab conducted a study which explored how household spending on consumer-packaged goods, or fast-moving consumer goods (FMCGs), can be used to assess the socioeconomic impact of Covid-19 and identify heterogeneities in the pace of recovery across households in the Philippines. The Philippine National Economic Development Agency is now in the process of incorporating this data into its GDP forecasting, as an additional input to its predictive models for consumption. Further, this data can be combined with other non-traditional datasets such as credit card or mobile wallet transactions, and with machine learning techniques for higher-frequency GDP nowcasting, to allow for more nimble and responsive economic policies that can both absorb and anticipate the shocks of crisis….(More)”.
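
To illustrate the nowcasting idea in the simplest possible terms, the sketch below fits a linear model of official consumption growth on higher-frequency proxies such as FMCG spending and card transactions, then predicts the current quarter before the official figures arrive. It is a toy example with hypothetical column names, not the Pintig Lab's actual methodology.

```python
# Toy nowcast: regress the official series on higher-frequency proxy indicators
# observed in past quarters, then predict the not-yet-published current quarter.
import pandas as pd
from sklearn.linear_model import LinearRegression

def nowcast_consumption(history, latest_quarter):
    # history: past quarters with both proxy indicators and the official outcome.
    # latest_quarter: the current quarter's proxies, available ahead of the release.
    features = ["fmcg_spending_growth", "card_transaction_growth", "mobile_wallet_growth"]
    model = LinearRegression()
    model.fit(history[features], history["official_consumption_growth"])
    return float(model.predict(latest_quarter[features])[0])
```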

How digital transformation is driving economic change


Blog (and book) by Zia Qureshi: “We are living in a time of exciting technological innovations. Digital technologies are driving transformative change. Economic paradigms are shifting. The new technologies are reshaping product and factor markets and profoundly altering business and work. The latest advances in artificial intelligence and related innovations are expanding the frontiers of the digital revolution. Digital transformation is accelerating in the wake of the COVID-19 pandemic. The future is arriving faster than expected.

A recently published book, “Shifting Paradigms: Growth, Finance, Jobs, and Inequality in the Digital Economy,” examines the implications of the unfolding digital metamorphosis for economies and public policy agendas….

Firms at the technological frontier have broken away from the rest, acquiring dominance in increasingly concentrated markets and capturing the lion’s share of the returns from the new technologies. While productivity growth in these firms has been strong, it has stagnated or slowed in other firms, depressing aggregate productivity growth. Increasing automation of low- to middle-skill tasks has shifted labor demand toward higher-level skills, hurting wages and jobs at the lower end of the skill spectrum. With the new technologies favoring capital, winner-take-all business outcomes, and higher-level skills, the distribution of both capital and labor income has tended to become more unequal, and income has been shifting from labor to capital.

One important reason for these outcomes is that policies and institutions have been slow to adjust to the unfolding transformations. To realize the promise of today’s smart machines, policies need to be smarter too. They must be more responsive to change to fully capture potential gains in productivity and economic growth and address rising inequality as technological disruptions create winners and losers.

As technology reshapes markets and alters growth and distributional dynamics, policies must ensure that markets remain inclusive and support wide access to the new opportunities for firms and workers. The digital economy must be broadened to disseminate new technologies and opportunities to smaller firms and wider segments of the labor force…(More)”.

GDP’s Days Are Numbered


Essay by Diane Coyle: “How should we measure economic success? Criticisms of conventional indicators, particularly gross domestic product, have abounded for years, if not decades. Environmentalists have long pointed out that GDP omits the depletion of natural assets, as well as negative externalities such as global warming. And its failure to capture unpaid but undoubtedly valuable work in the home is another glaring omission. But better alternatives may soon be at hand.

In 2009, a commission led by Joseph Stiglitz, Amartya Sen, and Jean-Paul Fitoussi spurred efforts to find alternative ways to gauge economic progress by recommending a “dashboard” of indicators. Since then, economists and statisticians, working alongside natural scientists, have put considerable effort into developing rigorous wealth-based prosperity metrics, particularly concerning natural assets. The core idea is to create a comprehensive national balance sheet to demonstrate that economic progress today is illusory when it comes at the expense of future living standards.
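
A stylized example of that balance-sheet logic: measured output can grow while comprehensive wealth (produced, human, and natural capital combined) declines, signalling that today's progress is borrowed from tomorrow. The figures below are invented purely for illustration.

```python
# Stylized arithmetic only: GDP growth alongside a falling comprehensive wealth
# total flags progress that comes at the expense of future living standards.
def change_in_comprehensive_wealth(opening, closing):
    # opening/closing: national balance sheets mapping asset class -> value.
    assets = ("produced_capital", "human_capital", "natural_capital")
    return sum(closing[a] - opening[a] for a in assets)

opening = {"produced_capital": 100.0, "human_capital": 250.0, "natural_capital": 80.0}
closing = {"produced_capital": 108.0, "human_capital": 252.0, "natural_capital": 66.0}

gdp_growth = 0.03                                         # output rose this year...
delta = change_in_comprehensive_wealth(opening, closing)  # ...but wealth fell by 4.0
progress_is_sustainable = delta >= 0                      # False in this example
```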

In an important milestone in March of this year, the United Nations approved a statistical standard relating to the services that nature provides to the economy. That followed the UK Treasury’s publication of a review by the University of Cambridge’s Partha Dasgupta setting out how to integrate nature in general, and biodiversity in particular, into economic analysis. With the consequences of climate change starting to become all too apparent, any meaningful concept of economic success in the future will surely include sustainability.

The next steps in this statistical endeavor will be to incorporate measures of social capital, reflecting the ability of communities or countries to act collectively, and to extend measurement of the household sector. The COVID-19 pandemic has highlighted how crucial this unpaid work is to a country’s economic health. For example, the US Bureau of Labor Statistics intends to develop a more comprehensive concept of living standards that includes the value of such activity….(More)”.

Improving Consumer Welfare with Data Portability


Report by Daniel Castro: “Data protection laws and regulations can contain restrictive provisions, which limit data sharing and use, as well as permissive provisions, which increase it. Data portability is an example of a permissive provision that allows consumers to obtain a digital copy of their personal information from an online service and provide this information to other services. By carefully crafting data portability provisions, policymakers can enable consumers to obtain more value from their data, create new opportunities for businesses to innovate with data, and foster competition….(More)”.
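
In practice, a portability provision boils down to exporting a consumer's data in a structured, machine-readable format that another service can ingest. The sketch below shows that mechanic with a purely hypothetical schema; real portability formats and fields would be defined by the services or the regulation itself.

```python
# Hypothetical export/import round trip to illustrate the portability mechanic.
import json

def export_user_data(user_record):
    # The exporting service serializes the consumer's data for hand-off.
    portable = {
        "profile": user_record.get("profile", {}),
        "activity_history": user_record.get("activity", []),
        "preferences": user_record.get("preferences", {}),
    }
    return json.dumps(portable, indent=2)

def import_user_data(portable_json):
    # A receiving service maps the portable format onto its own data model.
    data = json.loads(portable_json)
    return {"profile": data["profile"], "activity": data["activity_history"]}
```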

Articulating Value from Data


Report by the World Economic Forum: “The distinct characteristics and dynamics of data – contextual, relational and cumulative – call for new approaches to articulating its value. Businesses should value data based on cases that go beyond the transactional monetization of data and take into account the broader context, future opportunities to collaborate and innovate, and value created for its ecosystem stakeholders. Doing so will encourage companies to think about the future value data can help generate, beyond the existing data lakes they sit on, and open them up to collaboration opportunities….(More)”.

Strengthening international cooperation on AI


Report by Cameron F. Kerry, Joshua P. Meltzer, Andrea Renda, Alex Engler, and Rosanna Fanni: “Since 2017, when Canada became the first country to adopt a national AI strategy, at least 60 countries have adopted some form of policy for artificial intelligence (AI). The prospect of an estimated boost of 16 percent, or US$13 trillion, to global output by 2030 has led to an unprecedented race to promote AI uptake across industry, consumer markets, and government services. Global corporate investment in AI has reportedly reached US$60 billion in 2020 and is projected to more than double by 2025.

At the same time, the work on developing global standards for AI has led to significant developments in various international bodies. These encompass both technical aspects of AI (in standards development organizations (SDOs) such as the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), and the Institute of Electrical and Electronics Engineers (IEEE) among others) and the ethical and policy dimensions of responsible AI. In addition, in 2018 the G-7 agreed to establish the Global Partnership on AI, a multistakeholder initiative working on projects to explore regulatory issues and opportunities for AI development. The Organization for Economic Cooperation and Development (OECD) launched the AI Policy Observatory to support and inform AI policy development. Several other international organizations have become active in developing proposed frameworks for responsible AI development.

In addition, there has been a proliferation of declarations and frameworks from public and private organizations aimed at guiding the development of responsible AI. While many of these focus on general principles, the past two years have seen efforts to put principles into operation through fully-fledged policy frameworks. Canada’s directive on the use of AI in government, Singapore’s Model AI Governance Framework, Japan’s Social Principles of Human-Centric AI, and the U.K. guidance on understanding AI ethics and safety have been frontrunners in this sense; they were followed by the U.S. guidance to federal agencies on regulation of AI and an executive order on how these agencies should use AI. Most recently, the EU proposal for adoption of regulation on AI has marked the first attempt to introduce a comprehensive legislative scheme governing AI.

In exploring how to align these various policymaking efforts, we focus on the most compelling reasons for stepping up international cooperation (the “why”); the issues and policy domains that appear most ready for enhanced collaboration (the “what”); and the instruments and forums that could be leveraged to achieve meaningful results in advancing international AI standards, regulatory cooperation, and joint R&D projects to tackle global challenges (the “how”). At the end of this report, we list the topics that we propose to explore in our forthcoming group discussions….(More)”

Data Science for Social Good: Philanthropy and Social Impact in a Complex World


Book edited by Ciro Cattuto and Massimo Lapucci: “This book is a collection of insights by thought leaders at first-mover organizations in the emerging field of “Data Science for Social Good”. It examines the application of knowledge from computer science, complex systems, and computational social science to challenges such as humanitarian response, public health, and sustainable development. The book provides an overview of scientific approaches to social impact – identifying a social need, targeting an intervention, measuring impact – and the complementary perspective of funders and philanthropies pushing forward this new sector.

TABLE OF CONTENTS


Introduction; By Massimo Lapucci

The Value of Data and Data Collaboratives for Good: A Roadmap for Philanthropies to Facilitate Systems Change Through Data; By Stefaan G. Verhulst

UN Global Pulse: A UN Innovation Initiative with a Multiplier Effect; By Dr. Paula Hidalgo-Sanchis

Building the Field of Data for Good; By Claudia Juech

When Philanthropy Meets Data Science: A Framework for Governance to Achieve Data-Driven Decision-Making for Public Good; By Nuria Oliver

Data for Good: Unlocking Privately-Held Data to the Benefit of the Many; By Alberto Alemanno

Building a Funding Data Ecosystem: Grantmaking in the UK; By Rachel Rank

A Reflection on the Role of Data for Health: COVID-19 and Beyond; By Stefan E. Germann and Ursula Jasper….(More)”