GDP is getting a makeover — what it means for economies, health and the planet


Article by Ehsan Masood: “The numbers are heading in the wrong direction. If the world continues on its current track, it will fall well short of achieving almost all of the 17 Sustainable Development Goals (SDGs) that the United Nations set to protect the environment and end poverty and inequality by 2030.

The projected grade for:

Eliminating hunger: F.

Ensuring healthy lives for all: F.

Protecting and sustainably using ocean resources: F.

The trends were there before 2020, but then problems increased with the COVID-19 pandemic, war in Ukraine and the worsening effects of climate change. The world is in “a new uncertainty complex”, says economist Pedro Conceição, lead author of the United Nations Human Development Report.

One measure of this is the drastic change in the Human Development Index (HDI), which combines educational outcomes, income and life expectancy into a single composite indicator. After 2019, the index fell for two successive years, the first such decline since its creation in 1990. “I don’t think this is a one-off, or a blip. I think this could be a new reality,” Conceição says.
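The HDI's construction, three dimensions collapsed into one number, can be sketched in a few lines. The sketch below follows the UNDP's post-2010 methodology (a geometric mean of normalized health, education and income indices) using the published goalposts; the input values are purely illustrative, not real country data:

```python
from math import log, prod

def hdi(life_expectancy, mean_years_school, expected_years_school, gni_per_capita):
    """Simplified Human Development Index: the geometric mean of three
    sub-indices (health, education, income), each normalized to [0, 1]
    using the UNDP's published goalposts (post-2010 methodology)."""
    health = (life_expectancy - 20) / (85 - 20)
    education = ((mean_years_school / 15) + (expected_years_school / 18)) / 2
    income = (log(gni_per_capita) - log(100)) / (log(75_000) - log(100))
    # The geometric mean penalizes uneven development across dimensions:
    # a low score in one dimension cannot be fully offset by the others.
    return prod([health, education, income]) ** (1 / 3)

# Illustrative inputs only
print(round(hdi(72.0, 8.5, 12.3, 14_000), 3))  # → 0.72
```

A fall in any one dimension, such as the pandemic-era drop in life expectancy, pulls the whole index down, which is why the HDI registered two successive annual declines.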

UN secretary-general António Guterres is worried. “We need an urgent rescue effort for the SDGs,” he wrote in the foreword to the latest progress report, published in July. Over the past year, Guterres and the heads of big UN agencies, such as the Statistics Division and the UN Development Programme, have been assessing what’s gone wrong and what needs to be done. They’re converging on the idea that it’s time to stop using gross domestic product (GDP) as the world’s main measure of prosperity, and to complement it with a dashboard of indicators, possibly ones linked to the SDGs. If this happens, it would be the biggest shift in how economies are measured since nations first started using GDP in 1953, almost 70 years ago.

Guterres’s is the latest in a crescendo of voices calling for GDP to be dropped as the world’s primary go-to indicator and replaced with a dashboard of metrics. In 2008, then French president Nicolas Sarkozy endorsed such a call from a team of economists, including Nobel laureates Amartya Sen and Joseph Stiglitz.

And in August, the White House announced a 15-year plan to develop a new summary statistic that would show how changes to natural assets — the natural wealth on which economies depend — affect GDP. The idea, according to the project’s main architect, economist Eli Fenichel at the White House Office of Science and Technology Policy, is to help society to determine whether today’s consumption is being accomplished without compromising the future opportunities that nature provides. “GDP only gives a partial and — for many common uses — an incomplete picture of economic progress,” Fenichel says.

The fact that Guterres has made this a priority, amid so many major crises, is a sign that “going beyond GDP has been picked up at the highest level”, says Stefan Schweinfest, the director of the UN Statistics Division, based in New York City…(More)”.

Wicked Problems Might Inspire Greater Data Sharing


Paper by Susan Ariel Aaronson: “In 2021, the United Nations Development Program issued a plea in its 2021 Digital Economy Report: “Global data-sharing can help address major global development challenges such as poverty, health, hunger and climate change. …Without global cooperation on data and information, research to develop the vaccine and actions to tackle the impact of the pandemic would have been a much more difficult task. Thus, in the same way as some data can be public goods, there is a case for some data to be considered as global public goods, which need to be addressed and provided through global governance.” (UNDP: 2021, 178). Global public goods are goods and services with benefits and costs that potentially extend to all countries, people, and generations. Global data sharing can also help solve what scholars call wicked problems — problems so complex that they require innovative, cost-effective and global mitigating strategies. Wicked problems are those that no one knows how to solve without creating further problems. Hence, policymakers must find ways to encourage greater data sharing among entities that hold large troves of various types of data, while protecting that data from theft, manipulation, etc. Many factors impede global data sharing for public good purposes; this analysis focuses on two.
First, policymakers generally don’t think about data as a global public good; they view data as a commercial asset that they should nurture and control. While they may understand that data can serve the public interest, they are more concerned with using data to serve their country’s economic interest. Second, many leaders of civil society and business see the data they have collected as proprietary. So far, many leaders of private entities with troves of data are not convinced that their organizations will benefit from such sharing. At the same time, companies voluntarily share some data for social good purposes.

However, data cannot meet its public good purpose if it is not shared among societal entities. Moreover, if policymakers treat data as a sovereign asset, they are unlikely to encourage data sharing across borders oriented towards addressing shared problems. Consequently, society will be less able to use data both as a commercial asset and as a resource to enhance human welfare. As the Bennett Institute and ODI have argued, “value comes from data being brought together, and that requires organizations to let others use the data they hold.” But that also means the entities that collected the data may not accrue all of the benefits from it (Bennett Institute and ODI: 2020a: 4). In short, private entities are not sufficiently incentivized to share data for the global public good…(More)”.

Accelerating Government Innovation With Leadership and Stimulus Funding


Paper by Jane Wiseman: “With the evolving maturity of innovation offices and digital teams comes the imperative for leaders and managers to provide pathways for these organizations to succeed and work together effectively, embracing new ideas and scaling those that prove effective beyond a prototype or pilot. The availability of a large, one-time infusion of federal funds to support state and local services and programs through the American Rescue Plan Act, the infrastructure law, and other recent legislation provides state and local leaders with a unique opportunity to collaborate with their federal partners and promote innovation that improves the lives of their people. Data and innovation teams can help government be more efficient and effective in spending stimulus funds at the state and local level in the coming years.

In this new report, Jane Wiseman explores various ways that executives can leverage stimulus funding to incentivize success across multiple innovation and data roles, and to drive work from those roles forward into digital service development and delivery. Through close examination of multiple cases in the field, the author develops a framework with specific recommendations for how leaders can create opportunities for innovators to complement each other to the benefit of the public good, including key skills and characteristics that correlate with success.

This report is intended to help leaders of current government innovation groups, including chief data officers, chief digital officers, innovation team leaders, and similar roles, learn from successful models that they can apply directly to their operations to be more effective. The report also provides lessons and recommendations for senior executives in government, such as a cabinet secretary, governor, county executive or mayor, to help them think through possible models of effective practice to support the range of innovation roles and define success…(More)”.

Measuring Small Business Dynamics and Employment with Private-Sector Real-Time Data


Paper by André Kurmann, Étienne Lalé and Lien Ta: “The COVID-19 pandemic has led to an explosion of research using private-sector datasets to measure business dynamics and employment in real time. Yet questions remain about the representativeness of these datasets and how to distinguish business openings and closings from sample churn – i.e., sample entry of already operating businesses and sample exit of businesses that continue operating. This paper proposes new methods to address these issues and applies them to the case of Homebase, a real-time dataset of mostly small service-sector businesses that has been used extensively in the literature to study the effects of the pandemic. We match the Homebase establishment records with information on business activity from Safegraph, Google, and Facebook to assess the representativeness of the data and to estimate the probability of business closings and openings among sample exits and entries. We then exploit the high frequency and geographic detail of the data to study whether small service-sector businesses have been hit harder by the pandemic than larger firms, and the extent to which the Paycheck Protection Program (PPP) helped small businesses keep their workforce employed. We find that our real-time estimates of small business dynamics and employment during the pandemic are remarkably representative and closely fit population counterparts from administrative data that have recently become available. Distinguishing business closings and openings from sample churn is critical for these results. We also find that while employment by small businesses contracted more severely at the beginning of the pandemic than employment by larger businesses, it also recovered more strongly thereafter. In turn, our estimates suggest that the rapid rollout of PPP loans significantly mitigated the negative employment effects of the pandemic. Business closings and openings are a key driver of both results, thus underlining the importance of properly correcting for sample churn…(More)”.
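The correction at the heart of the paper, separating true closings from sample churn, relies on matching sample exits against an external activity signal. A deliberately simplified sketch of that idea (not the authors' actual estimator; the `active_elsewhere_after_exit` flag is a hypothetical stand-in for the Safegraph/Google/Facebook match) might look like:

```python
from dataclasses import dataclass

@dataclass
class SampleExit:
    business_id: str
    # Hypothetical external signal (e.g., foot traffic or online activity
    # from another dataset) observed after the business left the sample
    active_elsewhere_after_exit: bool

def closing_rate(exits: list[SampleExit]) -> float:
    """Share of sample exits that look like true closings: exits with no
    external evidence of continued operation. The remainder is treated
    as sample churn (businesses that kept operating but left the panel)."""
    if not exits:
        return 0.0
    closings = sum(1 for e in exits if not e.active_elsewhere_after_exit)
    return closings / len(exits)

exits = [
    SampleExit("a", active_elsewhere_after_exit=True),   # churn: still operating
    SampleExit("b", active_elsewhere_after_exit=False),  # likely true closing
    SampleExit("c", active_elsewhere_after_exit=False),  # likely true closing
    SampleExit("d", active_elsewhere_after_exit=True),   # churn: still operating
]
print(closing_rate(exits))  # → 0.5
```

Without a correction of this kind, a raw count of panel exits would overstate closings (here, by a factor of two), which is why the authors stress that distinguishing closings from churn is critical to their results.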

Nudging Consumers to Purchase More Sustainably


Article by Erez Yoeli: “Most consumers still don’t choose sustainable products when the option is available. Americans may claim to be willing to pay more for green energy, but while green energy is available in the majority of states (35 of 50 as of 2018, covering roughly 80% of American households), only 14% of households were even aware of the green option, and less than half of those purchased it. Hybrids and electric vehicles are available nationwide, but still amount to just 10% of sales (6.6% and 3.4%, respectively, according to S&P Global’s subscription services).

Now it may be that this gap between virtuous intentions and actions will eventually close. I hope so. But it will certainly need help, because in these situations there is often an insidious behavioral dynamic at work that stops stated good intentions from turning into actual good deeds…

Allow me to illustrate what I mean by “the plausible deniability effect” with an example from a now-classic behavioral economics study. Every year, around the holidays, Salvation Army volunteers collect donations for the needy outside supermarkets and other retail outlets. Researchers Justin Rao, Jim Andreoni, and Hanna Trachtmann teamed up with a Boston chapter of the Salvation Army to test ways of increasing donations.

Taking a supermarket that had two exit/entry points, the team randomly divided the volunteers into two groups. In one group, just one volunteer was assigned to stand in front of one door. For the other group, volunteers were stationed at both doors…(More)”.

Closing the Data Divide for a More Equitable U.S. Digital Economy


Report by Gillian Diebold: “In the United States, access to many public and private services, including those in the financial, educational, and health-care sectors, are intricately linked to data. But adequate data is not collected equitably from all Americans, creating a new challenge: the data divide, in which not everyone has enough high-quality data collected about them or their communities and therefore cannot benefit from data-driven innovation. This report provides an overview of the data divide in the United States and offers recommendations for how policymakers can address these inequalities…(More)”.

Sustaining Open Data as a Digital Common — Design principles for Common Pool Resources applied to Open Data Ecosystems


Paper by Johan Linåker and Per Runeson: “Digital commons are an emerging phenomenon of increasing importance as we enter a digital society. Open data is one example: it is a pivotal input and foundation for many of today’s digital services and applications. Ensuring sustainable provisioning and maintenance of the data therefore becomes even more important.

We aim to investigate how such provisioning and maintenance can be collaboratively performed in the community surrounding a common. Specifically, we look at Open Data Ecosystems (ODEs), a type of community of actors, openly sharing and evolving data on a technological platform.

We use Elinor Ostrom’s design principles for Common Pool Resources as a lens to systematically analyze the governance of earlier reported cases of ODEs using a theory-oriented software engineering framework.

We find that, while natural commons must regulate consumption, digital commons such as open data maintained by an ODE must stimulate both use and data provisioning. Governance needs to enable such stimulus while also ensuring that the collective action can still be coordinated and managed within the frame of available maintenance resources of a community. Subtractability is, in this sense, a concern regarding the resources required to maintain the quality and value of the data, rather than the availability of data. Further, we derive empirically-based recommended practices for ODEs based on the design principles by Ostrom for how to design a governance structure in a way that enables a sustainable and collaborative provisioning and maintenance of the data.

ODEs are expected to play a role in data provisioning that democratizes the digital society and enables innovation by smaller commercial actors. Our empirically based guidelines are intended to support this development…(More)”.

What is the value of data? A review of empirical methods


Policy brief by Diane Coyle and Annabel Manley: “The economy has been transformed by data in recent years. Data-driven firms made up seven of the global top 10 firms by stock market capitalisation in 2021; and across the OECD (Organisation for Economic Co-operation and Development) economies there has been a growing gap in terms of productivity and profitability between firms that use data intensively and the rest (e.g. Brynjolfsson et al 2019; Bajgar et al 2022; Coyle et al 2022). The widespread availability of data and analytics has also begun to extend into the public sector and policymaking, for example with ‘following the science’ – implying intense use of data – becoming a tagline for the handling of the COVID-19 pandemic in the UK and elsewhere.

It is therefore obvious that data has value in an economically meaningful sense. The sources of its value and characteristics of data as an economic asset are discussed at length in our earlier Value of Data report (Coyle et al 2020a). We concluded that there is potential value to the economy as a whole from having the ability to use data, and not just to the organisations that control specific data sets. This appreciation is increasingly reflected in many policy statements of data strategy and the broader debate about the governance of data (e.g. European Parliament 2022). The value of data is also explicitly and implicitly acknowledged by firms that sell data services, and investors who take dataset assets into account in stock market valuations or mergers and acquisitions.

However, despite the broad recognition of its value, and the need to develop appropriate policy frameworks, there is still no consensus method for empirically determining the value of data. Without this, the full potential will not be realised (Verhulst 2018). There are not even many examples of markets for data that would indicate a private valuation (although not the wider social value). Yet estimates of the value of data are needed to determine an appropriate level of investment, as well as a better understanding of how data can contribute value to the economy and how to govern the collection and use of different types of data.

This brief presents an overview of a range of alternative methods for data valuation, including those proposed in the existing literature. This includes some relatively widely used methods and others that are more specialist or preliminary…(More)”.

Designing Data Spaces: The Ecosystem Approach to Competitive Advantage


Open access book edited by Boris Otto, Michael ten Hompel, and Stefan Wrobel: “…provides a comprehensive view on data ecosystems and platform economics from methodical and technological foundations up to reports from practical implementations and applications in various industries.

To this end, the book is structured in four parts: Part I, “Foundations and Contexts”, provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II, “Data Space Technologies”, subsequently details various implementation aspects of IDS and GAIA-X, including, e.g., data usage control, the usage of blockchain technologies, or semantic data integration and interoperability. Next, Part III describes various “Use Cases and Data Ecosystems” from application areas such as agriculture, healthcare, industry, energy, and mobility. Finally, Part IV offers an overview of several “Solutions and Applications”, e.g., products and experiences from companies like Google, SAP, Huawei, T-Systems, Innopay and many more.

Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook to future developments. In doing so, it aims at proliferating the vision of a social data market economy based on data spaces which embrace trust and data sovereignty…(More)”.

Meet the new GDP prototype that tracks inequality


Article by Greg Rosalsky: “…Nearly a century after Kuznets pioneered the use of GDP, economists Thomas Blanchet, Emmanuel Saez, and Gabriel Zucman are trying to revolutionize it. In a new paper titled “Real-Time Inequality,” the economists imagine a new kind of GDP, one that isn’t merely a single number telling us about total economic growth, but a collection of numbers telling us where the gains from this growth are flowing. They already have a working prototype that they’ve published online, and it can provide some important insights about our economy right now…

Gabriel Zucman is an economist at UC Berkeley and the director of the James M. and Cathleen D. Stone Center on Wealth and Income Inequality. He has been working to transform government economic statistics — also called “national accounts” — for almost a decade. He says the national accounts offer the public valuable insights about economic growth. However, Zucman says, “The big problem is these data do not tell you who is benefiting from economic growth.”

America, of course, already has tons of data on inequality. The problem, Zucman says, is it usually takes a year or two for this data to be updated. “It’s not enough to come in two years after the policy battle, and say, ‘Look, this is what happened to inequality,'” Zucman says. “That’s too late.”

Their new project is an effort to fix this. Cobbling together data from a variety of official sources, Zucman and his colleagues have pioneered a method to compute in a more timely fashion how different income groups — like the working class and the middle class — are doing economically. They hope this prototype will inspire the federal government to follow suit and soon “produce numbers about how income is growing for each social group at the exact time when the Bureau of Economic Analysis releases its official GDP growth numbers.”

Zucman envisions a future where this data could inform and shape policy decisions. When considering policies like sending stimulus checks or providing tax relief, Zucman says, policymakers and voters need to know things like “which groups need more support, or whether the government may be actually overshooting, which might lead to inflation.”…(More)”.
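The core accounting step behind such a prototype, scaling micro-data incomes so they sum to the national accounts total and then reporting growth separately by group, can be illustrated with toy numbers. This is a hypothetical simplification of the idea, not the authors' code, and the figures are invented:

```python
def growth_by_group(micro_incomes_t0, micro_incomes_t1,
                    national_total_t0, national_total_t1):
    """Scale each period's micro-data incomes so they sum to the national
    accounts total, then report income growth between the two periods for
    each group. Groups here are simply the bottom and top halves of the
    income distribution."""
    def scaled(incomes, total):
        ranked = sorted(incomes)
        factor = total / sum(ranked)  # align micro data with the aggregate
        return [x * factor for x in ranked]

    t0 = scaled(micro_incomes_t0, national_total_t0)
    t1 = scaled(micro_incomes_t1, national_total_t1)
    half = len(t0) // 2
    groups = {}
    for name, sl in [("bottom 50%", slice(0, half)), ("top 50%", slice(half, None))]:
        g0, g1 = sum(t0[sl]), sum(t1[sl])
        groups[name] = (g1 - g0) / g0  # growth rate of the group's total income
    return groups

# Illustrative: aggregate income grows 5%, but the gains are uneven
t0 = [20, 30, 50, 100]
t1 = [20, 31, 53, 106]
print(growth_by_group(t0, t1, national_total_t0=200, national_total_t1=210))
# → {'bottom 50%': 0.02, 'top 50%': 0.06}
```

In this toy example the headline aggregate grows 5%, yet the bottom half's income grows only 2% while the top half's grows 6%: exactly the kind of distributional detail a single GDP number hides and a "Real-Time Inequality" dashboard would surface.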