Accelerating Government Innovation With Leadership and Stimulus Funding


Paper by Jane Wiseman: “With the evolving maturity of innovation offices and digital teams comes the imperative for leaders and managers to provide pathways for these organizations to succeed and work together effectively, in terms of embracing new ideas and scaling those that prove effective beyond a prototype or pilot. The availability of a large, one-time infusion of federal funds to support state and local services and programs through the American Rescue Plan Act, the Infrastructure law, and other recent laws provides state and local leaders with a unique opportunity to collaborate with their federal partners and promote innovation that improves the lives of their people. Data and innovation teams can help government be more efficient and effective in spending stimulus funds at the state and local level in the coming years.

In this new report, Jane Wiseman explores various ways that executives can leverage stimulus funding to incentivize success across multiple innovation and data roles and to drive forward work from those roles into digital service development and delivery. Through close examination of multiple cases in the field, the author develops a framework with specific recommendations for how leaders can create opportunities for innovators to complement each other to the benefit of the public good, including key skills or characteristics that correlate with success.

This report is intended to help leaders of current government innovation groups, including chief data officers, chief digital officers, innovation team leaders, and similar groups, learn from successful models that they can apply directly to their operations to be more effective. The report also provides lessons and recommendations for senior executives in government, such as a cabinet secretary, governor, county executive, or mayor, to help them think through the possible models of effective practices to support the range of innovation roles, define success…(More)”.

Measuring Small Business Dynamics and Employment with Private-Sector Real-Time Data


Paper by André Kurmann, Étienne Lalé and Lien Ta: “The COVID-19 pandemic has led to an explosion of research using private-sector datasets to measure business dynamics and employment in real time. Yet questions remain about the representativeness of these datasets and how to distinguish business openings and closings from sample churn – i.e., sample entry of already operating businesses and sample exits of businesses that continue operating. This paper proposes new methods to address these issues and applies them to the case of Homebase, a real-time dataset of mostly small service-sector businesses that has been used extensively in the literature to study the effects of the pandemic. We match the Homebase establishment records with information on business activity from Safegraph, Google, and Facebook to assess the representativeness of the data and to estimate the probability of business closings and openings among sample exits and entries. We then exploit the high frequency and geographic detail of the data to study whether small service-sector businesses have been hit harder by the pandemic than larger firms, and the extent to which the Paycheck Protection Program (PPP) helped small businesses keep their workforce employed. We find that our real-time estimates of small business dynamics and employment during the pandemic are remarkably representative and closely fit population counterparts from administrative data that have recently become available. Distinguishing business closings and openings from sample churn is critical for these results. We also find that while employment by small businesses contracted more severely in the beginning of the pandemic than employment of larger businesses, it also recovered more strongly thereafter. In turn, our estimates suggest that the rapid rollout of PPP loans significantly mitigated the negative employment effects of the pandemic. Business closings and openings are a key driver for both results, thus underlining the importance of properly correcting for sample churn…(More)”.
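
As a rough illustration of the churn-correction idea the authors describe, the sketch below cross-checks sample exits against outside activity signals before counting them as business closings. This is not the authors' code: the column names, thresholds, and weights are hypothetical stand-ins for the kind of Homebase, Safegraph, Google, and Facebook fields mentioned in the abstract.

```python
# Illustrative sketch only: classifying sample exits as true business closings
# versus sample churn by checking whether outside activity signals also stop.
import pandas as pd

exits = pd.DataFrame({
    "business_id": [1, 2, 3],
    "foot_traffic_after_exit": [0.0, 35.2, 1.5],        # e.g., weekly visit counts
    "listed_closed_on_platform": [True, False, False],  # e.g., a platform status flag
})

# Arbitrary illustrative weights: an exit is more likely a closing if the business
# is flagged as closed elsewhere and its foot traffic has effectively stopped.
exits["prob_closed"] = (
    0.7 * exits["listed_closed_on_platform"].astype(float)
    + 0.3 * (exits["foot_traffic_after_exit"] < 5.0).astype(float)
)
exits["classified_as_closing"] = exits["prob_closed"] >= 0.5

# Only exits classified as closings would contribute to measured business exit;
# the rest are treated as sample churn.
print(exits[["business_id", "prob_closed", "classified_as_closing"]])
```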

Nudging Consumers to Purchase More Sustainably


Article by Erez Yoeli: “Most consumers still don’t choose sustainable products when the option is available. Americans may claim to be willing to pay more for green energy, but while green energy is available in the majority of states — 35 out of 50 states or roughly 80% of American households as of 2018, at least — only 14% of households were even aware of the green option, and less than half of these households purchased it. Hybrids and electric vehicles are available nationwide, but still amount to just 10% of sales — 6.6% and 3.4%, respectively, according to S&P Global’s subscription services.

Now it may be that this virtue thinking-doing gap will eventually close. I hope so. But it will certainly need help, because in these situations there’s often an insidious behavioral dynamic at work that stops stated good intentions from turning into actual good deeds…

Allow me to illustrate what I mean by “the plausible deniability effect” with an example from a now-classic behavioral economics study. Every year, around the holidays, Salvation Army volunteers collect donations for the needy outside supermarkets and other retail outlets. Researchers Justin Rao, Jim Andreoni, and Hanna Trachtmann teamed up with a Boston chapter of the Salvation Army to test ways of increasing donations.

Taking a supermarket that had two exit/entry points, the team randomly divided the volunteers into two groups. In one group, just one volunteer was assigned to stand in front of one door. For the other group, volunteers were stationed at both doors…(More)”.

Closing the Data Divide for a More Equitable U.S. Digital Economy


Report by Gillian Diebold: “In the United States, access to many public and private services, including those in the financial, educational, and health-care sectors, is intricately linked to data. But adequate data is not collected equitably from all Americans, creating a new challenge: the data divide, in which some people and communities do not have enough high-quality data collected about them and therefore cannot benefit from data-driven innovation. This report provides an overview of the data divide in the United States and offers recommendations for how policymakers can address these inequalities…(More)”.

Sustaining Open Data as a Digital Common — Design principles for Common Pool Resources applied to Open Data Ecosystems


Paper by Johan Linåker and Per Runeson: “Digital commons are an emerging phenomenon of increasing importance as we enter a digital society. Open data is one example that makes up a pivotal input and foundation for many of today’s digital services and applications. Ensuring sustainable provisioning and maintenance of the data, therefore, becomes even more important.

We aim to investigate how such provisioning and maintenance can be collaboratively performed in the community surrounding a common. Specifically, we look at Open Data Ecosystems (ODEs), a type of community of actors, openly sharing and evolving data on a technological platform.

We use Elinor Ostrom’s design principles for Common Pool Resources as a lens to systematically analyze the governance of earlier reported cases of ODEs using a theory-oriented software engineering framework.

We find that, while natural commons must regulate consumption, digital commons such as open data maintained by an ODE must stimulate both use and data provisioning. Governance needs to enable such stimulus while also ensuring that the collective action can still be coordinated and managed within the frame of available maintenance resources of a community. Subtractability is, in this sense, a concern regarding the resources required to maintain the quality and value of the data, rather than the availability of data. Further, drawing on Ostrom’s design principles, we derive empirically based recommended practices for ODEs on how to design a governance structure that enables sustainable and collaborative provisioning and maintenance of the data.

ODEs are expected to play a role in data provisioning that democratizes the digital society and enables innovation by smaller commercial actors. Our empirically based guidelines intend to support this development…(More)”.

What is the value of data? A review of empirical methods


Policy brief by Diane Coyle and Annabel Manley: “The economy has been transformed by data in recent years. Data-driven firms made up seven of the global top 10 firms by stock market capitalisation in 2021; and across the OECD (Organisation for Economic Co-operation and Development) economies there has been a growing gap in terms of productivity and profitability between firms that use data intensively and the rest (e.g. Brynjolfsson et al 2019; Bajgar et al 2022; Coyle et al 2022). The widespread availability of data and analytics has also begun to extend into the public sector and policymaking, for example with ‘following the science’ – implying intense use of data – becoming a tagline for the handling of the COVID-19 pandemic in the UK and elsewhere.

It is therefore obvious that data has value in an economically meaningful sense. The sources of its value and characteristics of data as an economic asset are discussed at length in our earlier Value of Data report (Coyle et al 2020a). We concluded that there is potential value to the economy as a whole from having the ability to use data, and not just to the organisations that control specific data sets. This appreciation is increasingly reflected in many policy statements of data strategy and the broader debate about the governance of data (e.g. European Parliament 2022). The value of data is also explicitly and implicitly acknowledged by firms that sell data services, and investors who take dataset assets into account in stock market valuations or mergers and acquisitions.

However, despite the broad recognition of its value, and the need to develop appropriate policy frameworks, there is still no consensus method for empirically determining the value of data. Without this, the full potential will not be realised (Verhulst 2018). There are not even many examples of markets for data that would indicate a private valuation (although not the wider social value). Yet estimates of the value of data are needed to determine an appropriate level of investment, as is a better understanding of how data can contribute value to the economy and how to govern the collection and use of different types of data.

This brief presents an overview of a range of alternative methods for data valuation, including those proposed in the existing literature. This includes some relatively widely used methods and others that are more specialist or preliminary…(More)”.

Designing Data Spaces: The Ecosystem Approach to Competitive Advantage


Open access book edited by Boris Otto, Michael ten Hompel, and Stefan Wrobel: “…provides a comprehensive view on data ecosystems and platform economics from methodical and technological foundations up to reports from practical implementations and applications in various industries.

To this end, the book is structured in four parts: Part I “Foundations and Contexts” provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II “Data Space Technologies” then details various implementation aspects of IDS and GAIA-X, including, e.g., data usage control, the use of blockchain technologies, and semantic data integration and interoperability. Part III describes “Use Cases and Data Ecosystems” from application areas such as agriculture, healthcare, industry, energy, and mobility. Part IV offers an overview of several “Solutions and Applications”, including, e.g., products and experiences from companies such as Google, SAP, Huawei, T-Systems, Innopay, and many more.

Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook to future developments. In doing so, it aims to promote the vision of a social data market economy based on data spaces that embrace trust and data sovereignty…(More)”.

Meet the new GDP prototype that tracks inequality


Article by Greg Rosalsky: “…Nearly a century after Kuznets pioneered the use of GDP, economists Thomas Blanchet, Emmanuel Saez, and Gabriel Zucman are trying to revolutionize it. In a new paper titled “Real-Time Inequality,” the economists imagine a new kind of GDP, one that isn’t merely a single number telling us about total economic growth, but a collection of numbers telling us where the gains from this growth are flowing. They already have a working prototype that they’ve published online, and it can provide some important insights about our economy right now…

Gabriel Zucman is an economist at UC Berkeley and the director of the James M. and Cathleen D. Stone Center on Wealth and Income Inequality. He has been working to transform government economic statistics — also called “national accounts” — for almost a decade. He says the national accounts offer the public valuable insights about economic growth. However, Zucman says, “The big problem is these data do not tell you who is benefiting from economic growth.”

America, of course, already has tons of data on inequality. The problem, Zucman says, is it usually takes a year or two for this data to be updated. “It’s not enough to come in two years after the policy battle, and say, ‘Look, this is what happened to inequality,'” Zucman says. “That’s too late.”

Their new project is an effort to fix this. Cobbling together data from a variety of official sources, Zucman and his colleagues have pioneered a method to compute in a more timely fashion how different income groups — like the working class and the middle class — are doing economically. They hope this prototype will inspire the federal government to follow suit and soon “produce numbers about how income is growing for each social group at the exact time when the Bureau of Economic Analysis releases its official GDP growth numbers.”

Zucman envisions a future where this data could inform and shape policy decisions. When considering policies like sending stimulus checks or providing tax relief, Zucman says, policymakers and voters need to know things like “which groups need more support, or whether the government may be actually overshooting, which might lead to inflation.”…(More)”.
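
As a rough sketch of what the distributional update described above might look like in practice, the snippet below scales group-level incomes inferred from timely signals so that they sum to an officially released aggregate. This is a toy illustration of the general idea, not the authors' methodology, and every group share and growth rate in it is made up.

```python
# Illustrative sketch only: allocating an official aggregate growth number
# across income groups using more timely group-level signals.
import numpy as np

groups = ["bottom 50%", "middle 40%", "top 10%"]
baseline_income = np.array([4.0, 9.0, 7.0])          # trillions, hypothetical baseline
group_growth_signal = np.array([0.01, 0.02, 0.04])   # from timely wage/asset proxies

# Raw update implied by the high-frequency signals.
updated = baseline_income * (1 + group_growth_signal)

# Rescale so the total matches the official aggregate release (say 2.5% growth),
# preserving the relative pattern across groups.
official_aggregate_growth = 0.025
updated *= baseline_income.sum() * (1 + official_aggregate_growth) / updated.sum()

for group, before, after in zip(groups, baseline_income, updated):
    print(f"{group}: {(after / before - 1) * 100:.2f}% growth")
```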

Measuring sustainable tourism with online platform data


Paper by Felix J. Hoffmann, Fabian Braesemann & Timm Teubner: “Sustainability in tourism is a topic of global relevance, finding multiple mentions in the United Nations Sustainable Development Goals. The complex task of balancing tourism’s economic, environmental, and social effects requires detailed and up-to-date data. This paper investigates whether online platform data can be employed as an alternative data source in sustainable tourism statistics. Using a web-scraped dataset from a large online tourism platform, a sustainability label for accommodations can be predicted reasonably well with machine learning techniques. The algorithmic prediction of accommodations’ sustainability using online data can provide a cost-effective and accurate measure that makes it possible to track developments in tourism sustainability across the globe with high spatial and temporal granularity…(More)”.
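
A minimal sketch of this kind of label-prediction task is shown below, using synthetic listing features in place of the paper's web-scraped variables. The feature names, the data-generating rule, and the choice of classifier are illustrative assumptions, not the authors' pipeline.

```python
# Illustrative sketch only: predicting a binary sustainability label for
# accommodations from (synthetic) listing features with a standard classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(1, 5, n),      # e.g., average review score
    rng.integers(0, 2, n),     # e.g., description mentions "solar" or "recycling"
    rng.uniform(20, 500, n),   # e.g., nightly price
])
# Synthetic label loosely tied to the features, for demonstration only.
y = (0.8 * X[:, 1] + 0.1 * (X[:, 0] > 4) + rng.uniform(0, 1, n) > 0.9).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```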

Is GDP Becoming Obsolete? The “Beyond GDP” Debate


Paper by Charles R. Hulten & Leonard I. Nakamura: “GDP is a closely watched indicator of the current health of the economy and an important tool of economic policy. It has been called one of the great inventions of the 20th Century. It is not, however, a persuasive indicator of individual wellbeing or economic progress. There have been calls to refocus or replace GDP with a metric that better reflects the welfare dimension. In response, the U.S. agency responsible for the GDP accounts recently launched a “GDP and Beyond” program. This is by no means an easy undertaking, given the subjective and idiosyncratic nature of much of individual wellbeing. This paper joins the Beyond GDP effort by extending the standard utility maximization model of economic theory, using an expenditure function approach to include those non-GDP sources of wellbeing for which a monetary value can be established. We term our new measure expanded GDP (EGDP). A welfare-adjusted stock of wealth is also derived using the same general approach used to obtain EGDP. This stock is useful for issues involving the sustainability of wellbeing over time. One of the implications of this dichotomy is that conventional cost-based wealth may increase over a period of time while welfare-corrected wealth may show a decrease (due, for example, to strongly negative environmental externalities)…(More)”
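
A stylized way to write the resulting measure, assuming the monetized non-GDP sources of wellbeing enter additively (the notation below is an illustration, not necessarily the authors' exact formulation):

```latex
% Stylized sketch of expanded GDP (EGDP): conventional GDP plus monetized
% non-GDP sources of wellbeing.
\[
  \mathrm{EGDP}_t \;=\; \mathrm{GDP}_t \;+\; \sum_{j} \tilde{p}_{j,t}\, z_{j,t}
\]
% Here the z_{j,t} denote non-GDP sources of wellbeing (e.g., health, leisure,
% environmental quality) and the \tilde{p}_{j,t} are the monetary values imputed
% to them, in the spirit of the paper's expenditure-function approach.
```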