The Future of Compute


Independent Review by a UK Expert Panel: “…Compute is a material part of modern life. It is among the critical technologies lying behind innovation, economic growth and scientific discoveries. Compute improves our everyday lives. It underpins all the tools, services and information we hold on our handheld devices – from search engines and social media, to streaming services and accurate weather forecasts. This technology may be invisible to the public, but life today would be very different without it.

Sectors across the UK economy, both new and old, are increasingly reliant upon compute. By leveraging the capability that compute provides, businesses of all sizes can extract value from the enormous quantity of data created every day; reduce the cost and time required for research and development (R&D); improve product design; accelerate decision making processes; and increase overall efficiency. Compute also enables advancements in transformative technologies, such as AI, which themselves lead to the creation of value and innovation across the economy. This all translates into higher productivity and profitability for businesses and robust economic growth for the UK as a whole.

Compute powers modelling, simulations, data analysis and scenario planning, and thereby enables researchers to develop new drugs; find new energy sources; discover new materials; mitigate the effects of climate change; and model the spread of pandemics. Compute is required to tackle many of today’s global challenges and brings invaluable benefits to our society.

Compute’s effects on society and the economy have already been and, crucially, will continue to be transformative. The scale of compute capability keeps accelerating at pace. The performance of the world’s fastest compute has grown by a factor of 626 since 2010. The compute requirements of the largest machine learning models have grown 10 billion times over the last 10 years. We expect compute demand to grow significantly as compute capability continues to increase. Technology today operates very differently to how it did 10 years ago and, in a decade’s time, it will have changed once again.
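As a rough illustration of what those growth factors imply in annual terms (assuming the 626x supercomputer figure covers roughly 2010 to 2022; the review does not state the exact window here):

```python
import math

# Implied compound annual growth factors for the figures quoted above.
# Assumption: the 626x fastest-supercomputer figure covers ~2010-2022
# (12 years); the 10-billion-fold ML figure covers 10 years.
def annual_rate(total_factor: float, years: float) -> float:
    """Per-year growth factor implied by a total growth factor over `years`."""
    return total_factor ** (1 / years)

print(f"Fastest supercomputer: ~{annual_rate(626, 12):.2f}x per year")
print(f"Largest ML models:     ~{annual_rate(10_000_000_000, 10):.0f}x per year")
```

Even under these assumptions, the two trends compound very differently: roughly 1.7x per year for the fastest machines versus roughly 10x per year for the largest models.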

Yet, despite compute’s value to the economy and society, the UK lacks a long-term vision for compute…(More)”.

A model for a participative approach to digital competition regulation


Policy Brief by Christophe Carugati: “Digital competition regulations often put in place participative approaches to ensure competition in digital markets. The participative approach aims to involve regulated firms, stakeholders and regulators in the design of compliance measures. The approach is particularly relevant in complex and fast-evolving digital markets, where whole industries often depend on the behaviours of the regulated firms. The participative approach enables stakeholders and regulated firms to design compliance measures that are optimal for all because they ensure legal certainty for regulated firms, save time for regulators and take into account the views of stakeholders.

However, the participative approach is subject to regulatory capture. The regulated firms and stakeholders might try to promote their interests to the regulator. This could result in endless discussions at best, and the adoption of inappropriate solutions following intense lobbying at worst.

A governance model is necessary to ensure that the participative approach works without risks of regulatory capture. The model should define clearly each participant’s role, duties and rights. There should be: 1) equal and transparent access of all stakeholders to the dialogue; 2) the presentation of tangible and evidence-based solutions from stakeholders and regulated firms; 3) public decisions from the regulator that contain assessments of the proposed solutions, with guidance to clarify rules; and 4) compliance measures proposed by the regulated firm in line with the guidance. The model should provide an assessment framework for the proposed solutions to identify the most effective. The assessment should rely on the principle of proportionality to assess whether the proposed compliance measure is proportionate, to ensure the effectiveness of the regulation. Finally, the model should safeguard against regulatory capture thanks to transparency rules and external monitoring…(More)”

Predicting Socio-Economic Well-being Using Mobile Apps Data: A Case Study of France


Paper by Rahul Goel, Angelo Furno, and Rajesh Sharma: “Socio-economic indicators provide context for assessing a country’s overall condition. These indicators contain information about education, gender, poverty, employment, and other factors. Therefore, reliable and accurate information is critical for social research and government policymaking. Most data sources available today, such as censuses, have sparse population coverage or are updated infrequently. Nonetheless, alternative data sources, such as call detail records (CDR) and mobile app usage, can serve as cost-effective and up-to-date sources for identifying socio-economic indicators.
This work investigates mobile app data to predict socio-economic features. We present a large-scale study using data that captures the traffic of thousands of mobile applications by approximately 30 million users distributed over 550,000 km² and served by over 25,000 base stations. The dataset covers the whole of the French territory and spans more than 2.5 months, from 16 March to 6 June 2019. Using the app usage patterns, our best model can estimate socio-economic indicators (attaining an R-squared score of up to 0.66). Furthermore, using model explainability, we discover that mobile app usage patterns have the potential to reveal socio-economic disparities at the IRIS level (the smallest French statistical unit). Insights of this study provide several avenues for future interventions, including users’ temporal network analysis and exploration of alternative data sources…(More)”.
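The paper’s prediction task can be caricatured with a toy one-feature least-squares fit scored by R-squared; the feature, the regional values and the fitting setup below are invented for illustration and are not taken from the paper:

```python
# Toy illustration (not the paper's actual model): predict a regional
# socio-economic indicator from a single mobile-app-usage feature with
# ordinary least squares, then score the fit with R-squared.
# All numbers below are synthetic.

app_usage = [0.10, 0.15, 0.22, 0.30, 0.41, 0.55, 0.63, 0.70]   # per-region feature
indicator = [0.25, 0.33, 0.47, 0.58, 0.79, 1.07, 1.22, 1.38]   # per-region target

n = len(app_usage)
mean_x = sum(app_usage) / n
mean_y = sum(indicator) / n

# Closed-form simple OLS: slope = cov(x, y) / var(x)
slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(app_usage, indicator))
slope_den = sum((x - mean_x) ** 2 for x in app_usage)
slope = slope_num / slope_den
intercept = mean_y - slope * mean_x

# R-squared: 1 - residual sum of squares / total sum of squares
predictions = [intercept + slope * x for x in app_usage]
ss_res = sum((y - p) ** 2 for y, p in zip(indicator, predictions))
ss_tot = sum((y - mean_y) ** 2 for y in indicator)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.3f}")
```

The actual study uses many app-category features and richer models; the point here is only how an R-squared score summarizes how much regional variation in the indicator the usage features explain.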

Data Free Flow with Trust: Overcoming Barriers to Cross-Border Data Flows


Briefing Paper by the WEF: “The movement of data across country borders is essential to the global economy. When data flows across borders, it is possible to deliver more to more people and produce more benefits for people and planet. This briefing paper highlights the importance of such data flows and urges global leaders in the public and private sectors to take collective action to work towards a shared understanding of them with a view to implementing “Data Free Flow with Trust” (DFFT) – an umbrella concept for facilitating trust-based data exchanges. This paper reviews the current challenges facing DFFT, takes stock of progress made so far, offers direction for policy mechanisms and concrete tools for businesses and, more importantly, promotes global discussions about how to realize DFFT from the perspectives of policy and business…(More)”.

Measuring the value of data and data flows


Report by the OECD: “Data have become a key input into the production of many goods and services. But just how important? What is the value of data – their contribution to economic growth and well-being? This report discusses different approaches to data valuation, their advantages and shortcomings and their applicability in different contexts. It argues that the value of data depends to a large extent on the data governance framework determining how they can be created, shared and used. In addition, the report provides estimates of the value of data and data flows. Its focus is on the monetary valuation of data produced by private economic actors and their recording in economic statistics. Finally, the report puts forward a draft measurement agenda for the future…(More)”.

Digital Transition Framework: An action plan for public-private collaboration


WEF Report: “The accelerated digital transition is unlocking economic and technology innovation, boosting growth, and enabling new forms of social engagement across the globe. Yet the benefits of digital transformation have not been fully realized, and macroeconomic and geopolitical headwinds are forcing public- and private-sector leaders to make digital technology investment trade-offs. The Digital Transition Framework: An Action Plan for Public-Private Collaboration sets out concrete actions and leading examples to help governments achieve their digital transition goals in the face of uncertainty…(More)”.

GDP is getting a makeover — what it means for economies, health and the planet


Article by Ehsan Masood: “The numbers are heading in the wrong direction. If the world continues on its current track, it will fall well short of achieving almost all of the 17 Sustainable Development Goals (SDGs) that the United Nations set to protect the environment and end poverty and inequality by 2030.

The projected grade for:

Eliminating hunger: F.

Ensuring healthy lives for all: F.

Protecting and sustainably using ocean resources: F.

The trends were there before 2020, but the problems deepened with the COVID-19 pandemic, the war in Ukraine and the worsening effects of climate change. The world is in “a new uncertainty complex”, says economist Pedro Conceição, lead author of the United Nations Human Development Report.

One measure of this is the drastic change in the Human Development Index (HDI), which combines educational outcomes, income and life expectancy into a single composite indicator. After 2019, the index fell for two successive years for the first time since its creation in 1990. “I don’t think this is a one-off, or a blip. I think this could be a new reality,” Conceição says.
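The HDI mentioned above is a composite: the geometric mean of health, education and income dimension indices. A minimal sketch follows; the goalposts are the post-2010 UNDP values as commonly published, but treat the exact constants (and the invented example country) as assumptions of this sketch rather than authoritative:

```python
import math

# Sketch of an HDI-style composite indicator: the geometric mean of
# three dimension indices, each rescaled between fixed "goalposts".
# Goalpost constants follow the post-2010 UNDP methodology as commonly
# published; treat them as assumptions of this sketch.

def dim_index(value: float, lo: float, hi: float) -> float:
    """Rescale a raw value to [0, 1] between goalposts lo and hi."""
    return (value - lo) / (hi - lo)

def hdi(life_expectancy, expected_schooling, mean_schooling, gni_per_capita):
    health = dim_index(life_expectancy, 20, 85)
    education = (min(dim_index(expected_schooling, 0, 18), 1) +
                 min(dim_index(mean_schooling, 0, 15), 1)) / 2
    # income enters in logs to damp the effect of very high incomes
    income = dim_index(math.log(gni_per_capita), math.log(100), math.log(75_000))
    return (health * education * income) ** (1 / 3)

# Invented example country, not real data
print(f"HDI ≈ {hdi(78.0, 16.0, 12.5, 40_000):.3f}")
```

Because the aggregation is a geometric mean, a sharp fall in any one dimension (say, life expectancy during a pandemic) drags the whole index down, which is why two successive annual declines are so unusual.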

UN secretary-general António Guterres is worried. “We need an urgent rescue effort for the SDGs,” he wrote in the foreword to the latest progress report, published in July. Over the past year, Guterres and the heads of big UN agencies, such as the Statistics Division and the UN Development Programme, have been assessing what’s gone wrong and what needs to be done. They’re converging on the idea that it’s time to stop using gross domestic product (GDP) as the world’s main measure of prosperity, and to complement it with a dashboard of indicators, possibly ones linked to the SDGs. If this happens, it would be the biggest shift in how economies are measured since nations first started using GDP in 1953, almost 70 years ago.

Guterres’s is the latest in a crescendo of voices calling for GDP to be dropped as the world’s primary go-to indicator and replaced with a dashboard of metrics. In 2008, then French president Nicolas Sarkozy endorsed such a call from a team of economists, including Nobel laureates Amartya Sen and Joseph Stiglitz.

And in August, the White House announced a 15-year plan to develop a new summary statistic that would show how changes to natural assets — the natural wealth on which economies depend — affect GDP. The idea, according to the project’s main architect, economist Eli Fenichel at the White House Office of Science and Technology Policy, is to help society to determine whether today’s consumption is being accomplished without compromising the future opportunities that nature provides. “GDP only gives a partial and — for many common uses — an incomplete picture of economic progress,” Fenichel says.

The fact that Guterres has made this a priority, amid so many major crises, is a sign that “going beyond GDP has been picked up at the highest level”, says Stefan Schweinfest, the director of the UN Statistics Division, based in New York City…(More)”.

Wicked Problems Might Inspire Greater Data Sharing


Paper by Susan Ariel Aaronson: “In 2021, the United Nations Development Program issued a plea in its 2021 Digital Economy Report. “Global data-sharing can help address major global development challenges such as poverty, health, hunger and climate change. …Without global cooperation on data and information, research to develop the vaccine and actions to tackle the impact of the pandemic would have been a much more difficult task. Thus, in the same way as some data can be public goods, there is a case for some data to be considered as global public goods, which need to be addressed and provided through global governance.” (UNDP: 2021, 178). Global public goods are goods and services with benefits and costs that potentially extend to all countries, people, and generations. Global data sharing can also help solve what scholars call wicked problems—problems so complex that they require innovative, cost-effective and global mitigating strategies. Wicked problems are problems that no one knows how to solve without creating further problems. Hence, policymakers must find ways to encourage greater data sharing among entities that hold large troves of various types of data, while protecting that data from theft, manipulation, etc. Many factors impede global data sharing for public good purposes; this analysis focuses on two.
First, policymakers generally don’t think about data as a global public good; they view data as a commercial asset that they should nurture and control. While they may understand that data can serve the public interest, they are more concerned with using data to serve their country’s economic interest. Second, many leaders of civil society and business see the data they have collected as proprietary. So far, many leaders of private entities with troves of data are not convinced that their organization will benefit from such sharing. At the same time, companies do voluntarily share some data for social good purposes.

However, data cannot meet its public good purpose if it is not shared among societal entities. Moreover, if policymakers treat data as a sovereign asset, they are unlikely to encourage data sharing across borders oriented towards addressing shared problems. Consequently, society will be less able to use data as both a commercial asset and as a resource to enhance human welfare. As the Bennett Institute and ODI have argued, “value comes from data being brought together, and that requires organizations to let others use the data they hold.” But that also means the entities that collected the data may not accrue all of the benefits from that data (Bennett Institute and ODI: 2020a: 4). In short, private entities are not sufficiently incentivized to share data for the global public good…(More)”.

Accelerating Government Innovation With Leadership and Stimulus Funding


Paper by Jane Wiseman: “With the evolving maturity of innovation offices and digital teams comes the imperative for leaders and managers to provide pathways for these organizations to succeed and work together effectively, in terms of embracing new ideas and scaling those that prove effective beyond a prototype or pilot. The availability of a large, one-time infusion of federal funds to support state and local services and programs through the American Rescue Plan Act, the infrastructure law, and other recent laws provides state and local leaders with a unique opportunity to collaborate with their federal partners and promote innovation that improves the lives of their people. Data and innovation teams can help government be more efficient and effective in spending stimulus funds at the state and local level in the coming years.

In this new report, Jane Wiseman explores various ways that executives can leverage stimulus funding to incentivize success across multiple innovation and data roles, and drive work from those roles forward into digital service development and delivery. Through close examination of multiple cases in the field, the author develops a framework with specific recommendations for how leaders can create opportunities for innovators to complement each other for the benefit of the public good, including key skills or characteristics that correlate with success.

This report is intended to help leaders of current government innovation groups, including chief data officers, chief digital officers, innovation team leaders, and similar groups, to learn from successful models that they can apply directly to their operations to be more effective. The report also provides lessons and recommendations for senior executives in government, such as a cabinet secretary, governor, county executive or mayor, to help them think through possible models of effective practices to support the range of innovation roles and define success…(More)”.

Measuring Small Business Dynamics and Employment with Private-Sector Real-Time Data


Paper by André Kurmann, Étienne Lalé and Lien Ta: “The COVID-19 pandemic has led to an explosion of research using private-sector datasets to measure business dynamics and employment in real-time. Yet questions remain about the representativeness of these datasets and how to distinguish business openings and closings from sample churn – i.e., sample entry of already operating businesses and sample exit of businesses that continue operating. This paper proposes new methods to address these issues and applies them to the case of Homebase, a real-time dataset of mostly small service-sector businesses that has been used extensively in the literature to study the effects of the pandemic. We match the Homebase establishment records with information on business activity from Safegraph, Google, and Facebook to assess the representativeness of the data and to estimate the probability of business closings and openings among sample exits and entries. We then exploit the high frequency and geographic detail of the data to study whether small service-sector businesses have been hit harder by the pandemic than larger firms, and the extent to which the Paycheck Protection Program (PPP) helped small businesses keep their workforce employed. We find that our real-time estimates of small business dynamics and employment during the pandemic are remarkably representative and closely fit population counterparts from administrative data that have recently become available. Distinguishing business closings and openings from sample churn is critical for these results. We also find that while employment by small businesses contracted more severely at the beginning of the pandemic than employment of larger businesses, it also recovered more strongly thereafter. In turn, our estimates suggest that the rapid rollout of PPP loans significantly mitigated the negative employment effects of the pandemic. Business closings and openings are a key driver of both results, thus underlining the importance of properly correcting for sample churn…(More)”.
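A much-simplified sketch of the matching idea described above: an establishment that leaves the sample counts as a likely closing only if an external activity signal (in the spirit of the Safegraph/Google/Facebook match) also goes quiet; otherwise the exit is treated as sample churn. Field names, records and the one-flag rule are invented for illustration and are not the paper’s actual algorithm:

```python
# Simplified sketch (not the paper's actual method): classify panel exits
# as true business closings vs. sample churn by checking whether an
# external activity signal continues after the establishment leaves the
# sample. All records and field names are invented for illustration.

def classify_exits(exits):
    """exits: list of dicts with an 'external_activity_after_exit' flag,
    e.g. foot traffic or reviews observed after the last payroll record."""
    closings, churn = [], []
    for record in exits:
        if record["external_activity_after_exit"]:
            churn.append(record["id"])      # still operating: sample churn
        else:
            closings.append(record["id"])   # no signal: likely a closing
    return closings, churn

sample_exits = [
    {"id": "cafe_01", "external_activity_after_exit": False},
    {"id": "salon_02", "external_activity_after_exit": True},
    {"id": "diner_03", "external_activity_after_exit": False},
]
closings, churn = classify_exits(sample_exits)
print("closings:", closings)
print("churn:", churn)
```

The paper estimates closing and opening probabilities statistically rather than applying a hard rule like this, but the sketch shows why an outside activity signal is needed at all: the panel alone cannot tell an exit from a closure.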