Geospatial Data Market Study


Study by Frontier Economics: “Frontier Economics was commissioned by the Geospatial Commission to carry out a detailed economic study of the size, features and characteristics of the UK geospatial data market. The Geospatial Commission was established within the Cabinet Office in 2018, as an independent, expert committee responsible for setting the UK’s Geospatial Strategy and coordinating public sector geospatial activity. The Geospatial Commission’s aim is to unlock the significant economic, social and environmental opportunities offered by location data. The UK’s Geospatial Strategy (2020) sets out how the UK can unlock the full power of location data and take advantage of the significant economic, social and environmental opportunities offered by location data….

Like many other forms of data, the value of geospatial data is not limited to the data creator or data user. Value from using geospatial data can be subdivided into several different categories, based on who the value accrues to:

Direct use value: where value accrues to users of geospatial data. This could include government using geospatial data to better manage public assets like roadways.

Indirect use value: where value is also derived by indirect beneficiaries who interact with direct users. This could include users of the public assets who benefit from better public service provision.

Spillover use value: value that accrues to others who are not a direct data user or indirect beneficiary. This could, for example, include lower levels of emissions due to improved management of the road network by government. The benefits of lower emissions are felt by all of society, even those who do not use the road network.

As the value from geospatial data does not always accrue to the direct user of the data, there is a risk of underinvestment in geospatial technology and services. Our £6 billion estimate of turnover for a subset of geospatial firms in 2018 does not take account of these wider economic benefits that “spill over” across the UK economy and generate additional value. As such, the value that geospatial data delivers is likely to be significantly higher than we have estimated and is therefore an area for potential future investment….(More)”.

Public value and platform governance


UCL Institute for Innovation and Public Purpose (IIPP) Working Paper: “The market size and strength of the major digital platform companies have invited international concern about how such firms should best be regulated to serve the interests of wider society, with a particular emphasis on the need for new anti-trust legislation. Using a normative innovation systems approach, this paper investigates how current anti-trust models may insufficiently address the value-extracting features of existing data-intensive and platform-oriented industry behaviour and business models. To do so, we employ the concept of economic rents to investigate how digital platforms create and extract value. Two forms of rent are elaborated: ‘network monopoly rents’ and ‘algorithmic rents’. By identifying such rents more precisely, policymakers and researchers can better direct regulatory investigations, as well as broader industrial and innovation policy approaches, to shape the features of platform-driven digital markets…(More)”.

The forecasting fallacy


Essay by Alex Murrell: “Marketers are prone to a prediction.

You’ll find them in the annual tirade of trend decks. In the PowerPoint projections of self-proclaimed prophets. In the feeds of forecasters and futurists. They crop up on every conference stage. They make their mark on every marketing magazine. And they work their way into every white paper.

To understand the extent of our forecasting fascination, I analysed the websites of three management consultancies looking for predictions with time frames ranging from 2025 to 2050. Whilst one prediction may be published multiple times, the size of the numbers still shocked me. Deloitte’s site makes 6,904 predictions. McKinsey & Company make 4,296. And Boston Consulting Group, 3,679.

In total, these three companies’ websites include just shy of 15,000 predictions stretching out over the next 30 years.
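As a rough sketch of how such a tally could be reproduced (the URL list, year range, and matching rule below are placeholder assumptions, not Murrell’s actual method), one can count future-year mentions across a set of pages:

```python
import re
import requests

# Placeholder corpus -- the essay's actual page list is not published.
PAGES = [
    "https://www.example.com/insights/future-of-retail",
]

# Match any year from 2025 through 2050.
YEAR_PATTERN = re.compile(r"\b20(?:2[5-9]|[34][0-9]|50)\b")

def count_year_mentions(html_text: str) -> int:
    """Count future-year mentions (2025-2050) in a page, tags crudely stripped."""
    text = re.sub(r"<[^>]+>", " ", html_text)
    return len(YEAR_PATTERN.findall(text))

total = 0
for url in PAGES:
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        total += count_year_mentions(resp.text)
    except requests.RequestException:
        continue  # skip unreachable pages

print(f"Future-year mentions found: {total}")
```

A count like this naturally overstates things, since the same prediction can appear on many pages, which is why the essay flags that one prediction may be published multiple times.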

But it doesn’t stop there.

My analysis finished in the year 2050 not because the predictions came to an end but because my enthusiasm did.

Search the sites and you’ll find forecasts stretching all the way to the year 2100. We’re still finding our feet in this century but some, it seems, already understand the next.

I believe the vast majority of these to be not forecasts but fantasies. Snake oil dressed up as science. Fiction masquerading as fact.

This article assesses how predictions have performed in five fields. It argues that poor projections have propagated throughout our society and proliferated throughout our industry. It argues that our fixation with forecasts is fundamentally flawed.

So instead of focussing on the future, let’s take a moment to look at the predictions of the past. Let’s see how our projections panned out….

Viewed through the lens of Tetlock, it becomes clear that the 15,000 predictions with which I began this article are not forecasts but fantasies.

The projections look precise. They sound scientific. But these forecasts are nothing more than delusions with decimal places. Snake oil dressed up as statistics. Fiction masquerading as fact. They provide a feeling of certainty but they deliver anything but.

In his 1998 book The Fortune Sellers, the business writer William A. Sherden quantified our consensual hallucination: 

“Each year the prediction industry showers us with $200 billion in (mostly erroneous) information. The forecasting track records for all types of experts are universally poor, whether we consider scientifically oriented professionals, such as economists, demographers, meteorologists, and seismologists, or psychic and astrological forecasters whose names are household words.” 

The comparison between professional predictors and fortune tellers is apt.

From tarot cards to tea leaves, palmistry to pyromancy, clear visions of cloudy futures have always been sold to susceptible audiences. 

Today, marketers are one such audience.

It’s time we opened our eyes….(More)”.

Using behavioral insights to make the most of emergency social protection cash transfers


Article by Laura Rawlings, Jessica Jean-Francois and Catherine MacLeod: “In response to the COVID-19 pandemic, countries across the globe have been adapting social assistance policies to support their populations. In fact, since March 2020, 139 countries and territories have planned, implemented, or adapted cash transfers to support their citizens. Cash transfers specifically make up about half of the social protection programs implemented to address the pandemic. Now more than ever, it’s crucial that such programs are designed to maximize impacts. Behavioral insights can be mobilized as a cost-effective way to help beneficiaries make the most out of the available support. The World Bank and ideas42 partnership on behavioral designs for cash transfer programs is helping countries achieve this goal.

Cash transfers are a key response instrument in the social protection toolkit—and for good reason. Cash transfers have been shown to generate a wide variety of positive benefits, from helping families invest in their children to promoting gender equality. However, we know from our previous work that in order to make the most out of cash transfers, recipients of any program (already facing challenging circumstances that compete for their attention) must undertake complex decisions and actions with their cash. These challenges are only magnified by the global pandemic. COVID-19 has wrought increased uncertainty around future employment and income, which makes calculations and planning to use cash transfer benefits all the more complex.

To help practitioners design programs that account for the complex thought processes and potential barriers recipients face, we mapped out their journey to effectively spend emergency social protection cash transfers. We also created simple, actionable guidance that program designers can put to use in maximizing their programs, helping recipients use their cash transfer benefit to most effectively support families and reduce mid- to long-term financial volatility.

For example, the first step is helping recipients understand what the transfer is for. For recipients who have not yet been impacted by financial instability, or indeed have never encountered a cash transfer before, such funds might seem like a gift or bonus, and recipients may spend them accordingly. Providing clear, simple framing or labelling of the transfer may signal to recipients that they should use the cash not only for immediate needs, but also in ways that can help them protect investments in their family members’ human capital and jumpstart their livelihood after the crisis wanes….(More)”.

The economics of Business-to-Government data sharing


Paper by Bertin Martens and Nestor Duch Brown: “Data and information are fundamental pieces for effective evidence-based policy making and provision of public services. In recent years, some private firms have been collecting large amounts of data, which, were they available to governments, could greatly improve their capacity to take better policy decisions and to increase social welfare. Business-to-Government (B2G) data sharing can result in substantial benefits for society. It can save costs to governments by allowing them to benefit from the use of data collected by businesses without having to collect the same data again. Moreover, it can support the production of new and innovative outputs based on the shared data by different users. Finally, the data available to government may give only an incomplete or even biased picture, while aggregating complementary datasets shared by different parties (including businesses) may result in improved policies with strong social welfare benefits.


The examples assembled by the High Level Expert Group on B2G data sharing show that most of the current B2G data transactions remain one-off experimental pilot projects that do not seem to be sustainable over time. Overall, the volume of B2G operations still seems to be relatively small and clearly sub-optimal from a social welfare perspective. The market does not seem to scale compared to the economic potential for welfare gains in society. There are likely to be significant potential economic benefits from additional B2G data sharing operations. These could be enabled by measures that improve governance conditions and thereby increase the overall number of transactions. To design such measures, it is important to understand the nature of the current barriers to B2G data sharing operations. In this paper, we focus on the most important barriers from an economic perspective: (a) monopolistic data markets, (b) high transaction costs and perceived risks in data sharing and (c) a lack of incentives for private firms to contribute to the production of public benefits. The following reflections are mainly conceptual, since there is currently little quantitative empirical evidence on the different aspects of B2G transactions.

  • Monopolistic data markets. Some firms, like big tech companies for instance, may be in a privileged position as the exclusive providers of the type of data that a public body seeks to access. This position enables the firms to charge a high price for the data beyond a reasonable rate of return on costs. While a monopolistic market is still a functioning market, the resulting price may lead to some governments not being able or willing to purchase the data and therefore may cause social welfare losses. Nonetheless, monopolistic pricing may still be justified from an innovation perspective: it strengthens incentives to invest in more and better data collection systems and thereby increases the supply of data in the long run. In some cases, the data seller may be in a position to price-discriminate between commercial buyers and a public body, charging a lower price to the latter since the data would not be used for commercial purposes.
  • High transaction costs and perceived risks. An important barrier for data sharing comes from the ex-ante costs related to finding a suitable data sharing partner, negotiating a contractual arrangement, re-formatting and cleaning the data, among others. Potentially interested public bodies may not be aware of available datasets or may not be in a position to handle them or understand their advantages and disadvantages. There may also be ex-post risks related to uncertainties in the quality and/or usefulness of the data, the technical implementation of the data sharing deal, ensuring compliance with the agreed conditions, the risk of data leaks to unauthorized third-parties and exposure of personal and confidential data.
  • Lack of incentives. Firms may be reluctant to share data with governments because it might have a negative impact on them. This could be due to suspicions that the data delivered might be used to implement market regulations and to enforce competition rules that could negatively affect firms’ profits. Moreover, if firms share data with government under preferential conditions, they may have difficulties justifying the foregone profit to shareholders, since the benefits generated by better policies or public services fuelled by the private data will accrue to society as a whole and are often difficult to express in monetary terms. Finally, firms might fear putting themselves at a competitive disadvantage if they provide data to public bodies – perhaps under preferential conditions – and their competitors do not.

Several mechanisms could be designed to remove the barriers that may be holding back B2G data sharing initiatives. One would be to provide stronger incentives for the data supplier firm to engage in this type of transaction. These incentives can be direct, i.e., monetary, or indirect, i.e., reputational (e.g. as part of corporate social responsibility programmes). Another would be to ensure the data transfer by making the transaction mandatory, with fair cost compensation. An intermediate way would be based on solutions that seek to facilitate voluntary B2G operations without mandating them, for example by reducing the transaction costs and perceived risks for the data supplier, e.g. by setting up trusted data intermediary platforms, or through appropriate contractual provisions. A possible EU governance framework for B2G data sharing operations could cover these options….(More)”.

The Potential of Open Digital Ecosystems


About: “Omidyar Network India (ONI), in partnership with Boston Consulting Group (BCG), has undertaken a study to reimagine digital platforms for the public good, with the aim of building a shared narrative around digital platforms and developing a holistic roadmap to foster their systematic adoption.

This study has especially benefited from collaboration with the Ministry of Electronics and Information Technology (MeitY), Government of India. It builds on the thinking presented in the public consultation whitepaper on ‘Strategy for National Open Digital Ecosystems (NODEs)’ published by MeitY in February 2020, to which ONI and BCG have contributed.

This website outlines the key findings of the study and introduces a new paradigm, Open Digital Ecosystems (ODEs), which recognizes the importance of a strong governance framework as well as the community of stakeholders that make them effective….(More)”.

Algorithmic Colonisation of Africa


Abeba Birhane at The Elephant: “The African equivalents of Silicon Valley’s tech start-ups can be found in every possible sphere of life around all corners of the continent—in “Sheba Valley” in Addis Abeba, “Yabacon Valley” in Lagos, and “Silicon Savannah” in Nairobi, to name a few—all pursuing “cutting-edge innovations” in sectors like banking, finance, healthcare, and education. They are headed by technologists and those in finance from both within and outside the continent who seemingly want to “solve” society’s problems, using data and AI to provide quick “solutions”. As a result, the attempt to “solve” social problems with technology is exactly where problems arise. Complex cultural, moral, and political problems that are inherently embedded in history and context are reduced to problems that can be measured and quantified—matters that can be “fixed” with the latest algorithm.

As dynamic and interactive human activities and processes are automated, they are inherently simplified to the engineers’ and tech corporations’ subjective notions of what they mean. The reduction of complex social problems to a matter that can be “solved” by technology also treats people as passive objects for manipulation. Humans, however, far from being passive objects, are active meaning-seekers embedded in dynamic social, cultural, and historical backgrounds.

The discourse around “data mining”, “abundance of data”, and “data-rich continent” shows the extent to which the individual behind each data point is disregarded. This muting of the individual—a person with fears, emotions, dreams, and hopes—is symptomatic of how little attention is given to matters such as people’s well-being and consent, which should be the primary concerns if the goal is indeed to “help” those in need. Furthermore, this discourse of “mining” people for data is reminiscent of the coloniser’s attitude that declares humans as raw material free for the taking. Data is necessarily always about something and never about an abstract entity.

The collection, analysis, and manipulation of data potentially entail monitoring, tracking, and surveilling people. This necessarily impacts people directly or indirectly, whether it manifests as a change in their insurance premiums or a refusal of services. The erasure of the person behind each data point makes it easy to “manipulate behavior” or “nudge” users, often towards profitable outcomes for companies. Considerations around the wellbeing and welfare of the individual user, the long-term social impacts, and the unintended consequences of these systems on society’s most vulnerable are pushed aside, if they enter the equation at all. For companies that develop and deploy AI, at the top of the agenda is the collection of more data to develop profitable AI systems rather than the welfare of individual people or communities. This is most evident in the FinTech sector, one of the prominent digital markets in Africa. People’s digital footprints, from their interactions with others to how much they spend on their mobile top-ups, are continually surveyed and monitored to form data for making loan assessments. Smartphone data from browsing history, likes, and locations is recorded, forming the basis for a borrower’s creditworthiness.
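To make concrete the kind of pipeline being critiqued here, the sketch below shows how a toy scoring model turns behavioural proxies into a “creditworthiness” number. Every feature name and weight is invented for illustration; no real lender’s model is being described:

```python
import math

def credit_score(avg_monthly_topup: float, night_browsing_share: float,
                 contact_count: int) -> float:
    """Toy logistic score (0..1) built from smartphone-derived proxies."""
    z = (0.8 * math.log1p(avg_monthly_topup)   # spending proxy
         - 1.5 * night_browsing_share          # behavioural proxy
         + 0.3 * math.log1p(contact_count))    # social-graph proxy
    return 1.0 / (1.0 + math.exp(-z))

# The borrower never sees these weights, nor which behaviours moved the score.
print(round(credit_score(avg_monthly_topup=5.0,
                         night_browsing_share=0.4,
                         contact_count=120), 3))
```

The point of the sketch is how much moral weight hides in arbitrary modelling choices: why night-time browsing should lower a score, or a larger contact list raise it, is decided by the model builder, invisibly to the person being scored.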

Artificial Intelligence technologies that aid decision-making in the social sphere are, for the most part, developed and implemented by the private sector, whose primary aim is to maximise profit. Protecting individual privacy rights and cultivating a fair society is therefore the least of their concerns, especially if such practice gets in the way of “mining” data, building predictive models, and pushing products to customers. As decision-making on social outcomes is handed over to predictive systems developed by profit-driven corporates, not only are we allowing our social concerns to be dictated by corporate incentives, we are also allowing moral questions to be dictated by corporate interest.

“Digital nudges”, behaviour modifications developed to suit commercial interests, are a prime example. As “nudging” mechanisms become the norm for “correcting” individuals’ behaviour, eating habits, or exercise routines, those developing predictive models are bestowed with the power to decide what “correct” is. In the process, individuals who do not fit our stereotypical ideas of a “fit body”, “good health”, and “good eating habits” end up being punished, outcast, and pushed further to the margins. When these models are imported as state-of-the-art technology that will save money and “leapfrog” the continent into development, Western values and ideals are enforced, either deliberately or unintentionally….(More)”.

Mapping socioeconomic indicators using social media advertising data


Paper by Ingmar Weber et al: “The United Nations Sustainable Development Goals (SDGs) are a global consensus on the world’s most pressing challenges. They come with a set of 232 indicators against which countries should regularly monitor their progress, ensuring that everyone is represented in up-to-date data that can be used to make decisions to improve people’s lives. However, existing data sources to measure progress on the SDGs are often outdated or lacking appropriate disaggregation. We evaluate the value that anonymous, publicly accessible advertising data from Facebook can provide in mapping socio-economic development in two low- and middle-income countries, the Philippines and India. Concretely, we show that audience estimates of how many Facebook users in a given location use particular device types, such as Android vs. iOS devices, or particular connection types, such as 2G vs. 4G, provide strong signals for modeling regional variation in the Wealth Index (WI), derived from the Demographic and Health Survey (DHS). We further show that, surprisingly, the predictive power of these digital connectivity features is roughly equal at both the high and low ends of the WI spectrum. Finally, we show how such data can be used to create gender-disaggregated predictions, but that these predictions only appear plausible in contexts with gender-equal Facebook usage, such as the Philippines, and not in contexts with large gender gaps in Facebook usage, such as India….(More)”.
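The modelling idea lends itself to a short sketch. The following uses synthetic data and invented feature names (the paper’s actual features, data, and model are described in the original); it only illustrates the shape of the approach, regressing a wealth measure on regional connectivity shares:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in: one row per survey region, with Facebook audience-estimate
# shares as features. Feature names and coefficients are illustrative only.
n_regions = 200
ios_share = rng.uniform(0, 0.4, n_regions)    # share of users on iOS devices
fourg_share = rng.uniform(0, 0.9, n_regions)  # share on 4G connections
X = np.column_stack([ios_share, fourg_share])

# Simulated Wealth Index in which connectivity correlates with wealth.
wealth_index = 2.0 * ios_share + 1.2 * fourg_share + rng.normal(0, 0.2, n_regions)

scores = cross_val_score(LinearRegression(), X, wealth_index, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f}")
```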

How open data could tame Big Tech’s power and avoid a breakup


Patrick Leblond at The Conversation: “…Traditional antitrust approaches such as breaking up Big Tech firms and preventing potential competitor acquisitions are never-ending processes. Even if you break them up and block their ability to acquire other, smaller tech firms, Big Tech will start growing again because of network effects and their data advantage.

And how do we know when a tech firm is big enough to ensure competitive markets? What are the size or scope thresholds for breaking up firms or blocking mergers and acquisitions?

A small startup acquired for millions of dollars can be worth billions of dollars for a Big Tech acquirer once integrated in its ecosystem. A series of small acquisitions can result in a dominant position in one area of the digital economy. Knowing this, competition/antitrust authorities would potentially have to examine every tech transaction, however small.

Not only would this be administratively costly and burdensome, but it would also be difficult for government officials to assess with some precision (and therefore legitimacy) the likely future economic impact of an acquisition in a rapidly evolving technological environment.

Open data access, level the playing field

Given that mass data collection is at the core of Big Tech’s power as gatekeepers to customers, a key solution is to open up data access for other firms so that they can compete better.

Anonymized data (to protect an individual’s privacy rights) about people’s behaviour, interests, views, etc., should be made available for free to anyone wanting to pursue a commercial or non-commercial endeavour. Data about a firm’s operations or performance would, however, remain private.
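“Anonymized” is doing a lot of work in that sentence; it is a technical bar, not just a label. One standard minimum test is k-anonymity: every combination of quasi-identifying attributes must be shared by at least k records. A minimal check (with hypothetical, coarsened attributes) might look like this:

```python
from collections import Counter

def is_k_anonymous(records: list[dict], quasi_identifiers: list[str], k: int) -> bool:
    """True if every quasi-identifier combination appears at least k times."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical behavioural records with coarsened attributes.
records = [
    {"age_band": "30-39", "region": "West", "interest": "cycling"},
    {"age_band": "30-39", "region": "West", "interest": "cycling"},
    {"age_band": "30-39", "region": "West", "interest": "cycling"},
    {"age_band": "40-49", "region": "East", "interest": "gardening"},
]
# False: the ("40-49", "East") group contains only one record.
print(is_k_anonymous(records, ["age_band", "region"], k=3))
```

k-anonymity alone is a weak guarantee, which is exactly why releasing behavioural data “for free to anyone” would need the strict regulatory oversight the article goes on to describe.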

Using an analogy from the finance world, Big Tech firms act as insider traders. Stock market insiders often possess insider (or private) information about companies that the public does not have. Such individuals then have an incentive to profit by buying or selling shares in those companies before the public becomes aware of the information.

Big Tech’s incentives are no different from those of stock market insiders. They trade on exclusively available private information (data) to generate extraordinary profits.

Continuing the finance analogy, financial securities regulators forbid the use of inside or non-publicly available information for personal benefit. Individuals found to illegally use such information are punished with jail time and fines.

They also require companies to publicly report relevant information that affects or could significantly affect their performance. Finally, they oblige insiders to publicly report when they buy and sell shares in a company in which they have access to privileged information.

Transposing stock market insider trading regulation to Big Tech implies that data access and use should be monitored under an independent regulatory body — call it a Data Market Authority. Such a body would be responsible for setting and enforcing principles, rules and standards of behaviour among individuals and organizations in the data-driven economy.

For example, a Data Market Authority would require firms to publicly report how they acquire and use personal data. It would prohibit personal data hoarding by ensuring that data is easily portable from one platform, network or marketplace to another. It would also prohibit the buying and selling of personal data as well as protect individuals’ privacy by imposing penalties on firms and individuals in cases of non-compliance.
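What such public reporting could look like in practice is an open design question. As a purely hypothetical sketch (no such schema exists today), a standardized disclosure to a Data Market Authority might be structured like this:

```python
from dataclasses import dataclass

@dataclass
class DataUseDisclosure:
    """Hypothetical structure for a firm's public data-use report."""
    firm: str
    data_categories: list[str]       # e.g. ["location", "browsing", "purchases"]
    acquisition_channels: list[str]  # how the personal data was acquired
    purposes: list[str]              # declared uses of the data
    portable: bool                   # can users export it to another platform?
    retention_days: int              # how long the data is kept

report = DataUseDisclosure(
    firm="ExamplePlatform Inc.",
    data_categories=["location", "search history"],
    acquisition_channels=["first-party app telemetry"],
    purposes=["ad targeting", "product analytics"],
    portable=True,
    retention_days=365,
)
print(report)
```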

Data openly and freely available under a strict regulatory environment would likely be a better way to tame Big Tech’s power than breaking them up and having antitrust authorities approve every acquisition that they wish to make….(More)”.

Resetting the state for the post-covid digital age


Blog by Carlos Santiso: “The COVID-19 crisis is putting our global digital resilience to the test. It has revealed the importance of a country’s digital infrastructure as the backbone of the economy, not just as an enabler of the tech economy. Digitally advanced governments, such as Estonia, have been able to put their entire bureaucracies in remote mode in a matter of days, without major disruption. And some early evidence even suggests that their productivity increased during lockdown.

With the crisis, the costs of not going digital have largely surpassed the risks of doing so. Countries and cities lagging behind have realised the necessity to boost their digital resilience and accelerate their digital transformation. Spain, for example, adopted an ambitious plan to inject €70 billion into its digital transformation over the next five years, with a Digital Spain 2025 agenda comprising 10 priorities and 48 measures. In the case of Brazil, the country was already taking steps towards the digital transformation of its public sector before the COVID-19 crisis hit. The crisis is accelerating this transformation.

The great accelerator

Long before the crisis hit, the data-driven digital revolution had been challenging governments to modernise and become more agile, open and responsive. Progress has nevertheless been uneven, hindered by a variety of factors, from political resistance to budget constraints. Going digital requires the sort of whole-of-government reforms that need political muscle and long-term vision to break up traditional data silos within bureaucracies that jealously guard their power. In bureaucracies, information is power. Now, information has become ubiquitous and governing data, a critical challenge.

Cutting red tape will be central to the recovery. Many governments are fast-tracking regulatory simplification and administrative streamlining to reboot hard-hit economic sectors. Digitalisation is resetting the relationship between states and citizens, a Copernican revolution for our rule-based bureaucracies….(More)“.