Trust, Control, and the Economics of Governance


Book by Philipp Herold: “In today’s world, we cooperate across legal and cultural systems in order to create value. However, this increases volatility, uncertainty, complexity, and ambiguity as challenges for societies, politics, and business, and it has made governance a scarce resource. It is thus inevitable that we understand the means of governance available to us and are able to economize on them. Trends like the increasing role of product labels and a certification industry, as well as political movements towards nationalism and conservatism, may be seen as reactions to disappointments from excessive cooperation. To avoid failures of cooperation, governance is important: control through, for example, contracts is limited, and in governance economics trust is widely advertised without much guidance on its preconditions or limits.

This book draws on the rich insights from research on trust and control, and accommodates the key results for governance considerations in an institutional economics framework. It provides a view on the limits of cooperation derived from the required degree of governance, which can be achieved through extrinsic motivation or by building on intrinsic motivation. Trust-control economics thus informs a more realistic expectation of the net value added from cooperation by providing a balanced view that includes the cost of governance. It then becomes clear how complex cooperation is about ‘governance accretion’, where limited trustworthiness is substituted by control and these control instances need to be governed in turn.

Trust, Control, and the Economics of Governance is a much-needed development of institutional economics to reflect progress made in trust research, and a relevant addition for practitioners seeking to better understand the role of trust in the governance of contemporary cooperation structures. It will be of interest to researchers, academics, and students in the fields of economics and business management, institutional economics, and business ethics…(More)”.

The Importance of Data Access Regimes for Artificial Intelligence and Machine Learning


JRC Digital Economy Working Paper by Bertin Martens: “Digitization triggered a steep drop in the cost of information. The resulting data glut created a bottleneck because human cognitive capacity is unable to cope with large amounts of information. Artificial intelligence and machine learning (AI/ML) triggered a similar drop in the cost of machine-based decision-making and helps in overcoming this bottleneck. Substantial change in the relative price of resources puts pressure on ownership and access rights to these resources. This explains pressure on access rights to data. ML thrives on access to big and varied datasets. We discuss the implications of access regimes for the development of AI in its current form of ML. The economic characteristics of data (non-rivalry, economies of scale and scope) favour data aggregation in big datasets. Non-rivalry implies the need for exclusive rights in order to incentivise data production when it is costly. The balance between access and exclusion is at the centre of the debate on data regimes. We explore the economic implications of several modalities for access to data, ranging from exclusive monopolistic control to monopolistic competition and free access. Regulatory intervention may push the market beyond voluntary exchanges, either towards more openness or reduced access. This may generate private costs for firms and individuals. Society can choose to do so if the social benefits of this intervention outweigh the private costs.

We briefly discuss the main EU legal instruments that are relevant for data access and ownership, including the General Data Protection Regulation (GDPR), which defines the rights of data subjects with respect to their personal data, and the Database Directive (DBD), which grants ownership rights to database producers. These two instruments leave a wide legal no-man’s land where data access is ruled by bilateral contracts and Technical Protection Measures that give exclusive control to de facto data holders, and by market forces that drive access, trade and pricing of data. The absence of exclusive rights may facilitate data sharing and access, or it may result in a segmented data landscape where data aggregation for ML purposes is hard to achieve. It is unclear whether incompletely specified ownership and access rights maximise the welfare of society and facilitate the development of AI/ML…(More)”.

Illuminating Big Data will leave governments in the dark


Robin Wigglesworth in the Financial Times: “Imagine a world where interminable waits for backward-looking, frequently revised economic data seem as archaically quaint as floppy disks, beepers and a civil internet. This fantasy realm may be closer than you think.

The Bureau of Economic Analysis will soon publish its preliminary estimate for US economic growth in the first three months of the year, finally catching up on its regular schedule after a government shutdown paralysed the agency. But other data are still delayed, and the final official result for US gross domestic product won’t be available until July. Along the way there are likely to be many tweaks.

Collecting timely and accurate data is a Herculean task, especially for an economy as vast and varied as the US’s. But last week’s World Bank-International Monetary Fund annual spring meetings offered some clues on a brighter, more digital future for economic data.

The IMF hosted a series of seminars and discussions exploring how the hot new world of Big Data could be harnessed to produce more timely economic figures — and improve economic forecasts. Jiaxiong Yao, an IMF official in its African department, explained how it could use satellites to measure the intensity of night-time lights, and derive a real-time gauge of economic health.

“If a country gets brighter over time, it is growing. If it is getting darker then it probably needs an IMF programme,” he noted. Further sessions explored how the IMF could use machine learning — a popular field of artificial intelligence — to improve its influential but often faulty economic forecasts; and real-time shipping data to map global trade flows.
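The night-lights gauge is simple enough to sketch in a few lines. The snippet below is a minimal illustration rather than the IMF’s actual method: it assumes two calibrated satellite radiance composites and a boolean country-border mask have already been loaded as NumPy arrays, and the file names are hypothetical.

```python
import numpy as np

def luminosity_growth(lights_prev: np.ndarray, lights_curr: np.ndarray,
                      country_mask: np.ndarray) -> float:
    """Relative change in mean night-time radiance within a country's
    borders between two periods -- a crude real-time growth proxy."""
    base = lights_prev[country_mask].mean()
    return (lights_curr[country_mask].mean() - base) / base

# Hypothetical usage with pre-processed annual composites:
# prev = np.load("viirs_2018_annual.npy")          # radiance grid, year t-1
# curr = np.load("viirs_2019_annual.npy")          # radiance grid, year t
# mask = np.load("country_mask.npy").astype(bool)  # pixels inside the border
# print(f"Night-light growth: {luminosity_growth(prev, curr, mask):+.1%}")
```

A shrinking value would flag the “getting darker” case Yao describes, well before official GDP estimates arrive.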

Sophisticated hedge funds have been mining some of these new “alternative” data sets for some time, but statistical agencies, central banks and multinational organisations such as the IMF and the World Bank are also starting to embrace the potential.

The amount of digital data around the world is already unimaginably vast. As more of our social and economic activity migrates online, the quantity and quality are going to increase exponentially. The potential is mind-boggling. Setting aside the obvious and thorny privacy issues, it is likely to lead to a revolution in the world of economic statistics. …

Yet the biggest issues are not the weaknesses of these new data sets — all statistics have inherent flaws — but their nature and location.

Firstly, it depends on today’s lax regulatory and personal attitudes towards personal data continuing, and there are signs of a (healthy) backlash brewing.

Secondly, almost all of this alternative data is being generated and stored in the private sector, not by government bodies such as the Bureau of Economic Analysis, Eurostat or the UK’s Office for National Statistics.

Public bodies are generally too poorly funded to buy or clean all this data themselves, meaning hedge funds will benefit from better economic data than the broader public. We might, in fact, need legislation mandating that statistical agencies receive free access to any aggregated private sector data sets that might be useful to their work.

That would ensure that our economic officials and policymakers don’t fly blind in an increasingly illuminated world…(More)”.

The Technology Fallacy: How People Are the Real Key to Digital Transformation


Book by Gerald C. Kane, Anh Nguyen Phillips, Jonathan R. Copulsky and Garth R. Andrus: “Digital technologies are disrupting organizations of every size and shape, leaving managers scrambling to find a technology fix that will help their organizations compete. This book offers managers and business leaders a guide for surviving digital disruptions—but it is not a book about technology. It is about the organizational changes required to harness the power of technology. The authors argue that digital disruption is primarily about people and that effective digital transformation involves changes to organizational dynamics and how work gets done. A focus only on selecting and implementing the right digital technologies is not likely to lead to success. The best way to respond to digital disruption is by changing the company culture to be more agile, risk-tolerant, and experimental.

The authors draw on four years of research, conducted in partnership with MIT Sloan Management Review and Deloitte, surveying more than 16,000 people and conducting interviews with managers at such companies as Walmart, Google, and Salesforce. They introduce the concept of digital maturity—the ability to take advantage of opportunities offered by the new technology—and address the specifics of digital transformation, including cultivating a digital environment, enabling intentional collaboration, and fostering an experimental mindset. Every organization needs to understand its “digital DNA” in order to stop “doing digital” and start “being digital.”

Digital disruption won’t end anytime soon; the average worker will probably experience numerous waves of disruption during the course of a career. The insights offered by The Technology Fallacy will hold true through them all…(More)”.

Credit denial in the age of AI


Paper by Aaron Klein: “Banks have been in the business of deciding who is eligible for credit for centuries. But in the age of artificial intelligence (AI), machine learning (ML), and big data, digital technologies have the potential to transform credit allocation in positive as well as negative directions. Given the mix of possible societal ramifications, policymakers must consider what practices are and are not permissible and what legal and regulatory structures are necessary to protect consumers against unfair or discriminatory lending practices.

In this paper, I review the history of credit and the risks of discriminatory practices. I discuss how AI alters the dynamics of credit denials and what policymakers and banking officials can do to safeguard consumer lending. AI has the potential to alter credit practices in transformative ways, and it is important to ensure that this happens in a safe and prudent manner…(More)”.

Statistics Estonia to coordinate data governance


Article by Miriam van der Sangen at CBS: “In 2018, Statistics Estonia launched a new strategy for the period 2018-2022. This strategy addresses the organisation’s aim to produce statistics more quickly while minimising the response burden on both businesses and citizens. It also addresses the high expectations in Estonian society regarding the use of data. ‘We aim to transform Statistics Estonia into a national data agency,’ says Director General Mägi. ‘This means our role as a producer of official statistics will be enlarged by data governance responsibilities in the public sector. Taking on such responsibilities requires a clear vision of the whole public data ecosystem and also agreement to establish data stewards in most public sector institutions.’…

the Estonian Parliament passed new legislation that effectively expanded the number of official tasks for Statistics Estonia. Mägi elaborates: ‘Most importantly, we shall be responsible for coordinating data governance. The detailed requirements and conditions of data governance will be specified further in the coming period.’ Under the new Act, Statistics Estonia will also have more possibilities to share data with other parties….

Statistics Estonia is fully committed to producing statistics which are based on big data. Mägi explains: ‘At the moment, we are actively working on two big data projects. One project involves the use of smart electricity meters. In this project, we are looking into ways to visualise business and household electricity consumption information. The second project involves web scraping of prices and enterprise characteristics. This project is still in an initial phase, but we can already see that the use of web scraping can improve the efficiency of our production process. We are aiming to extend the web scraping project by also identifying e-commerce and innovation activities of enterprises.’
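A price-scraping pipeline of the kind Mägi describes typically boils down to fetching retailer pages and extracting name-price pairs. The sketch below is a generic illustration, not Statistics Estonia’s production code; the URL and CSS selectors are hypothetical and would need per-site tuning.

```python
import requests
from bs4 import BeautifulSoup

def scrape_prices(url: str) -> dict[str, str]:
    """Fetch one retailer page and return {product name: listed price}."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    prices = {}
    # The CSS selectors below are hypothetical; real pages need per-site tuning.
    for item in soup.select("div.product"):
        name = item.select_one("h2.name")
        price = item.select_one("span.price")
        if name and price:
            prices[name.get_text(strip=True)] = price.get_text(strip=True)
    return prices

# Example call (hypothetical URL):
# weekly_prices = scrape_prices("https://shop.example.ee/groceries")
```

Repeated daily or weekly, such collections feed directly into price-index compilation, which is where the efficiency gains Mägi mentions come from.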

Yet another ambitious goal for Statistics Estonia lies in the field of data science. ‘Similarly to Statistics Netherlands, we established experimental statistics and data mining activities years ago. Last year, we developed a so-called think-tank service, providing insights from data into all aspects of our lives. Think of birth, education, employment, et cetera. Our key clients are the various ministries, municipalities and the private sector. The main aim in the coming years is to speed up service time thanks to visualisations and data lake solutions.’ …(More)”.

Unblocking the Bottlenecks and Making the Global Supply Chain Transparent: How Blockchain Technology Can Update Global Trade


Paper by Hanna C Norberg: “Blockchain technology is still in its infancy, but already it has begun to revolutionize global trade. Its lure is irresistible because of the simplicity with which it can replace the standard methods of documentation, smooth out logistics, increase transparency, speed up transactions, and ameliorate the planning and tracking of trade.

Blockchain essentially provides the supply chain with an unalterable ledger of verified transactions, and thus enables trust every step of the way through the trade process. Every stakeholder involved in that process – from producer to warehouse worker to shipper to financial institution to recipient at the final destination – can trust that the information contained in that indelible ledger is accurate. Fraud will no longer be an issue, middlemen can be eliminated, shipments can be tracked, quality control can be maintained to the highest standards, and consumers can make decisions based on more than the price. Blockchain dramatically reduces the amount of paperwork involved, along with the myriad agents typically involved in the process, all of this resulting in soaring efficiencies. Making the most of this new technology, however, requires solid policy. Most people have only a vague idea of what blockchain is. There needs to be a basic understanding of what blockchain can and can’t do, and how it works in the economy and in trade. Once they become familiar with the technology, policy-makers must move on to thinking about which technological issues could be mitigated, solved or improved.
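The “unalterable ledger” at the heart of this promise is, at its core, a chain of hash-linked records. The toy sketch below illustrates the data structure only (no consensus protocol, no distributed network), using hypothetical shipment events: because each block commits to the hash of its predecessor, any retroactive edit breaks every later link.

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class ShipmentLedger:
    """Append-only chain of shipment events; each block commits to its predecessor."""

    def __init__(self) -> None:
        self.chain = [{"index": 0, "event": "genesis", "prev_hash": "0" * 64}]

    def append(self, event: str) -> None:
        prev = self.chain[-1]
        self.chain.append({
            "index": prev["index"] + 1,
            "timestamp": time.time(),
            "event": event,
            # Tampering with any earlier block changes its hash and breaks this link.
            "prev_hash": hash_block(prev),
        })

    def verify(self) -> bool:
        """True iff every block's prev_hash matches a recomputed predecessor hash."""
        return all(block["prev_hash"] == hash_block(self.chain[i])
                   for i, block in enumerate(self.chain[1:]))

ledger = ShipmentLedger()
ledger.append("container 4711 sealed at producer")   # hypothetical events
ledger.append("customs cleared, port of Rotterdam")
assert ledger.verify()
ledger.chain[1]["event"] = "contents swapped"        # simulate retroactive fraud...
assert not ledger.verify()                           # ...and the chain detects it
```

A real deployment replaces this single Python list with a network of nodes that must agree before a block is appended, which is what removes the need for a trusted middleman.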

Governments need to explore blockchain’s potential through its use in public-sector projects that demonstrate its workings, its potential and its inevitable limitations. Although blockchain is not nearly as evolved now as the internet was in 2005, co-operation among all stakeholders on issues like taxonomy or policy guides on basic principles is crucial. Those stakeholders include government, industry, academia and civil society. All this must be done while keeping in mind the global nature of blockchain, and blockchain regulations need to be made in sync with regulations on other issues adjacent to the technology, such as electronic signatures. However, work can be done in the global arena through international initiatives and organizations such as the ISO…(More)”.

Opening Internet Monopolies to Competition with Data Sharing Mandates


Policy Brief by Claudia Biancotti (PIIE) and Paolo Ciocca (Consob): “Over the past few years, it has become apparent that a small number of technology companies have assembled detailed datasets on the characteristics, preferences, and behavior of billions of individuals. This concentration of data is at the root of a worrying power imbalance between dominant internet firms and the rest of society, reflecting negatively on collective security, consumer rights, and competition. Introducing data sharing mandates, or requirements for market leaders to share user data with other firms and academia, would have a positive effect on competition. As data are a key input for artificial intelligence (AI), more widely available information would help spread the benefits of AI through the economy. On the other hand, data sharing could worsen existing risks to consumer privacy and collective security. Policymakers intending to implement a data sharing mandate should carefully evaluate this tradeoff…(More)”.

The Data Gaze: Capitalism, Power and Perception


Book by David Beer: “A significant new way of understanding contemporary capitalism is through the intensification and spread of data analytics. This text is about the powerful promises and visions that have led to the expansion of data analytics and data-led forms of social ordering.

It is centrally concerned with examining the types of knowledge associated with data analytics, and shows that how these analytics are envisioned is central to the emergence and prominence of data at various scales of social life. This text aims to understand the powerful role of the data analytics industry and how this industry facilitates the spread and intensification of data-led processes. As such, The Data Gaze is concerned with understanding how data-led, data-driven and data-reliant forms of capitalism pervade organisational and everyday life.

Using a clear theoretical approach derived from Foucault and critical data studies, the text develops the concept of the data gaze and shows how powerful and persuasive it is. It’s an essential and subversive guide to data analytics and data capitalism…(More)”.

A compendium of innovation methods


Report by Geoff Mulgan and Kirsten Bound: “Featured in this compendium are just some of the innovation methods we have explored over the last decade. Some, like seed accelerator programmes, we have invested in and studied. Others, like challenge prizes, standards of evidence or public sector labs, we have developed and helped to spread around the world.

Each section gives a simple introduction to the method and describes Nesta’s work in relation to it. In each case, we have also provided links to further relevant resources and inspiration on our website and beyond.

The 13 methods featured are:

  1. Accelerator programmes
  2. Anticipatory regulation
  3. Challenge prizes
  4. Crowdfunding
  5. Experimentation
  6. Futures
  7. Impact investment
  8. Innovation mapping
  9. People Powered Results: the 100 day challenge
  10. Prototyping
  11. Public and social innovation labs
  12. Scaling grants for social innovations
  13. Standards of Evidence…(More)”.