The Practice and Potential of Blockchain Technologies for Extractive Sector Governance

Press Release: “Important questions are being raised about whether blockchain technologies can contribute to solving governance challenges in the mining, oil and gas sectors. This report seeks to begin addressing such questions, with particular reference to current blockchain applications and transparency efforts in the extractive sector.

It summarizes analysis by The Governance Lab (GovLab) at the New York University Tandon School of Engineering and the Natural Resource Governance Institute (NRGI). The study focused in particular on three activity areas: licensing and contracting, corporate registers and beneficial ownership, and commodity trading and supply chains.

Key messages:

  • Blockchain technology could potentially reduce transparency challenges and information asymmetries in certain parts of the extractives value chain. However, stakeholders considering blockchain technologies need a more nuanced understanding of problem definition, value proposition and blockchain attributes to ensure that such interventions could positively impact extractive sector governance.
  • The blockchain field currently lacks design principles, governance best practices, and open data standards that could ensure that the technology helps advance transparency and good governance in the extractive sector. Our analysis offers an initial set of design principles that could act as a starting point for a more targeted approach to the use of blockchain in improving extractives governance.
  • Most blockchain projects are preliminary concepts or pilots, with little demonstration of how to effectively scale up successful experiments, especially in countries with limited resources.
  • Meaningful impact evaluations and peer-reviewed assessments, including of the implications of blockchain’s emissions footprint, are still lacking. More broadly, a shared research agenda around blockchain could help address questions that are particularly ripe for future research.
  • Transition to a blockchain-enabled system is likely to be smoother and faster in cases where digital records are already available than where a government or company attempts to move from an analog system to one leveraging blockchain.
  • Companies or governments using blockchain are more likely to implement it successfully when they have a firm grasp of the technology, its strengths, its weaknesses, and how it fits into the broader governance landscape. But these actors are often overly reliant on and empowering of blockchain technology vendors and startups, which can lead to “lock-in”, whereby the market gets stuck with an approach even though market participants may be better off with an alternative.
  • The role played by intermediaries like financial institutions or registrars can determine the success or failure of blockchain applications….(More)”.

Exploring Digital Government Transformation in the EU – Understanding public sector innovation in a data-driven society

Report edited by Misuraca, G., Barcevičius, E. and Codagnone, C.: “This report presents the final results of the research “Exploring Digital Government Transformation in the EU: understanding public sector innovation in a data-driven society”, in short DigiGov. After introducing the design and methodology of the study, the report summarizes the findings of a comprehensive state-of-the-art analysis, conducted by reviewing a vast body of scientific literature, policy documents and practitioner-generated reports across a broad range of disciplines and policy domains, with a focus on the EU. The scope and key dimensions underlying the development of the DigiGov-F conceptual framework are then presented. This is a theory-informed heuristic instrument that helps map the effects of Digital Government Transformation and supports the definition of change strategies within the institutional settings of public administration. Further, the report provides an overview of the findings of the empirical case studies conducted, employing experimental or quasi-experimental components, to test and refine the proposed conceptual framework while gathering evidence on the impacts of Digital Government Transformation and identifying real-life drivers and barriers in diverse Member States and policy domains. The report concludes by outlining future research directions and policy recommendations, as well as depicting possible scenarios for future Digital Government Transformation, developed as the result of a dedicated foresight policy lab. The lab was conducted as part of the expert consultation and stakeholder engagement process that accompanied all phases of the research. Insights generated from the study also serve to pave the way for further empirical research and policy experimentation, and to contribute to the policy debate on how to shape Digital Europe at the horizon 2040….(More)”.

Quantified Storytelling: A Narrative Analysis of Metrics on Social Media

Book by Alex Georgakopoulou, Stefan Iversen and Carsten Stage: “This book interrogates the role of quantification in stories on social media: how do visible numbers (e.g. of views, shares, likes) and invisible algorithmic measurements shape the stories we post and engage with? The links of quantification with stories have not been explored sufficiently in storytelling research or in social media studies, despite the fact that platforms have been integrating sophisticated metrics into developing facilities for sharing stories, with a massive appeal to ordinary users, influencers and businesses alike.

With case-studies from Instagram, Reddit and Snapchat, the authors show how three types of metrics, namely content metrics, interface metrics and algorithmic metrics, affect the ways in which cancer patients share their experiences, the circulation of specific stories that mobilize counter-publics and the design of stories as facilities on platforms. The analyses document how numbers structure elements in stories, indicate and produce engagement and become resources for the tellers’ self-presentation….(More)”.

Improving data access democratizes and diversifies science

Research article by Abhishek Nagaraj, Esther Shears, and Mathijs de Vaan: “Data access is critical to empirical research, but past work on open access is largely restricted to the life sciences and has not directly analyzed the impact of data access restrictions. We analyze the impact of improved data access on the quantity, quality, and diversity of scientific research. We focus on the effects of a shift in the accessibility of satellite imagery data from Landsat, a NASA program that provides valuable remote-sensing data. Our results suggest that improved access to scientific data can lead to a large increase in the quantity and quality of scientific research. Further, better data access disproportionately enables the entry of scientists with fewer resources, and it promotes diversity of scientific research….(More)”

Research 4.0: research in the age of automation

Report by Rob Procter, Ben Glover, and Elliot Jones: “There is a growing consensus that we are at the start of a fourth industrial revolution, driven by developments in Artificial Intelligence, machine learning, robotics, the Internet of Things, 3-D printing, nanotechnology, biotechnology, 5G, new forms of energy storage and quantum computing. This report seeks to understand what impact AI is having on the UK’s research sector and what implications it has for its future, with a particular focus on academic research.

Building on our interim report, we find that AI is increasingly deployed in academic research in the UK in a broad range of disciplines. The combination of an explosion of new digital data sources with powerful new analytical tools represents a ‘double dividend’ for researchers. This is allowing researchers to investigate questions that would have been unanswerable just a decade ago. Whilst there has been considerable take-up of AI in academic research, the report highlights that steps could be taken to ensure even wider adoption of these new techniques and technologies, including wider training in the necessary skills for effective utilisation of AI, faster routes to culture change and greater multi-disciplinary collaboration.

This report recognises that the Covid-19 pandemic means universities are currently facing significant pressures, with considerable demands on their resources whilst simultaneously facing threats to income. But as we emerge from the current crisis, we urge policy makers and universities to consider the report’s recommendations and take steps to fortify the UK’s position as a place of world-leading research. Indeed, the current crisis has only reminded us of the critical importance of a highly functioning and flourishing research sector. The report recommends:

The current post-16 curriculum should be reviewed to ensure all pupils receive a grounding in the basic digital, quantitative and ethical skills necessary for the effective and appropriate utilisation of AI.

A UK-wide audit of research computing and data infrastructure provision should be conducted to consider how access might be levelled up.

UK Research and Innovation (UKRI) should consider incentivising institutions to utilise AI wherever it can offer benefits to the economy and society in their future spending on research and development.

Universities should take steps to make it easier for researchers to move between academia and industry, for example by putting less emphasis on publications and recognising other outputs and measures of achievement when hiring for academic posts….(More)”.

Reimagining Help

Guide by Nesta: “Now more than ever, there is a need to help people live well in their homes and communities. The coronavirus pandemic has highlighted the importance of diversifying sources of help beyond the hospital, and of drawing on support from friends, neighbours, local organisations and charities to ensure people can live healthy lives. We must think more flexibly about what ‘help’ means, and how the right help can make a huge difference.

While medical care is fundamental to saving lives, people need more than a ‘fix’ to live well every day. If we are to support people to reach their goals, we must move away from ʻexpertsʼ holding the knowledge and power, and instead draw on people’s own knowledge, relationships, strengths and purpose to determine solutions that work best for them.

We believe there is an opportunity to ‘reimagine help’ by applying insights from the field of behaviour change research to a wide range of organisations and places – community facilities, local charities and businesses, employment and housing support, as well as health and care services, all of which play a role in supporting people to reach their goals in a way that feels right for them….

Nesta, Macmillan Cancer Support, the British Heart Foundation and the UCL Centre for Behaviour Change have worked together to develop a universal model of ‘Good Help’ underpinned by behavioural evidence, which can be understood and accessed by everyone. We analysed and simplified decades of behaviour change research and practice, and worked with a group of 30 practitioners and people with lived experience to iterate and cross-check the behavioural evidence against real life experiences. Dartington Service Design Lab helped to structure and format the evidence in a way that makes it easy for everyone to understand.

Collectively, we have produced a guide outlining eight characteristics of Good Help. It aims to support practitioners, system leaders (such as service managers, charity directors or commissioners) and anyone working in a direct ‘helping’ organisation to:

  • Understand the behaviour change evidence that underpins Good Help
  • Develop new ideas or adapt offers of Good Help, which can be tested out in their own organisations or local communities….(More)”.

Digital Minilateralism: How governments cooperate on digital governance

A policy paper by Tanya Filer and Antonio Weiss: “New research from the Digital State Project argues for the critical function of small, agile, digitally enabled and focused networks of leaders to foster strong international cooperation on digital governance issues.

This type of cooperative working, described as ‘digital minilateralism’, has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policies. It is also important as cross-border digital infrastructure and services become increasingly common….

Key findings: 

  • Already beginning to prove effective, digital minilateralism has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policy.
  • National governments should recognise and reinforce the strategic value of digital minilaterals without stamping out, through over-bureaucratisation, the qualities of trust, open conversation, and ad-hocness in which their value lies.
  • As digital minilateral networks grow and mature, they will need to find mechanisms through which to retain (or adapt) their core principles while scaling across more boundaries.
  • To demonstrate their value to the global community, digital minilaterals must feed into formal multilateral conversations and arrangements. …(More)“.

Sortition, its advocates and its critics: An empirical analysis of citizens’ and MPs’ support for random selection as a democratic reform proposal

Paper by Vincent Jacquet et al: “This article explores the prospects of an increasingly debated democratic reform: assigning political offices by lot. While this idea is advocated by political theorists and politicians in favour of participatory and deliberative democracy, the article investigates the extent to which citizens and MPs actually endorse different variants of ‘sortition’. We test for differences among respondents’ social status, disaffection with elections and political ideology. Our findings suggest that MPs are largely opposed to sortitioning political offices when their decision-making power is more than consultative, although leftist MPs tend to be in favour of mixed assemblies (involving elected and sortitioned members). Among citizens, random selection seems to appeal above all to disaffected individuals with a lower social status. The article ends with a discussion of the political prospects of sortition being introduced as a democratic reform…(More).”

The Cruel New Era of Data-Driven Deportation

Article by Alvaro M. Bedoya: “For a long time, mass deportations were a small-data affair, driven by tips, one-off investigations, or animus-driven hunches. But beginning under George W. Bush, and expanding under Barack Obama, ICE leadership started to reap the benefits of Big Data. The centerpiece of that shift was the “Secure Communities” program, which gathered the fingerprints of arrestees at local and state jails across the nation and compared them with immigration records. That program quickly became a major driver for interior deportations. But ICE wanted more data. The agency had long tapped into driver address records through law enforcement networks. Eyeing the breadth of DMV databases, agents began to ask state officials to run face recognition searches on driver photos against the photos of undocumented people. In Utah, for example, ICE officers requested hundreds of face searches starting in late 2015. Many immigrants avoid contact with any government agency, even the DMV, but they can’t go without heat, electricity, or water; ICE aimed to find them, too. So, that same year, ICE paid for access to a private database that includes the addresses of customers from 80 national and regional electric, cable, gas, and telephone companies.

Amid this bonanza, at least, the Obama administration still acknowledged red lines. Some data were too invasive, some uses too immoral. Under Donald Trump, these limits fell away.

In 2017, breaking with prior practice, ICE started to use data from interviews with scared, detained kids and their relatives to find and arrest more than 500 sponsors who stepped forward to take in the children. At the same time, ICE announced a plan for a social media monitoring program that would use artificial intelligence to automatically flag 10,000 people per month for deportation investigations. (It was scuttled only when computer scientists helpfully indicated that the proposed system was impossible.) The next year, ICE secured access to 5 billion license plate scans from public parking lots and roadways, a hoard that tracks the drives of 60 percent of Americans—an initiative blocked by Department of Homeland Security leadership four years earlier. In August, the agency cut a deal with Clearview AI, whose technology identifies people by comparing their faces not to millions of driver photos, but to 3 billion images from social media and other sites. This is a new era of immigrant surveillance: ICE has transformed from an agency that tracks some people sometimes to an agency that can track anyone at any time….(More)”.

AI planners in Minecraft could help machines design better cities

Article by Will Douglas Heaven: “A dozen or so steep-roofed buildings cling to the edges of an open-pit mine. High above them, on top of an enormous rock arch, sits an inaccessible house. Elsewhere, a railway on stilts circles a group of multicolored tower blocks. Ornate pagodas decorate a large paved plaza. And a lone windmill turns on an island, surrounded by square pigs. This is Minecraft city-building, AI style.

Minecraft has long been a canvas for wild invention. Fans have used the hit block-building game to create replicas of everything from downtown Chicago and King’s Landing to working CPUs. In the decade since its first release, anything that can be built has been.

Since 2018, Minecraft has also been the setting for a creative challenge that stretches the abilities of machines. The annual Generative Design in Minecraft (GDMC) competition asks participants to build an artificial intelligence that can generate realistic towns or villages in previously unseen locations. The contest is just for fun, for now, but the techniques explored by the various AI competitors are precursors of ones that real-world city planners could use….(More)”.